Posted to common-user@hadoop.apache.org by Theodore Van Rooy <mu...@gmail.com> on 2008/03/06 23:32:57 UTC

using a perl script with argument variables which point to config files on the DFS as a mapper

I would like to convert a Perl script that currently takes command-line
arguments so that it runs under Hadoop Streaming.

Normally I would use the script like

'cat datafile.txt | myscript.pl folder/myfile1.txt folder/myfile2.txt'

where the two arguments are the names of configuration files that
myscript.pl reads.

My question is: how do I get the Perl script to find the config files,
either in the task's local working directory or on the DFS? Once the
configuration has been read in, there is no problem processing the data
that Hadoop passes to the script on STDIN.
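One common approach (a sketch only, not tested against this setup; the jar location, input path, and output directory below are assumptions, not from the original post) is to ship the config files with the job using Streaming's -file option, which copies each listed file into every task's working directory so the script can open them by their bare names:

```shell
# Sketch: ship local config files alongside the mapper script so the
# mapper finds them in its working directory. The streaming jar path
# and the input/output paths are placeholders.
hadoop jar $HADOOP_HOME/contrib/streaming/hadoop-streaming.jar \
  -input  datafile.txt \
  -output outdir \
  -mapper "myscript.pl myfile1.txt myfile2.txt" \
  -file   myscript.pl \
  -file   folder/myfile1.txt \
  -file   folder/myfile2.txt
```

Note that Hadoop splits the input and feeds each mapper on STDIN, replacing the `cat` in the original pipeline. Alternatively, if the config files already live on the DFS, the -cacheFile option (e.g. `-cacheFile hdfs://namenode:port/path/myfile1.txt#myfile1.txt`) symlinks a DFS file into the task's working directory under the name given after the `#`.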