Posted to hdfs-user@hadoop.apache.org by John Hancock <jh...@gmail.com> on 2013/08/09 03:23:46 UTC

alternative to $HADOOP_HOME/lib

Where else might one put .jar files that a map/reduce job will need?

Re: alternative to $HADOOP_HOME/lib

Posted by Harsh J <ha...@cloudera.com>.
John,

I assume you do not wish to use the DistributedCache (or an HDFS
location for the DistributedCache), which is the ideal way to ship
jars.
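
For example, if your job's driver runs through ToolRunner (so that
GenericOptionsParser handles the generic options), you can ship
dependency jars per-job with -libjars. A sketch; the jar, class, and
path names below are only placeholders:

    # Ships dep.jar through the DistributedCache for this job only.
    # myjob.jar, com.example.MyJob, dep.jar and the I/O paths are
    # placeholder names.
    hadoop jar myjob.jar com.example.MyJob \
        -libjars /local/path/dep.jar \
        /input /output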

You can add your jars to the TaskTracker (TT) classpaths by placing
them at an arbitrary location such as /opt/jars and editing the TT's
hadoop-env.sh to extend HADOOP_CLASSPATH to include this extra
location. However, this still requires administrative configuration
edits, plus a service restart each time you want to add a new jar or
change a jar. With the DistributedCache, neither is required.
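
Something along these lines in hadoop-env.sh would do it, assuming
the /opt/jars location above (dep.jar is a placeholder name):

    # In hadoop-env.sh on every TT node; restart the TTs afterwards.
    export HADOOP_CLASSPATH="/opt/jars/dep.jar:$HADOOP_CLASSPATH"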

On Fri, Aug 9, 2013 at 6:59 AM, Sanjeev Verma <sa...@gmail.com> wrote:
> On 08/08/2013 09:23 PM, John Hancock wrote:
>>
>> Where else might one put .jar files that a map/reduce job will need?
>
> Why do you need an alternative location? Is there a constraint on being able
> to place your library jars under $HADOOP_HOME/lib?



-- 
Harsh J

Re: alternative to $HADOOP_HOME/lib

Posted by Sanjeev Verma <sa...@gmail.com>.
On 08/08/2013 09:23 PM, John Hancock wrote:
> Where else might one put .jar files that a map/reduce job will need?
Why do you need an alternative location? Is there a constraint on being 
able to place your library jars under $HADOOP_HOME/lib?
