Posted to mapreduce-user@hadoop.apache.org by Donatella Firmani <do...@yahoo.com> on 2011/05/01 14:32:03 UTC

Re: Hadoop Mapreduce jobs and LD_LIBRARY_PATH

Dear Alex,

thanks for your kind assistance.

I ran the job, passing the option with a
-Dmapred.child.env="LD_LIBRARY_PATH=/home/mylibpath1/lib1;home/mylibpath2/lib2;home/mylibpath3/lib3"
flag.

Checking the job.xml file via the JT UI, I can verify that the parameters have
the correct values for the job. It shows the line:

mapred.child.env
LD_LIBRARY_PATH=/home/mylibpath1/lib1;home/mylibpath2/lib2;home/mylibpath3/lib3

Unfortunately, the value returned by the call System.getenv("LD_LIBRARY_PATH")
is different (without those libs), so the job still does not work.


What do you think about it?
Cheers,
DF



________________________________
From: Alex Kozlov <al...@cloudera.com>
To: mapreduce-user@hadoop.apache.org
Sent: Fri, April 29, 2011 8:01:30 PM
Subject: Re: Hadoop Mapreduce jobs and LD_LIBRARY_PATH

The next step is to find the job.xml file and check (either in the 
mapred.local.dir in local FS or in the JT web UI)...


On Fri, Apr 29, 2011 at 10:59 AM, Donatella Firmani <do...@yahoo.com> wrote:

>Dear Alex,
>
>that's exactly the point. I made my mapper process dump to its log files the
>result of
>
>System.getenv("LD_LIBRARY_PATH")
>System.getProperty("java.library.path")
>
>and neither value seems to be affected by the setting of
>mapred.child.java.opts or of mapred.child.env in the mapred-site.xml file.
>
>Maybe there is something else I have to do to make LD_LIBRARY_PATH in the JVM
>environment be set correctly? Are there restrictions on the values it can
>assume (i.e. must they be under HDFS and not in the FS of the node)?
>
>Cheers,
>DF
>
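A minimal sketch of such a diagnostic mapper (the class name is illustrative;
the println output lands in the task's stdout log, viewable per task attempt
in the JT web UI):

    import java.io.IOException;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Mapper;

    public class EnvDumpMapper extends Mapper<LongWritable, Text, Text, Text> {
        @Override
        protected void setup(Context context) {
            // setup() runs once per task, inside the child JVM, so this shows
            // what the task actually sees, not what the client shell exported.
            System.out.println("LD_LIBRARY_PATH=" + System.getenv("LD_LIBRARY_PATH"));
            System.out.println("java.library.path=" + System.getProperty("java.library.path"));
        }

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            // Intentionally empty: this mapper exists only for the setup() dump.
        }
    }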
>
>________________________________
>From: Alex Kozlov <al...@cloudera.com>
>To: mapreduce-user@hadoop.apache.org
>Sent: Fri, April 29, 2011 7:52:19 PM
>Subject: Re: Hadoop Mapreduce jobs and LD_LIBRARY_PATH
>
>
>The option should be passed to the child JVM environment when it is started.
>You can set most of the environment variables to garbage with no side-effect.
>A more important question is what the LD_LIBRARY_PATH is in your JVM
>environment.
>
>Once again, check the job.xml file in the mapred.local.dir (should be
>/tmp/cache/${user.name}/... or something like this in the pseudo-config
>environment) or try to print out the environment variables directly in your
>map/reduce task.
>
>Alex K
>
>
>On Fri, Apr 29, 2011 at 10:37 AM, Donatella Firmani <do...@yahoo.com> wrote:
>
>>I just tried giving the option
>>-Dmapred.child.env="LD_LIBRARY_PATH=/home/mylibpath1/lib1;home/mylibpath2/lib2;home/mylibpath3/lib3"
>>and also tried writing nonsense environment variables like
>>-Dmapred.child.env="blahblablah".
>>
>>It continues working either way... so I think that the option is completely
>>ignored by the bin/hadoop script.
>>
>>Do you think it is expected behavior?
>>
>>Cheers,
>>DF
>>
>>________________________________
>>From: Alex Kozlov <al...@cloudera.com>
>>To: mapreduce-user@hadoop.apache.org
>>Sent: Fri, April 29, 2011 7:03:50 PM
>>Subject: Re: Hadoop Mapreduce jobs and LD_LIBRARY_PATH
>>
>>
>>You only need to edit the config files on the client, or give the option with a
>>-Dmapred.child.env="LD_LIBRARY_PATH=/home/mylibpath1/lib1;home/mylibpath2/lib2;home/mylibpath3/lib3"
>>flag (if you implement Tool).  You can check the job.xml file via the JT UI to
>>verify that the parameters have the correct values for the job.
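For context on the "if you implement Tool" caveat: generic -D options are only
honored when the driver is run through ToolRunner, whose GenericOptionsParser
moves them into the job Configuration; a driver with a hand-rolled main() that
never goes through ToolRunner silently ignores them, which would match the
behavior reported above. A minimal sketch of such a driver (class and job
names are illustrative):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.conf.Configured;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.util.Tool;
    import org.apache.hadoop.util.ToolRunner;

    public class MyDriver extends Configured implements Tool {
        @Override
        public int run(String[] args) throws Exception {
            // getConf() already contains any -D key=value options that
            // GenericOptionsParser stripped from the command line.
            Job job = new Job(getConf(), "my-job");
            job.setJarByClass(MyDriver.class);
            // ... set mapper class, input/output paths from args, etc. ...
            return job.waitForCompletion(true) ? 0 : 1;
        }

        public static void main(String[] args) throws Exception {
            System.exit(ToolRunner.run(new Configuration(), new MyDriver(), args));
        }
    }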
>>
>>
>>On Fri, Apr 29, 2011 at 9:05 AM, Donatella Firmani <do...@yahoo.com> wrote:
>>
>>>Dear Yin,
>>>
>>>Good point: I can try to install 0.19 and reproduce the problem. I'll let you
>>>know ASAP.
>>>
>>>Thanks,
>>>DF
>>>
>>>
>>>
>>>________________________________
>>>From: Yin Lou <yi...@gmail.com>
>>>To: mapreduce-user@hadoop.apache.org
>>>Sent: Fri, April 29, 2011 5:59:14 PM
>>>Subject: Re: Hadoop Mapreduce jobs and LD_LIBRARY_PATH
>>>
>>>
>>>Just curious, can we do this in 0.19?
>>>
>>>Thanks,
>>>Yin
>>>
>>>
>>>On Fri, Apr 29, 2011 at 10:29 AM, Robert Evans <ev...@yahoo-inc.com> wrote:
>>>
>>>>DF,
>>>>
>>>>You can set mapred.child.java.opts to set java options, but you can also set
>>>>mapred.child.env to set environment variables. Be careful, because the
>>>>entries are space separated, with an = between name and value.
>>>>
>>>>    <property>
>>>>        <name>mapred.child.env</name>
>>>>        <value>LD_LIBRARY_PATH=/home/mylibpath1/lib1;home/mylibpath2/lib2;home/mylibpath3/lib3</value>
>>>>    </property>
>>>>
>>>>--Bobby
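The same property can also be set from driver code instead of mapred-site.xml;
a sketch, under the assumption (raised later in this thread) that the cluster
runs a version where mapred.child.env is honored. Note that on Linux the paths
inside LD_LIBRARY_PATH itself would normally be separated by ':' rather than ';':

    import org.apache.hadoop.conf.Configuration;

    public class ChildEnvExample {
        public static void main(String[] args) {
            // Programmatic equivalent of the <property> block above; set it
            // on the Configuration before constructing the Job.
            Configuration conf = new Configuration();
            conf.set("mapred.child.env",
                    "LD_LIBRARY_PATH=/home/mylibpath1/lib1:/home/mylibpath2/lib2:/home/mylibpath3/lib3");
        }
    }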
>>>
>>>
>>>On 4/29/11 5:58 AM, "Donatella Firmani" <do...@yahoo.com> wrote:
>>>
>>>
>>>>To solve the issue addressed in my previous message, I tried setting the
>>>>property mapred.child.java.opts in mapred-site.xml. But - even if it seems
>>>>the right approach, judging from what is said in blogs & forums - there is
>>>>a big problem with it.
>>>>
>>>>Following the tutorial (Hadoop website), section Task Execution &
>>>>Environment, my xml looks like:
>>>>
>>>><configuration>
>>>>     <property>
>>>>         <name>mapred.job.tracker</name>
>>>>         <value>localhost:9001</value>
>>>>     </property>
>>>>     <property>
>>>>         <name>mapred.child.java.opts</name>
>>>>         <value>
>>>>-Djava.library.path=/home/mylibpath1/lib1;home/mylibpath2/lib2;home/mylibpath3/lib3
>>>>         </value>
>>>>     </property>
>>>></configuration>
>>>>
>>>>The problem arises when executing the job, because an exception is thrown:
>>>>
>>>>Exception in thread "main" java.lang.NoClassDefFoundError:
>>>>-Djava/library/path=/home/mylibpath1/lib1;home/mylibpath2/lib2;home/mylibpath3/lib3
>>>>
>>>>Any help would be appreciated.
>>>>Thanks in advance,
>>>>
>>>>DF
>>>>
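The trace above is consistent with the line breaks inside <value> being the
real problem: the leading newline means the child JVM receives the
-Djava.library.path token glued to whitespace, fails to recognize it as an
option, and tries to load it as the main class name (hence the dots rendered
as slashes in the NoClassDefFoundError). A tentative fix is to keep the whole
value on one line; -Xmx200m is restated because this property replaces the
default child opts, and ':' is used as the Linux path separator:

    <property>
        <name>mapred.child.java.opts</name>
        <value>-Xmx200m -Djava.library.path=/home/mylibpath1/lib1:/home/mylibpath2/lib2:/home/mylibpath3/lib3</value>
    </property>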
>>>>----- Original Message ----
>>>>From: Donatella Firmani <do...@yahoo.com>
>>>>To: mapreduce-user@hadoop.apache.org
>>>>Sent: Fri, April 29, 2011 12:57:52 PM
>>>>Subject: Hadoop Mapreduce jobs and LD_LIBRARY_PATH
>>>>
>>>>
>>>>
>>>>Hi to all,
>>>>
>>>>I just subscribed to this mailing list and I'd like to ask you if anyone
>>>>knows how to deal with LD_LIBRARY_PATH.
>>>>I have a Java application that needs a proper setting of this environment
>>>>variable to work under Linux-Ubuntu.
>>>>I want to use this application from a mapreduce job; unfortunately, I could
>>>>not find a way to make things work with the LD_LIBRARY_PATH environment
>>>>variable.
>>>>
>>>>I tried so many different strategies and I am stuck. Maybe someone of you
>>>>can help.
>>>>
>>>>Thanks in advance,
>>>>Cheers.
>>>>
>>>>DF
>>>>
>>>>PS: I use hadoop-0.20.2

Re: Hadoop Mapreduce jobs and LD_LIBRARY_PATH

Posted by Alex Kozlov <al...@cloudera.com>.
In the "standalone" hadoop application, try setting `export
HADOOP_OPTS="-Djava.library.path=..."`
w/o explicitly setting LD_LIBRARY_PATH.
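Concretely, a standalone-mode run along those lines might look like the
following (jar name, driver class, and paths are illustrative, not taken from
the thread):

    export HADOOP_OPTS="-Djava.library.path=/home/mylibpath1/lib1:/home/mylibpath2/lib2:/home/mylibpath3/lib3"
    bin/hadoop jar myjob.jar MyDriver input output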


Re: Hadoop Mapreduce jobs and LD_LIBRARY_PATH

Posted by Alex Kozlov <al...@cloudera.com>.
Try adding -Djava.library.path=... to the mapred.child.java.opts string.
Otherwise, upgrade to 0.21.
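A hedged sketch of that on the command line, assuming a ToolRunner-based
driver so the -D option is actually parsed (jar and class names are
illustrative; -Xmx200m is restated because the property replaces the default
child opts, and ':' is the Linux path separator):

    bin/hadoop jar myjob.jar MyDriver \
        -Dmapred.child.java.opts="-Xmx200m -Djava.library.path=/home/mylibpath1/lib1:/home/mylibpath2/lib2:/home/mylibpath3/lib3" \
        input output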


Re: Hadoop Mapreduce jobs and LD_LIBRARY_PATH

Posted by Donatella Firmani <do...@yahoo.com>.
Hi Alex,

thanks for your reply. The value of mapred.child.java.opts in the job.xml is
just -Xmx200m.

Yes, by "standalone" I mean exactly a single JVM. To test my program in this
environment I followed the mapreduce tutorial at the section "Standalone
Operation".

With a single JVM, my program does not work if - instead of exporting
LD_LIBRARY_PATH - I use the "-Djava.library.path=..." option of the
bin/hadoop command. Same situation with a similar experiment, i.e. using the
option "-Dmapred.child.env=LD_LIBRARY_PATH=....".
It works only if I explicitly set the system environment variable
LD_LIBRARY_PATH.

Do you think that using 0.21 could be a good idea?

Thanks,
DF
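A possible explanation for that behavior (a conjecture, not something
confirmed in this thread): java.library.path only controls where
System.loadLibrary() finds the JNI library itself, while LD_LIBRARY_PATH is
what the OS dynamic linker consults when resolving that library's own
dependent .so files, so a program whose native library links against further
shared libraries can fail even with java.library.path set correctly. A minimal
sketch (library names illustrative):

    public class NativeLoadExample {
        static {
            // Located via java.library.path -> loads libmylib.so.
            System.loadLibrary("mylib");
            // If libmylib.so itself links against, say, libdep.so, the OS
            // dynamic linker resolves libdep.so through LD_LIBRARY_PATH (or
            // the ldconfig cache); java.library.path plays no part there.
        }

        public static void main(String[] args) {
            System.out.println("native library loaded");
        }
    }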
  




Re: Hadoop Mapreduce jobs and LD_LIBRARY_PATH

Posted by Alex Kozlov <al...@cloudera.com>.
And what is the value of mapred.child.java.opts in the job.xml?  Is it
something like `-Djava.library.path=...`?  The problem might be that
mapred.child.env was introduced only in 0.21.

I assume by the "standalone" installation you mean a single JVM.  Does your
program work with -Djava.library.path=... in this environment?


Re: Hadoop Mapreduce jobs and LD_LIBRARY_PATH

Posted by Donatella Firmani <do...@yahoo.com>.

Maybe it can help that in a standalone installation of hadoop map reduce it
works. I know that it is trivial, because it is sufficient to type "export
LD_LIBRARY_PATH=..." in the user shell...
It is just to be sure that I did not forget anything that may be useful.

Cheers,
DF


