Posted to common-user@hadoop.apache.org by Keith Wiley <kw...@keithwiley.com> on 2011/08/05 20:37:19 UTC

cmdenv LD_LIBRARY_PATH

I know you can do something like this:

-cmdenv LD_LIBRARY_PATH=./my_libs

if you have shared libraries in a subdirectory under the cwd (as happens when using -cacheArchive to load and unpack a jar full of .so files into the distributed cache)...but this clobbers the existing path.  I think I want something more like this:

-cmdenv LD_LIBRARY_PATH=./my_libs:$LD_LIBRARY_PATH

but the local shell expands the environment variable while constructing the command.  It substitutes the local value of $LD_LIBRARY_PATH at submit time as it builds the hadoop command line; it doesn't pass the literal $LD_LIBRARY_PATH through to hadoop to be expanded later on the task nodes.

Is this something that can be fixed by some combination of single, double, or back quotes and backslashes?  I'm uncertain of the proper sequence.
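A minimal sketch of the quoting behavior in question (the variable values here are stand-ins, not anything from the actual cluster): double quotes let the local shell expand $LD_LIBRARY_PATH while the command line is being built, whereas single quotes pass the literal string through untouched. Whether the literal form is then expanded on the task nodes depends on how Streaming sets -cmdenv variables; it may set them verbatim rather than through a shell.

```shell
# Stand-in for whatever the submitting machine's path happens to be.
LD_LIBRARY_PATH="/usr/local/lib"

# Double quotes: the local shell expands the variable immediately,
# baking the submit-time value into the argument.
expanded="LD_LIBRARY_PATH=./my_libs:$LD_LIBRARY_PATH"

# Single quotes: the argument survives as a literal string; no local
# expansion happens.
literal='LD_LIBRARY_PATH=./my_libs:$LD_LIBRARY_PATH'

echo "$expanded"
echo "$literal"
```

Running this prints `LD_LIBRARY_PATH=./my_libs:/usr/local/lib` for the first form and the literal `LD_LIBRARY_PATH=./my_libs:$LD_LIBRARY_PATH` for the second, which is the distinction at the heart of the question.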

________________________________________________________________________________
Keith Wiley     kwiley@keithwiley.com     keithwiley.com    music.keithwiley.com

"The easy confidence with which I know another man's religion is folly teaches
me to suspect that my own is also."
                                           --  Mark Twain
________________________________________________________________________________


Re: cmdenv LD_LIBRARY_PATH

Posted by Allen Wittenauer <aw...@apache.org>.
On Aug 5, 2011, at 11:37 AM, Keith Wiley wrote:
> Is this something that can be fixed by some combination of single,double,back quotes and back slashes?  I'm uncertain of the proper sequence.

	Rather than depend upon environment variables, you may want to look into compiling using $ORIGIN as part of the runpath (-rpath on GNU ld).  This way you don't have to worry about setting LD_LIBRARY_PATH at all.