Posted to common-user@hadoop.apache.org by Jason Venner <ja...@attributor.com> on 2008/02/28 19:44:25 UTC

0.16.0: Question about LD_LIBRARY_PATH / java.library.path

It looks like setting LD_LIBRARY_PATH in hadoop-env.sh, or passing 
-Djava.library.path yourself, is ignored now.
The startup shell scripts explicitly set -Djava.library.path to the 
path required for the native compression libraries.
Since these options are appended after the ones from hadoop-env.sh, the 
script's -Djava.library.path overrides the user-set one.

from bin/hadoop:

# setup 'java.library.path' for native-hadoop code if necessary
JAVA_LIBRARY_PATH=''
if [ -d "${HADOOP_HOME}/build/native" -o -d "${HADOOP_HOME}/lib/native" ]; then
  JAVA_PLATFORM=`CLASSPATH=${CLASSPATH} ${JAVA} org.apache.hadoop.util.PlatformName | sed -e "s/ /_/g"`

  if [ -d "$HADOOP_HOME/build/native" ]; then
    JAVA_LIBRARY_PATH=${HADOOP_HOME}/build/native/${JAVA_PLATFORM}/lib
  fi

  if [ -d "${HADOOP_HOME}/lib/native" ]; then
    if [ "x$JAVA_LIBRARY_PATH" != "x" ]; then
      JAVA_LIBRARY_PATH=${JAVA_LIBRARY_PATH}:${HADOOP_HOME}/lib/native/${JAVA_PLATFORM}
    else
      JAVA_LIBRARY_PATH=${HADOOP_HOME}/lib/native/${JAVA_PLATFORM}
    fi
  fi
fi
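One possible local patch (just a sketch, not the shipped script) would be to fold a user-supplied path into JAVA_LIBRARY_PATH before the script uses it, so user entries come first in lookup order. USER_LIBRARY_PATH and both example paths below are assumptions for illustration:

# Sketch: merge a user-set library path (e.g. exported from hadoop-env.sh)
# ahead of the path bin/hadoop computed, instead of letting the script's
# value clobber it.
JAVA_LIBRARY_PATH="/opt/hadoop/lib/native/Linux-i386-32"  # stand-in for the computed value
USER_LIBRARY_PATH="/home/user/mylibs"                     # hypothetical user setting

if [ "x$USER_LIBRARY_PATH" != "x" ]; then
  # User directories go first, so the runtime linker searches them
  # before Hadoop's bundled native libraries.
  JAVA_LIBRARY_PATH="${USER_LIBRARY_PATH}:${JAVA_LIBRARY_PATH}"
fi
echo "$JAVA_LIBRARY_PATH"

The test of "x$VAR" against "x" mirrors the idiom the existing script already uses for detecting an empty variable.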


Is the only way now:

    6.3. Task Execution & Environment
    The TaskTracker executes the Mapper/Reducer task as a child process
    in a separate JVM. The child task inherits the environment of the
    parent TaskTracker. The user can specify additional options for the
    child JVM via the mapred.child.java.opts configuration parameter in
    the JobConf, such as non-standard paths for the run-time linker to
    search for shared libraries via -Djava.library.path=<>, etc. If
    mapred.child.java.opts contains the symbol @taskid@, it is
    interpolated with the value of the taskid of the map/reduce task.
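If that documented route is really the only one left, the per-job configuration would presumably look something like the following (the library path is a placeholder, not a real value):

```xml
<!-- hadoop-site.xml / job configuration; path below is illustrative only -->
<property>
  <name>mapred.child.java.opts</name>
  <value>-Djava.library.path=/path/to/your/native/libs</value>
</property>
```

Equivalently, conf.set("mapred.child.java.opts", ...) on the JobConf before submitting the job.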


-- 
Jason Venner
Attributor - Publish with Confidence <http://www.attributor.com/>
Attributor is hiring Hadoop Wranglers, contact if interested