Posted to mapreduce-user@hadoop.apache.org by Mohit Anchlia <mo...@gmail.com> on 2014/06/19 02:37:26 UTC

Unsatisfied link error

I installed Hadoop and now when I try to run "hadoop fs" I get this error.
I am using 64-bit OpenJDK on a CentOS virtual machine. I am also listing my
environment variables and one specific warning I get when starting the node
manager.


Environment variables:

export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.55.x86_64
export HADOOP_HEAPSIZE="500"
export HADOOP_NAMENODE_INIT_HEAPSIZE="500"
export HADOOP_HOME=/opt/yarn/hadoop-2.4.0
export HADOOP_MAPRED_HOME=$HADOOP_HOME/
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME/
export HADOOP_YARN_HOME=$HADOOP_HOME/
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
export HADOOP_CLASSPATH=$CLASSPATH:$HADOOP_CLASSPATH:$HADOOP_HOME/lib/*:$HADOOP_HOME/share/hadoop/tools/lib/*:$HADOOP_HOME/share/hadoop/yarn/lib/*:$HADOOP_HOME/share/hadoop/common/lib/*:$HADOOP_HOME/share/hadoop/mapreduce/lib/*:$HADOOP_HOME/share/hadoop/httpfs/tomcat/lib/*:$HADOOP_HOME/share/hadoop/httpfs/tomcat/webapps/webhdfs/WEB-INF/lib/*:$HADOOP_HOME/share/hadoop/httpfs/tomcat/webapps/webhdfs/WEB-INF/classes/org/apache/hadoop/lib/*:$HADOOP_HOME/share/hadoop/hdfs/lib/*
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"

-----------------------------

[yarn@localhost sbin]$ ./yarn-daemon.sh start nodemanager
starting nodemanager, logging to /opt/yarn/hadoop-2.4.0/logs/yarn-yarn-nodemanager-localhost.localdomain.out
OpenJDK 64-Bit Server VM warning: You have loaded library /opt/yarn/hadoop-2.4.0/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
[yarn@localhost sbin]$ jps
6257 Jps
3785 ResourceManager
5746 NodeManager
[yarn@localhost sbin]$
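
The stack-guard warning above indicates that libhadoop.so.1.0.0 is flagged as requiring an executable stack; it can also be a hint that the prebuilt binary does not match the platform. Both are quick to verify from a shell. A diagnostic sketch, assuming the usual CentOS tooling (execstack ships in the prelink package):

# Confirm the library's word size matches the 64-bit JVM.
file /opt/yarn/hadoop-2.4.0/lib/native/libhadoop.so.1.0.0

# Query, then clear, the executable-stack flag the JVM warns about.
execstack -q /opt/yarn/hadoop-2.4.0/lib/native/libhadoop.so.1.0.0
execstack -c /opt/yarn/hadoop-2.4.0/lib/native/libhadoop.so.1.0.0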
------------------------------ Then the error when running hadoop fs -ls ------------------------------

[root@localhost yarn]# hadoop fs -ls /
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/yarn/hadoop-2.4.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/yarn/hadoop-2.4.0/share/hadoop/httpfs/tomcat/webapps/webhdfs/WEB-INF/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]

-ls: Fatal internal error
java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:131)
    at org.apache.hadoop.security.Groups.<init>(Groups.java:64)
    at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:240)
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:255)
    at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:232)
    at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:718)
    at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:703)
    at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:605)
    at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2554)
    at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2546)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2412)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:368)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:167)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:352)
    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
    at org.apache.hadoop.fs.shell.PathData.expandAsGlob(PathData.java:325)
    at org.apache.hadoop.fs.shell.Command.expandArgument(Command.java:224)
    at org.apache.hadoop.fs.shell.Command.expandArguments(Command.java:207)
    at org.apache.hadoop.fs.shell.Command.processRawArguments(Command.java:190)
    at org.apache.hadoop.fs.shell.Command.run(Command.java:154)
    at org.apache.hadoop.fs.FsShell.run(FsShell.java:255)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
    at org.apache.hadoop.fs.FsShell.main(FsShell.java:308)
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:129)
    ... 23 more
Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.security.JniBasedUnixGroupsMapping.anchorNative()V
    at org.apache.hadoop.security.JniBasedUnixGroupsMapping.anchorNative(Native Method)
    at org.apache.hadoop.security.JniBasedUnixGroupsMapping.<clinit>(JniBasedUnixGroupsMapping.java:49)
    at org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback.<init>(JniBasedUnixGroupsMappingWithFallback.java:38)
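
The final UnsatisfiedLinkError on JniBasedUnixGroupsMapping.anchorNative()V means a libhadoop.so was found and loaded, but it does not export the symbol the 2.4.0 jars expect; that usually points to a stale or mismatched native library somewhere on java.library.path. A diagnostic sketch (hadoop checknative should exist in 2.4.0, though the earliest 2.x releases lack it):

# Show which native library the client actually loads, and from where.
HADOOP_ROOT_LOGGER=DEBUG,console hadoop fs -ls / 2>&1 | grep -i native

# Report whether libhadoop and its codecs loaded successfully.
hadoop checknative -a

# Verify the loaded .so actually exports the missing JNI symbol.
nm -D /opt/yarn/hadoop-2.4.0/lib/native/libhadoop.so | grep -i anchornative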

Re: Unsatisfied link error

Posted by Mohit Anchlia <mo...@gmail.com>.
Could somebody suggest what might be wrong here?

