Posted to common-user@hadoop.apache.org by John Bond <jo...@gmail.com> on 2011/11/29 20:11:05 UTC

Re: hadoop-fuse unable to find java

Still getting this using

Hadoop 0.20.2-cdh3u2



On 5 September 2011 16:08, John Bond <jo...@gmail.com> wrote:
> I have recently rebuilt a server with CentOS 6.0, and it seems that
> something has caused hadoop-fuse to get confused: it is no longer able
> to find libjvm.so.  The error I get is
>
> find: `/usr/lib/jvm/java-1.6.0-sun-1.6.0.14/jre//jre/lib': No such
> file or directory
> /usr/lib/hadoop-0.20/bin/fuse_dfs: error while loading shared
> libraries: libjvm.so: cannot open shared object file: No such file or
> directory
>
> A quick look around suggests /usr/lib/hadoop-0.20/bin/hadoop-config.sh
> is setting JAVA_HOME to `/usr/lib/jvm/java-1.6.0-sun-1.6.0.14/jre/`.
>
> /usr/bin/hadoop-fuse-dfs has the following which adds an extra /jre/
> to the path
>
>  for f in `find ${JAVA_HOME}/jre/lib -name client -prune -o -name
> libjvm.so -exec dirname {} \;`; do
>
> Is there a need to specify the subfolder?  I think it would be
> simpler to just change the above to
>
>  for f in `find ${JAVA_HOME} -name client -prune -o -name libjvm.so
> -exec dirname {} \;`; do
>
>
> The other option is to change
> /usr/lib/hadoop-0.20/bin/hadoop-config.sh so it sets JAVA_HOME without
> the jre suffix: either remove `/usr/lib/jvm/java-1.6.0-sun-1.6.0.*/jre/`
> from the candidate list, or reorder the list so
> `/usr/lib/jvm/java-1.6.0-sun-1.6.0.*/` is preferred.
>
> regards
> John
>
> hadoop-fuse-dfs
> @@ -14,7 +14,7 @@
>
>  if [ "${LD_LIBRARY_PATH}" = "" ]; then
>   export LD_LIBRARY_PATH=/usr/lib
> -  for f in `find ${JAVA_HOME}/jre/lib -name client -prune -o -name libjvm.so -exec dirname {} \;`; do
> +  for f in `find ${JAVA_HOME} -name client -prune -o -name libjvm.so -exec dirname {} \;`; do
>     export LD_LIBRARY_PATH=$f:${LD_LIBRARY_PATH}
>   done
>  fi
>
> hadoop-config.sh
> @@ -68,8 +68,8 @@
>  if [ -z "$JAVA_HOME" ]; then
>   for candidate in \
>     /usr/lib/jvm/java-6-sun \
> +    /usr/lib/jvm/java-1.6.0-sun-1.6.0.* \
>     /usr/lib/jvm/java-1.6.0-sun-1.6.0.*/jre/ \
> -    /usr/lib/jvm/java-1.6.0-sun-1.6.0.* \
>     /usr/lib/j2sdk1.6-sun \
>     /usr/java/jdk1.6* \
>     /usr/java/jre1.6* \
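[Editor's note: the search John proposes can be exercised against a throwaway directory tree. This sketch is not part of the original mail; the miniature JVM layout below is fabricated. It shows that searching from ${JAVA_HOME} finds libjvm.so whether JAVA_HOME points at the JDK root or at its jre/ subdirectory, whereas hard-coding ${JAVA_HOME}/jre/lib only works for the former.]

```shell
#!/bin/sh
# Sketch only: build a fabricated miniature JVM layout; no real JDK
# paths are touched.
tmp=$(mktemp -d)
mkdir -p "$tmp/jdk/jre/lib/amd64/server"
touch "$tmp/jdk/jre/lib/amd64/server/libjvm.so"

count=0
# Try both a JDK-style and a JRE-style JAVA_HOME.
for JAVA_HOME in "$tmp/jdk" "$tmp/jdk/jre"; do
  # John's proposed loop: search the whole tree, pruning "client" dirs,
  # and print the directory holding each libjvm.so that is found.
  for f in `find ${JAVA_HOME} -name client -prune -o -name libjvm.so -exec dirname {} \;`; do
    echo "JAVA_HOME=$JAVA_HOME -> libjvm.so in $f"
    count=$((count + 1))
  done
done
rm -rf "$tmp"
```

With the `/jre/lib` suffix instead, the second iteration would look for `$tmp/jdk/jre/jre/lib`, which is exactly the doubled path in the error above.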

Re: hadoop-fuse unable to find java

Posted by Brock Noland <br...@cloudera.com>.
Hi,

This specific issue is probably more appropriate on the CDH-USER list.
(BCC common-user) It looks like the JRE detection mechanism recently
added to BIGTOP would have this same issue:
https://issues.apache.org/jira/browse/BIGTOP-25

To resolve the immediate issue I would set an environment variable in
/etc/default/hadoop-0.20 or hadoop-env.sh. You could set it statically
to a particular version, or perhaps use:
export JAVA_HOME=$(readlink -f /usr/java/latest)
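[Editor's note: Brock's workaround can be made slightly more defensive. This is a sketch, not from CDH: the helper name and the sample layout are illustrative. It resolves the symlink and strips a trailing /jre component, so that scripts which later append jre/lib to JAVA_HOME still resolve.]

```shell
#!/bin/sh
# resolve_java_home: canonicalize a Java symlink (e.g. /usr/java/latest)
# and drop a trailing /jre component if the link points inside a JRE.
resolve_java_home() {
  home=$(readlink -f "$1")
  case "$home" in
    */jre) home=${home%/jre} ;;   # strip the jre suffix
  esac
  printf '%s\n' "$home"
}

# Demonstration against a throwaway layout standing in for /usr/java/latest.
tmp=$(mktemp -d)
mkdir -p "$tmp/jdk1.6.0_14/jre"
ln -s "$tmp/jdk1.6.0_14/jre" "$tmp/latest"
JAVA_HOME=$(resolve_java_home "$tmp/latest")
export JAVA_HOME
echo "JAVA_HOME=$JAVA_HOME"
rm -rf "$tmp"
```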

Ultimately I think this will be fixed in BigTop but also may need to
be fixed in CDH3. As such I have filed a JIRA for you:

https://issues.cloudera.org/browse/DISTRO-349

If you are interested in seeing how the issue progresses you can
"Watch" the issue and receive email updates.

Cheers,
Brock

On Tue, Nov 29, 2011 at 1:11 PM, John Bond <jo...@gmail.com> wrote:
> [...]

Re: hadoop-fuse unable to find java

Posted by Harsh J <ha...@cloudera.com>.
Hey John,

Moving discussion to cdh-user@cloudera.org (bcc'd common-user@hadoop.apache.org), since this is https://ccp.cloudera.com/display/CDHDOC/Mountable+HDFS and CDH specific.

Thanks for hunting this down. Could you open up a DISTRO ticket for this at https://issues.cloudera.org//browse/DISTRO?

On 30-Nov-2011, at 12:41 AM, John Bond wrote:

> [...]