Posted to hdfs-user@hadoop.apache.org by Stuti Awasthi <st...@hcl.com> on 2012/01/04 11:58:50 UTC

Mounting HDFS

Hi All,

I am following http://wiki.apache.org/hadoop/MountableHDFS for an HDFS mount.
I have successfully followed the steps through "Installing" and I am able to mount it properly. After that I tried the "Deploying" step and followed these steps:

1. add the following to /etc/fstab
fuse_dfs#dfs://hadoop_server.foo.com:9000 /export/hdfs fuse -oallow_other,rw,-ousetrash 0 0

2. added fuse_dfs to /sbin

3. export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/build/libhdfs

4. Mount using: mount /export/hdfs.

But I am getting this error:
fuse_dfs: error while loading shared libraries: libhdfs.so.0: cannot open shared object file: No such file or directory

How can I fix this?
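A loader error like this usually means libhdfs.so.0 is not on the runtime linker's search path. A minimal sketch of two ways to make it visible (the HADOOP_HOME value is an assumed example path taken from elsewhere in this thread; adjust to your install):

```shell
# Sketch, not a verified fix: make libhdfs.so.0 visible to the loader.
# HADOOP_HOME below is an assumed example path; adjust to your install.
HADOOP_HOME=${HADOOP_HOME:-/root/MountHDFS1/hadoop-0.20.2}

# Option 1: extend LD_LIBRARY_PATH for the current shell. Note that
# mount(8) invoked from /etc/fstab does not inherit this variable,
# which is one reason a mount that works interactively can fail there.
export LD_LIBRARY_PATH="${LD_LIBRARY_PATH:+$LD_LIBRARY_PATH:}$HADOOP_HOME/build/libhdfs"
echo "$LD_LIBRARY_PATH"

# Option 2 (more robust for fstab mounts): register the directory with
# the system loader cache, then verify the binary resolves (needs root):
#   echo "$HADOOP_HOME/build/libhdfs" > /etc/ld.so.conf.d/libhdfs.conf
#   ldconfig
#   ldd /sbin/fuse_dfs | grep libhdfs.so.0
```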

Thanks

::DISCLAIMER::
-----------------------------------------------------------------------------------------------------------------------

The contents of this e-mail and any attachment(s) are confidential and intended for the named recipient(s) only.
It shall not attach any liability on the originator or HCL or its affiliates. Any views or opinions presented in
this email are solely those of the author and may not necessarily reflect the opinions of HCL or its affiliates.
Any form of reproduction, dissemination, copying, disclosure, modification, distribution and / or publication of
this message without the prior written consent of the author of this e-mail is strictly prohibited. If you have
received this email in error please delete it and notify the sender immediately. Before opening any mail and
attachments please check them for viruses and defect.

-----------------------------------------------------------------------------------------------------------------------

RE: Mounting HDFS

Posted by Stuti Awasthi <st...@hcl.com>.
Hi Alex, Kartheek,

Thanks for the response. I did a clean, fresh setup on a new 64-bit CentOS machine, currently with OpenJDK installed, and was able to resolve the previous NoClassDefFoundError issue.
Following http://wiki.apache.org/hadoop/MountableHDFS:
The steps worked properly through "Installing", and I am able to mount HDFS on /mnt through the fuse_dfs_wrapper.sh script. When I try to follow the "Deploying" step, I get the following error:

[root@slave ~]# mount /mnt
port=54310,server=slave
fuse-dfs didn't recognize /mnt,-2
fuse-dfs ignoring option -oallow_other
fuse-dfs ignoring option -ousetrash
fuse-dfs ignoring option dev
fuse-dfs ignoring option suid
fuse: unknown option `-oallow_other'

I searched for this and found "http://sourceforge.net/apps/mediawiki/fuse/index.php?title=FAQ#Why_does_fusermount_fail_with_an_Unknown_option_error.3F".
I tried removing the fusermount file from /bin and also removed the symlink from /usr/bin, but I am still getting this issue.
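One hedged possibility (unverified against this particular fuse-dfs build): the `fuse-dfs ignoring option -oallow_other` and `fuse: unknown option` messages suggest the `-o` prefixes from the wiki's fstab line are reaching FUSE unparsed. In /etc/fstab the fourth field is normally a bare comma-separated option list, since mount(8) adds the `-o` itself when invoking the helper. A sketch, reusing the host and port from this thread:

```
# /etc/fstab sketch: options field as a plain comma-separated list,
# with no embedded -o prefixes.
fuse_dfs#dfs://slave:54310  /mnt  fuse  allow_other,usetrash,rw  0 0
```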

Please suggest.
Thanks

From: kartheek muthyala [mailto:kartheek0274@gmail.com]
Sent: Tuesday, January 10, 2012 9:31 AM
To: hdfs-user@hadoop.apache.org
Subject: Re: Mounting HDFS

Stuti,
Can you send us the syslog details? (/var/log/syslog)
Thanks,
Kartheek.
On Tue, Jan 10, 2012 at 1:36 AM, alo.alt <wg...@googlemail.com> wrote:
Hmm, do you have fuse installed?
http://fuse.sourceforge.net/

- Alex

--
Alexander Lorenz
http://mapredit.blogspot.com
On Jan 9, 2012, at 3:55 AM, Stuti Awasthi wrote:

> Hi Alo,
> I tried a fresh build of fuse-dfs and this time I mounted in /mnt, but I am facing the same issues. I edited the "fuse_dfs_wrapper.sh" script and added $JAVA_HOME/jre/lib/*.jar to the CLASSPATH, echoed the env variable, and then tried again, but no luck. I am getting the same error. Below is the trace:
>
> [root@slave fuse-dfs]# ./fuse_dfs_wrapper.sh dfs://slave:54310 /mnt -d
>
> HADOOP_HOME= /root/MountHDFS1/hadoop-0.20.2
>
> CLASSPATH= :ls:/root/MountHDFS1/hadoop-0.20.2/lib/commons-cli-1.2.jar:/root/MountHDFS1/hadoop-0.20.2/lib/commons-codec-1.3.jar:/root/MountHDFS1/hadoop-0.20.2/lib/commons-el-1.0.jar:/root/MountHDFS1/hadoop-0.20.2/lib/commons-httpclient-3.0.1.jar:/root/MountHDFS1/hadoop-0.20.2/lib/commons-logging-1.0.4.jar:/root/MountHDFS1/hadoop-0.20.2/lib/commons-logging-api-1.0.4.jar:/root/MountHDFS1/hadoop-0.20.2/lib/commons-net-1.4.1.jar:/root/MountHDFS1/hadoop-0.20.2/lib/core-3.1.1.jar:/root/MountHDFS1/hadoop-0.20.2/lib/hsqldb-1.8.0.10.jar:/root/MountHDFS1/hadoop-0.20.2/lib/jasper-compiler-5.5.12.jar:/root/MountHDFS1/hadoop-0.20.2/lib/jasper-runtime-5.5.12.jar:/root/MountHDFS1/hadoop-0.20.2/lib/jets3t-0.6.1.jar:/root/MountHDFS1/hadoop-0.20.2/lib/jetty-6.1.14.jar:/root/MountHDFS1/hadoop-0.20.2/lib/jetty-util-6.1.14.jar:/root/MountHDFS1/hadoop-0.20.2/lib/junit-3.8.1.jar:/root/MountHDFS1/hadoop-0.20.2/lib/kfs-0.2.2.jar:/root/MountHDFS1/hadoop-0.20.2/lib/log4j-1.2.15.jar:/root/MountHDFS1/hadoop-0.20.2/lib/mockito-all-1.8.0.jar:/root/MountHDFS1/hadoop-0.20.2/lib/oro-2.0.8.jar:/root/MountHDFS1/hadoop-0.20.2/lib/servlet-api-2.5-6.1.14.jar:/root/MountHDFS1/hadoop-0.20.2/lib/slf4j-api-1.4.3.jar:/root/MountHDFS1/hadoop-0.20.2/lib/slf4j-log4j12-1.4.3.jar:/root/MountHDFS1/hadoop-0.20.2/lib/xmlenc-0.52.jar:/root/MountHDFS1/hadoop-0.20.2/hadoop-0.20.2-ant.jar:/root/MountHDFS1/hadoop-0.20.2/hadoop-0.20.2-core.jar:/root/MountHDFS1/hadoop-0.20.2/hadoop-0.20.2-examples.jar:/root/MountHDFS1/hadoop-0.20.2/hadoop-0.20.2-test.jar:/root/MountHDFS1/hadoop-0.20.2/hadoop-0.20.2-tools.jar:ls:/usr/java/jdk1.6.0_30/jre/lib/alt-rt.jar:/usr/java/jdk1.6.0_30/jre/lib/alt-string.jar:/usr/java/jdk1.6.0_30/jre/lib/charsets.jar:/usr/java/jdk1.6.0_30/jre/lib/deploy.jar:/usr/java/jdk1.6.0_30/jre/lib/javaws.jar:/usr/java/jdk1.6.0_30/jre/lib/jce.jar:/usr/java/jdk1.6.0_30/jre/lib/jsse.jar:/usr/java/jdk1.6.0_30/jre/lib/management-agent.jar:/usr/java/jdk1.6.0_30/jre/lib/plugin.jar:/usr/java/jdk1.6.0_30/jre/lib/resources
.jar:/usr/java/jdk1.6.0_30/jre/lib/rt.jar:/usr/java/jdk1.6.0_30/bin
>
> JAVA_HOME= /usr/java/jdk1.6.0_30
>
> LD_LIBRARY_PATH= /usr/lib:/usr/local/lib:/root/MountHDFS1/hadoop-0.20.2/build/libhdfs:/usr/java/jdk1.6.0_30/jre/lib/i386/server/:/lib/libfuse.so
>
> port=54310,server=slave
> fuse-dfs didn't recognize /mnt,-2
> fuse-dfs ignoring option -d
> unique: 1, opcode: INIT (26), nodeid: 0, insize: 56
> INIT: 7.10
> flags=0x0000000b
> max_readahead=0x00020000
>   INIT: 7.8
>   flags=0x00000001
>   max_readahead=0x00020000
>   max_write=0x00020000
>   unique: 1, error: 0 (Success), outsize: 40
> unique: 2, opcode: GETATTR (3), nodeid: 1, insize: 56
> Error occurred during initialization of VM
> java/lang/NoClassDefFoundError: java/lang/Object
>
> I'm stuck with this. This is the output with a fresh build of fuse-dfs. The CLASSPATH variable contains the Hadoop *.jar files and the $JAVA_HOME/jre/lib/*.jar files, and JAVA_HOME is also set differently.
> What am I doing wrong? Any idea?
>
> Thanks
>
>
> -----Original Message-----
> From: alo.alt [mailto:wget.null@googlemail.com]
> Sent: Saturday, January 07, 2012 11:20 PM
> To: hdfs-user@hadoop.apache.org
> Subject: Re: Mounting HDFS
>
> try:
> ./fuse_dfs_wrapper.sh dfs://namenode.local:<PORT> /MOUNT_POINT -d
>
> MOUNT_POINT has to be writable; try to use a mount point under / like /hdfs or similar. Also remove the trailing /.
>
> - Alex
>
>
> --
> Alexander Lorenz
> http://mapredit.blogspot.com
>
> On Jan 7, 2012, at 12:35 AM, Stuti Awasthi wrote:
>
>> Hi Alo,Srivas,
>>
>> Thanks for pointing this out. I am still getting the same error. This time I also echoed the environment variable values from fuse_dfs_wrapper.sh.
>>
>> [root@slave fuse-dfs]# ./fuse_dfs_wrapper.sh dfs://slave:54310
>> /root/FreshMount/mnt1/ -d
>>
>> CLASSPATH=/usr/java/jdk1.6.0_30/jre/lib/rt.jar:/usr/java/jdk1.6.0_30/j
>> re/lib/jce.jar:/usr/java/jdk1.6.0_30/jre/lib/javaws.jar:/usr/java/jdk1
>> .6.0_30/jre/lib/deploy.jar:/usr/java/jdk1.6.0_30/jre/lib/jsse.jar:/usr
>> /java/jdk1.6.0_30/jre/lib/plugin.jar:ls:/root/FreshMount/hadoop-0.20.2
>> /lib/commons-cli-1.2.jar:/root/FreshMount/hadoop-0.20.2/lib/commons-co
>> dec-1.3.jar:/root/FreshMount/hadoop-0.20.2/lib/commons-el-1.0.jar:/roo
>> t/FreshMount/hadoop-0.20.2/lib/commons-httpclient-3.0.1.jar:/root/Fres
>> hMount/hadoop-0.20.2/lib/commons-logging-1.0.4.jar:/root/FreshMount/ha
>> doop-0.20.2/lib/commons-logging-api-1.0.4.jar:/root/FreshMount/hadoop-
>> 0.20.2/lib/commons-net-1.4.1.jar:/root/FreshMount/hadoop-0.20.2/lib/co
>> re-3.1.1.jar:/root/FreshMount/hadoop-0.20.2/lib/hsqldb-1.8.0.10.jar:/r
>> oot/FreshMount/hadoop-0.20.2/lib/jasper-compiler-5.5.12.jar:/root/Fres
>> hMount/hadoop-0.20.2/lib/jasper-runtime-5.5.12.jar:/root/FreshMount/ha
>> doop-0.20.2/lib/jets3t-0.6.1.jar:/root/FreshMount/hadoop-0.20.2/lib/je
>> tty-6.1.14.jar:/root/FreshMount/hadoop-0.20.2/lib/jetty-util-6.1.14.ja
>> r:/root/FreshMount/hadoop-0.20.2/lib/junit-3.8.1.jar:/root/FreshMount/
>> hadoop-0.20.2/lib/kfs-0.2.2.jar:/root/FreshMount/hadoop-0.20.2/lib/log
>> 4j-1.2.15.jar:/root/FreshMount/hadoop-0.20.2/lib/mockito-all-1.8.0.jar
>> :/root/FreshMount/hadoop-0.20.2/lib/oro-2.0.8.jar:/root/FreshMount/had
>> oop-0.20.2/lib/servlet-api-2.5-6.1.14.jar:/root/FreshMount/hadoop-0.20
>> .2/lib/slf4j-api-1.4.3.jar:/root/FreshMount/hadoop-0.20.2/lib/slf4j-lo
>> g4j12-1.4.3.jar:/root/FreshMount/hadoop-0.20.2/lib/xmlenc-0.52.jar:/ro
>> ot/FreshMount/hadoop-0.20.2/hadoop-0.20.2-ant.jar:/root/FreshMount/had
>> oop-0.20.2/hadoop-0.20.2-core.jar:/root/FreshMount/hadoop-0.20.2/hadoo
>> p-0.20.2-examples.jar:/root/FreshMount/hadoop-0.20.2/hadoop-0.20.2-tes
>> t.jar:/root/FreshMount/hadoop-0.20.2/hadoop-0.20.2-tools.jar
>>
>> LD_LIBRARY_PATH=/usr/lib:/usr/local/lib:/root/FreshMount/hadoop-0.20.2
>> /build/libhdfs:/usr/java/jdk1.6.0_30/jre/lib/i386/server/:/lib/libfuse
>> .so
>>
>> JAVA_HOME=/usr/java/jdk1.6.0_30
>>
>> Error:
>> port=54310,server=slave
>> fuse-dfs didn't recognize /root/FreshMount/mnt1/,-2 fuse-dfs ignoring
>> option -d
>> unique: 1, opcode: INIT (26), nodeid: 0, insize: 56
>> INIT: 7.10
>> flags=0x0000000b
>> max_readahead=0x00020000
>>  INIT: 7.8
>>  flags=0x00000001
>>  max_readahead=0x00020000
>>  max_write=0x00020000
>>  unique: 1, error: 0 (Success), outsize: 40
>> unique: 2, opcode: GETATTR (3), nodeid: 1, insize: 56 Error occurred
>> during initialization of VM
>> java/lang/NoClassDefFoundError: java/lang/Object
>>
>> -----Original Message-----
>> From: alo.alt [mailto:wget.null@googlemail.com]
>> Sent: Friday, January 06, 2012 10:41 PM
>> To: hdfs-user@hadoop.apache.org
>> Subject: Re: Mounting HDFS
>>
>> Stuti, define in CLASSPATH="...." only the jars you really need. Exporting all jars in a given directory (done with *.jar) is a red flag.
>>
>> - Alex
>>
>>
>> On Jan 6, 2012, at 7:23 AM, M. C. Srivas wrote:
>>
>>>
>>> unique: 1, error: 0 (Success), outsize: 40
>>> unique: 2, opcode: GETATTR (3), nodeid: 1, insize: 56 Error occurred
>>> during initialization of VM
>>> java/lang/NoClassDefFoundError: java/lang/Object
>>>
>>> Exported Environment Variable:
>>>
>>> CLASSPATH="/root/FreshMount/hadoop-0.20.2/lib/*.jar:/root/FreshMount/hadoop-0.20.2/*.jar:/usr/bin/java:/usr/local/lib:/usr/lib:/usr/:/usr/java/jdk1.6.0_26/jre/lib/rt.jar:/usr/java/jdk1.6.0_26/jre/lib/"
>>>
>>>
>>> CLASSPATH is a list of jars, not a list of directories
>>>
>>>
>>> I know that this is simple Java Classpath Error but I have set JAVA_HOME correctly.
>>>
>>> [root@slave ~]# which java
>>> /usr/bin/java
>>>
>>
>>
>
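Alex's and Srivas's advice quoted above (CLASSPATH must be a colon-separated list of jar files, not directories or unexpanded `*.jar` globs inside a quoted string) can be sketched as follows. The directory and jar names here are a throwaway illustration standing in for $HADOOP_HOME/lib:

```shell
# Build CLASSPATH as an explicit colon-separated list of jar files.
# Demonstrated against a temporary directory; on a real system, point
# LIB at $HADOOP_HOME/lib and also append the hadoop-*-core.jar.
LIB=$(mktemp -d)
touch "$LIB/commons-cli-1.2.jar" "$LIB/commons-logging-1.0.4.jar"

CLASSPATH=
for jar in "$LIB"/*.jar; do
  # The glob expands here, in the shell, so java receives real paths.
  CLASSPATH="${CLASSPATH:+$CLASSPATH:}$jar"
done
export CLASSPATH
echo "$CLASSPATH"
```

The key point is that the shell expands the glob before the value is exported; a literal `*.jar` embedded in a quoted CLASSPATH string is never expanded by the JVM of this vintage.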


Re: Mounting HDFS

Posted by kartheek muthyala <ka...@gmail.com>.
Stuti,
Can you send us the syslog details? (/var/log/syslog)
Thanks,
Kartheek.


Re: Mounting HDFS

Posted by "alo.alt" <wg...@googlemail.com>.
Hmm, do you have fuse installed? 
http://fuse.sourceforge.net/

- Alex

--
Alexander Lorenz
http://mapredit.blogspot.com



RE: Mounting HDFS

Posted by Stuti Awasthi <st...@hcl.com>.
Hi Alo,
I tried a fresh build of fuse-dfs and this time I mounted in /mnt, but I am facing the same issues. I edited the "fuse_dfs_wrapper.sh" script and added $JAVA_HOME/jre/lib/*.jar to the CLASSPATH, echoed the env variable, and then tried again, but no luck. I am getting the same error. Below is the trace:

[root@slave fuse-dfs]# ./fuse_dfs_wrapper.sh dfs://slave:54310 /mnt -d

HADOOP_HOME= /root/MountHDFS1/hadoop-0.20.2

CLASSPATH= :ls:/root/MountHDFS1/hadoop-0.20.2/lib/commons-cli-1.2.jar:/root/MountHDFS1/hadoop-0.20.2/lib/commons-codec-1.3.jar:/root/MountHDFS1/hadoop-0.20.2/lib/commons-el-1.0.jar:/root/MountHDFS1/hadoop-0.20.2/lib/commons-httpclient-3.0.1.jar:/root/MountHDFS1/hadoop-0.20.2/lib/commons-logging-1.0.4.jar:/root/MountHDFS1/hadoop-0.20.2/lib/commons-logging-api-1.0.4.jar:/root/MountHDFS1/hadoop-0.20.2/lib/commons-net-1.4.1.jar:/root/MountHDFS1/hadoop-0.20.2/lib/core-3.1.1.jar:/root/MountHDFS1/hadoop-0.20.2/lib/hsqldb-1.8.0.10.jar:/root/MountHDFS1/hadoop-0.20.2/lib/jasper-compiler-5.5.12.jar:/root/MountHDFS1/hadoop-0.20.2/lib/jasper-runtime-5.5.12.jar:/root/MountHDFS1/hadoop-0.20.2/lib/jets3t-0.6.1.jar:/root/MountHDFS1/hadoop-0.20.2/lib/jetty-6.1.14.jar:/root/MountHDFS1/hadoop-0.20.2/lib/jetty-util-6.1.14.jar:/root/MountHDFS1/hadoop-0.20.2/lib/junit-3.8.1.jar:/root/MountHDFS1/hadoop-0.20.2/lib/kfs-0.2.2.jar:/root/MountHDFS1/hadoop-0.20.2/lib/log4j-1.2.15.jar:/root/MountHDFS1/hadoop-0.20.2/lib/mockito-all-1.8.0.jar:/root/MountHDFS1/hadoop-0.20.2/lib/oro-2.0.8.jar:/root/MountHDFS1/hadoop-0.20.2/lib/servlet-api-2.5-6.1.14.jar:/root/MountHDFS1/hadoop-0.20.2/lib/slf4j-api-1.4.3.jar:/root/MountHDFS1/hadoop-0.20.2/lib/slf4j-log4j12-1.4.3.jar:/root/MountHDFS1/hadoop-0.20.2/lib/xmlenc-0.52.jar:/root/MountHDFS1/hadoop-0.20.2/hadoop-0.20.2-ant.jar:/root/MountHDFS1/hadoop-0.20.2/hadoop-0.20.2-core.jar:/root/MountHDFS1/hadoop-0.20.2/hadoop-0.20.2-examples.jar:/root/MountHDFS1/hadoop-0.20.2/hadoop-0.20.2-test.jar:/root/MountHDFS1/hadoop-0.20.2/hadoop-0.20.2-tools.jar:ls:/usr/java/jdk1.6.0_30/jre/lib/alt-rt.jar:/usr/java/jdk1.6.0_30/jre/lib/alt-string.jar:/usr/java/jdk1.6.0_30/jre/lib/charsets.jar:/usr/java/jdk1.6.0_30/jre/lib/deploy.jar:/usr/java/jdk1.6.0_30/jre/lib/javaws.jar:/usr/java/jdk1.6.0_30/jre/lib/jce.jar:/usr/java/jdk1.6.0_30/jre/lib/jsse.jar:/usr/java/jdk1.6.0_30/jre/lib/management-agent.jar:/usr/java/jdk1.6.0_30/jre/lib/plugin.jar:/usr/java/jdk1.6.0_30/jre/lib/resources.jar:/usr/java/jdk1.6.0_30/jre/lib/rt.jar:/usr/java/jdk1.6.0_30/bin

JAVA_HOME= /usr/java/jdk1.6.0_30

LD_LIBRARY_PATH= /usr/lib:/usr/local/lib:/root/MountHDFS1/hadoop-0.20.2/build/libhdfs:/usr/java/jdk1.6.0_30/jre/lib/i386/server/:/lib/libfuse.so

port=54310,server=slave
fuse-dfs didn't recognize /mnt,-2
fuse-dfs ignoring option -d
unique: 1, opcode: INIT (26), nodeid: 0, insize: 56
INIT: 7.10
flags=0x0000000b
max_readahead=0x00020000
   INIT: 7.8
   flags=0x00000001
   max_readahead=0x00020000
   max_write=0x00020000
   unique: 1, error: 0 (Success), outsize: 40
unique: 2, opcode: GETATTR (3), nodeid: 1, insize: 56
Error occurred during initialization of VM
java/lang/NoClassDefFoundError: java/lang/Object 

I'm stuck with this. The above is the output from a fresh build of fuse-dfs. The CLASSPATH variable contains the Hadoop jars and the $JAVA_HOME/jre/lib/*.jar files, and JAVA_HOME is also set (differently this time).
What am I doing wrong? Any ideas?
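[Editor's note] A likely culprit is the stray "ls" entries visible in the CLASSPATH above, which look like command output leaking into the variable. As a hedged sketch (the install path is illustrative, not taken from a confirmed setup), a CLASSPATH can be assembled from the jar files actually on disk like this:

```shell
#!/bin/sh
# build_classpath DIR: print a colon-separated CLASSPATH containing every
# .jar directly under DIR and under DIR/lib (the stock hadoop-0.20.2
# layout). Globs are expanded by the shell, so no "ls" output can leak in.
build_classpath() {
  dir=$1
  cp=""
  for jar in "$dir"/lib/*.jar "$dir"/*.jar; do
    [ -e "$jar" ] || continue   # skip globs that matched nothing
    cp="$cp:$jar"
  done
  printf '%s\n' "${cp#:}"       # drop the leading colon
}

# Illustrative use:
#   export CLASSPATH=$(build_classpath /root/MountHDFS1/hadoop-0.20.2)
```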

Thanks


-----Original Message-----
From: alo.alt [mailto:wget.null@googlemail.com] 
Sent: Saturday, January 07, 2012 11:20 PM
To: hdfs-user@hadoop.apache.org
Subject: Re: Mounting HDFS

try:
./fuse_dfs_wrapper.sh dfs://namenode.local:<PORT> /MOUNT_POINT -d

MOUNT_POINT has to be writable; try to use a mount point directly under /, like /hdfs or similar. Also remove the trailing /.
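[Editor's note] The trailing-slash warning matches the "fuse-dfs didn't recognize /root/FreshMount/mnt1/,-2" lines in the logs above. A tiny sketch of normalizing the mount point argument before handing it to the wrapper (the namenode address is illustrative):

```shell
#!/bin/sh
# strip_trailing_slash PATH: print PATH without a trailing "/",
# since fuse-dfs rejects mount points written with one.
strip_trailing_slash() {
  printf '%s\n' "${1%/}"
}

# Illustrative use:
#   ./fuse_dfs_wrapper.sh dfs://namenode.local:54310 "$(strip_trailing_slash /hdfs/)" -d
```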

- Alex 


--
Alexander Lorenz
http://mapredit.blogspot.com

On Jan 7, 2012, at 12:35 AM, Stuti Awasthi wrote:

> Hi Alo,Srivas,
> 
> Thanks for pointing this out. I am still getting the same error. This time, with fuse_dfs_wrapper.sh, I also echoed the environment variable values.
> 
> [root@slave fuse-dfs]# ./fuse_dfs_wrapper.sh dfs://slave:54310 
> /root/FreshMount/mnt1/ -d
> 
> CLASSPATH=/usr/java/jdk1.6.0_30/jre/lib/rt.jar:/usr/java/jdk1.6.0_30/j
> re/lib/jce.jar:/usr/java/jdk1.6.0_30/jre/lib/javaws.jar:/usr/java/jdk1
> .6.0_30/jre/lib/deploy.jar:/usr/java/jdk1.6.0_30/jre/lib/jsse.jar:/usr
> /java/jdk1.6.0_30/jre/lib/plugin.jar:ls:/root/FreshMount/hadoop-0.20.2
> /lib/commons-cli-1.2.jar:/root/FreshMount/hadoop-0.20.2/lib/commons-co
> dec-1.3.jar:/root/FreshMount/hadoop-0.20.2/lib/commons-el-1.0.jar:/roo
> t/FreshMount/hadoop-0.20.2/lib/commons-httpclient-3.0.1.jar:/root/Fres
> hMount/hadoop-0.20.2/lib/commons-logging-1.0.4.jar:/root/FreshMount/ha
> doop-0.20.2/lib/commons-logging-api-1.0.4.jar:/root/FreshMount/hadoop-
> 0.20.2/lib/commons-net-1.4.1.jar:/root/FreshMount/hadoop-0.20.2/lib/co
> re-3.1.1.jar:/root/FreshMount/hadoop-0.20.2/lib/hsqldb-1.8.0.10.jar:/r
> oot/FreshMount/hadoop-0.20.2/lib/jasper-compiler-5.5.12.jar:/root/Fres
> hMount/hadoop-0.20.2/lib/jasper-runtime-5.5.12.jar:/root/FreshMount/ha
> doop-0.20.2/lib/jets3t-0.6.1.jar:/root/FreshMount/hadoop-0.20.2/lib/je
> tty-6.1.14.jar:/root/FreshMount/hadoop-0.20.2/lib/jetty-util-6.1.14.ja
> r:/root/FreshMount/hadoop-0.20.2/lib/junit-3.8.1.jar:/root/FreshMount/
> hadoop-0.20.2/lib/kfs-0.2.2.jar:/root/FreshMount/hadoop-0.20.2/lib/log
> 4j-1.2.15.jar:/root/FreshMount/hadoop-0.20.2/lib/mockito-all-1.8.0.jar
> :/root/FreshMount/hadoop-0.20.2/lib/oro-2.0.8.jar:/root/FreshMount/had
> oop-0.20.2/lib/servlet-api-2.5-6.1.14.jar:/root/FreshMount/hadoop-0.20
> .2/lib/slf4j-api-1.4.3.jar:/root/FreshMount/hadoop-0.20.2/lib/slf4j-lo
> g4j12-1.4.3.jar:/root/FreshMount/hadoop-0.20.2/lib/xmlenc-0.52.jar:/ro
> ot/FreshMount/hadoop-0.20.2/hadoop-0.20.2-ant.jar:/root/FreshMount/had
> oop-0.20.2/hadoop-0.20.2-core.jar:/root/FreshMount/hadoop-0.20.2/hadoo
> p-0.20.2-examples.jar:/root/FreshMount/hadoop-0.20.2/hadoop-0.20.2-tes
> t.jar:/root/FreshMount/hadoop-0.20.2/hadoop-0.20.2-tools.jar
> 
> LD_LIBRARY_PATH=/usr/lib:/usr/local/lib:/root/FreshMount/hadoop-0.20.2
> /build/libhdfs:/usr/java/jdk1.6.0_30/jre/lib/i386/server/:/lib/libfuse
> .so
> 
> JAVA_HOME=/usr/java/jdk1.6.0_30
> 
> Error:
> port=54310,server=slave
> fuse-dfs didn't recognize /root/FreshMount/mnt1/,-2 fuse-dfs ignoring 
> option -d
> unique: 1, opcode: INIT (26), nodeid: 0, insize: 56
> INIT: 7.10
> flags=0x0000000b
> max_readahead=0x00020000
>   INIT: 7.8
>   flags=0x00000001
>   max_readahead=0x00020000
>   max_write=0x00020000
>   unique: 1, error: 0 (Success), outsize: 40
> unique: 2, opcode: GETATTR (3), nodeid: 1, insize: 56 Error occurred 
> during initialization of VM
> java/lang/NoClassDefFoundError: java/lang/Object
> 
> -----Original Message-----
> From: alo.alt [mailto:wget.null@googlemail.com]
> Sent: Friday, January 06, 2012 10:41 PM
> To: hdfs-user@hadoop.apache.org
> Subject: Re: Mounting HDFS
> 
> Stuti, define in CLASSPATH="...." only the jars you really need. Exporting all the jars in a given directory via *.jar is a red flag.
> 
> - Alex
> 
> 
> On Jan 6, 2012, at 7:23 AM, M. C. Srivas wrote:
> 
>> 
>>  unique: 1, error: 0 (Success), outsize: 40
>> unique: 2, opcode: GETATTR (3), nodeid: 1, insize: 56 Error occurred 
>> during initialization of VM
>> java/lang/NoClassDefFoundError: java/lang/Object
>> 
>> Exported Environment Variable:
>> 
>> CLASSPATH="/root/FreshMount/hadoop-0.20.2/lib/*.jar:/root/FreshMount/hadoop-0.20.2/*.jar:/usr/bin/java:/usr/local/lib:/usr/lib:/usr/:/usr/java/jdk1.6.0_26/jre/lib/rt.jar:/usr/java/jdk1.6.0_26/jre/lib/"
>> 
>> 
>> CLASSPATH is a list of jars, not a list of directories
>> 
>> 
>> I know that this is simple Java Classpath Error but I have set JAVA_HOME correctly.
>> 
>> [root@slave ~]# which java
>> /usr/bin/java
>> 
> 
> 


Re: Mounting HDFS

Posted by "alo.alt" <wg...@googlemail.com>.
try:
./fuse_dfs_wrapper.sh dfs://namenode.local:<PORT> /MOUNT_POINT -d

MOUNT_POINT has to be writable; try to use a mount point directly under /, like /hdfs or similar. Also remove the trailing /.

- Alex 


--
Alexander Lorenz
http://mapredit.blogspot.com

On Jan 7, 2012, at 12:35 AM, Stuti Awasthi wrote:

> Hi Alo,Srivas,
> 
> Thanks for pointing this out. I am still getting the same error. This time, with fuse_dfs_wrapper.sh, I also echoed the environment variable values.
> 
> [root@slave fuse-dfs]# ./fuse_dfs_wrapper.sh dfs://slave:54310 /root/FreshMount/mnt1/ -d
> 
> CLASSPATH=/usr/java/jdk1.6.0_30/jre/lib/rt.jar:/usr/java/jdk1.6.0_30/jre/lib/jce.jar:/usr/java/jdk1.6.0_30/jre/lib/javaws.jar:/usr/java/jdk1.6.0_30/jre/lib/deploy.jar:/usr/java/jdk1.6.0_30/jre/lib/jsse.jar:/usr/java/jdk1.6.0_30/jre/lib/plugin.jar:ls:/root/FreshMount/hadoop-0.20.2/lib/commons-cli-1.2.jar:/root/FreshMount/hadoop-0.20.2/lib/commons-codec-1.3.jar:/root/FreshMount/hadoop-0.20.2/lib/commons-el-1.0.jar:/root/FreshMount/hadoop-0.20.2/lib/commons-httpclient-3.0.1.jar:/root/FreshMount/hadoop-0.20.2/lib/commons-logging-1.0.4.jar:/root/FreshMount/hadoop-0.20.2/lib/commons-logging-api-1.0.4.jar:/root/FreshMount/hadoop-0.20.2/lib/commons-net-1.4.1.jar:/root/FreshMount/hadoop-0.20.2/lib/core-3.1.1.jar:/root/FreshMount/hadoop-0.20.2/lib/hsqldb-1.8.0.10.jar:/root/FreshMount/hadoop-0.20.2/lib/jasper-compiler-5.5.12.jar:/root/FreshMount/hadoop-0.20.2/lib/jasper-runtime-5.5.12.jar:/root/FreshMount/hadoop-0.20.2/lib/jets3t-0.6.1.jar:/root/FreshMount/hadoop-0.20.2/lib/jetty-6.1.14.jar:/root/FreshMount/hadoop-0.20.2/lib/jetty-util-6.1.14.jar:/root/FreshMount/hadoop-0.20.2/lib/junit-3.8.1.jar:/root/FreshMount/hadoop-0.20.2/lib/kfs-0.2.2.jar:/root/FreshMount/hadoop-0.20.2/lib/log4j-1.2.15.jar:/root/FreshMount/hadoop-0.20.2/lib/mockito-all-1.8.0.jar:/root/FreshMount/hadoop-0.20.2/lib/oro-2.0.8.jar:/root/FreshMount/hadoop-0.20.2/lib/servlet-api-2.5-6.1.14.jar:/root/FreshMount/hadoop-0.20.2/lib/slf4j-api-1.4.3.jar:/root/FreshMount/hadoop-0.20.2/lib/slf4j-log4j12-1.4.3.jar:/root/FreshMount/hadoop-0.20.2/lib/xmlenc-0.52.jar:/root/FreshMount/hadoop-0.20.2/hadoop-0.20.2-ant.jar:/root/FreshMount/hadoop-0.20.2/hadoop-0.20.2-core.jar:/root/FreshMount/hadoop-0.20.2/hadoop-0.20.2-examples.jar:/root/FreshMount/hadoop-0.20.2/hadoop-0.20.2-test.jar:/root/FreshMount/hadoop-0.20.2/hadoop-0.20.2-tools.jar
> 
> LD_LIBRARY_PATH=/usr/lib:/usr/local/lib:/root/FreshMount/hadoop-0.20.2/build/libhdfs:/usr/java/jdk1.6.0_30/jre/lib/i386/server/:/lib/libfuse.so
> 
> JAVA_HOME=/usr/java/jdk1.6.0_30
> 
> Error:
> port=54310,server=slave
> fuse-dfs didn't recognize /root/FreshMount/mnt1/,-2
> fuse-dfs ignoring option -d
> unique: 1, opcode: INIT (26), nodeid: 0, insize: 56
> INIT: 7.10
> flags=0x0000000b
> max_readahead=0x00020000
>   INIT: 7.8
>   flags=0x00000001
>   max_readahead=0x00020000
>   max_write=0x00020000
>   unique: 1, error: 0 (Success), outsize: 40
> unique: 2, opcode: GETATTR (3), nodeid: 1, insize: 56
> Error occurred during initialization of VM
> java/lang/NoClassDefFoundError: java/lang/Object
> 
> -----Original Message-----
> From: alo.alt [mailto:wget.null@googlemail.com]
> Sent: Friday, January 06, 2012 10:41 PM
> To: hdfs-user@hadoop.apache.org
> Subject: Re: Mounting HDFS
> 
> Stuti, define in CLASSPATH="...." only the jars you really need. Exporting all the jars in a given directory via *.jar is a red flag.
> 
> - Alex
> 
> 
> On Jan 6, 2012, at 7:23 AM, M. C. Srivas wrote:
> 
>> 
>>  unique: 1, error: 0 (Success), outsize: 40
>> unique: 2, opcode: GETATTR (3), nodeid: 1, insize: 56 Error occurred during initialization of VM
>> java/lang/NoClassDefFoundError: java/lang/Object
>> 
>> Exported Environment Variable:
>> 
>> CLASSPATH="/root/FreshMount/hadoop-0.20.2/lib/*.jar:/root/FreshMount/hadoop-0.20.2/*.jar:/usr/bin/java:/usr/local/lib:/usr/lib:/usr/:/usr/java/jdk1.6.0_26/jre/lib/rt.jar:/usr/java/jdk1.6.0_26/jre/lib/"
>> 
>> 
>> CLASSPATH is a list of jars, not a list of directories
>> 
>> 
>> I know that this is simple Java Classpath Error but I have set JAVA_HOME correctly.
>> 
>> [root@slave ~]# which java
>> /usr/bin/java
>> 
> 
> 


RE: Mounting HDFS

Posted by Stuti Awasthi <st...@hcl.com>.
Hi Alo,Srivas,

Thanks for pointing this out. I am still getting the same error. This time, with fuse_dfs_wrapper.sh, I also echoed the environment variable values.

[root@slave fuse-dfs]# ./fuse_dfs_wrapper.sh dfs://slave:54310 /root/FreshMount/mnt1/ -d

CLASSPATH=/usr/java/jdk1.6.0_30/jre/lib/rt.jar:/usr/java/jdk1.6.0_30/jre/lib/jce.jar:/usr/java/jdk1.6.0_30/jre/lib/javaws.jar:/usr/java/jdk1.6.0_30/jre/lib/deploy.jar:/usr/java/jdk1.6.0_30/jre/lib/jsse.jar:/usr/java/jdk1.6.0_30/jre/lib/plugin.jar:ls:/root/FreshMount/hadoop-0.20.2/lib/commons-cli-1.2.jar:/root/FreshMount/hadoop-0.20.2/lib/commons-codec-1.3.jar:/root/FreshMount/hadoop-0.20.2/lib/commons-el-1.0.jar:/root/FreshMount/hadoop-0.20.2/lib/commons-httpclient-3.0.1.jar:/root/FreshMount/hadoop-0.20.2/lib/commons-logging-1.0.4.jar:/root/FreshMount/hadoop-0.20.2/lib/commons-logging-api-1.0.4.jar:/root/FreshMount/hadoop-0.20.2/lib/commons-net-1.4.1.jar:/root/FreshMount/hadoop-0.20.2/lib/core-3.1.1.jar:/root/FreshMount/hadoop-0.20.2/lib/hsqldb-1.8.0.10.jar:/root/FreshMount/hadoop-0.20.2/lib/jasper-compiler-5.5.12.jar:/root/FreshMount/hadoop-0.20.2/lib/jasper-runtime-5.5.12.jar:/root/FreshMount/hadoop-0.20.2/lib/jets3t-0.6.1.jar:/root/FreshMount/hadoop-0.20.2/lib/jetty-6.1.14.jar:/root/FreshMount/hadoop-0.20.2/lib/jetty-util-6.1.14.jar:/root/FreshMount/hadoop-0.20.2/lib/junit-3.8.1.jar:/root/FreshMount/hadoop-0.20.2/lib/kfs-0.2.2.jar:/root/FreshMount/hadoop-0.20.2/lib/log4j-1.2.15.jar:/root/FreshMount/hadoop-0.20.2/lib/mockito-all-1.8.0.jar:/root/FreshMount/hadoop-0.20.2/lib/oro-2.0.8.jar:/root/FreshMount/hadoop-0.20.2/lib/servlet-api-2.5-6.1.14.jar:/root/FreshMount/hadoop-0.20.2/lib/slf4j-api-1.4.3.jar:/root/FreshMount/hadoop-0.20.2/lib/slf4j-log4j12-1.4.3.jar:/root/FreshMount/hadoop-0.20.2/lib/xmlenc-0.52.jar:/root/FreshMount/hadoop-0.20.2/hadoop-0.20.2-ant.jar:/root/FreshMount/hadoop-0.20.2/hadoop-0.20.2-core.jar:/root/FreshMount/hadoop-0.20.2/hadoop-0.20.2-examples.jar:/root/FreshMount/hadoop-0.20.2/hadoop-0.20.2-test.jar:/root/FreshMount/hadoop-0.20.2/hadoop-0.20.2-tools.jar

LD_LIBRARY_PATH=/usr/lib:/usr/local/lib:/root/FreshMount/hadoop-0.20.2/build/libhdfs:/usr/java/jdk1.6.0_30/jre/lib/i386/server/:/lib/libfuse.so

JAVA_HOME=/usr/java/jdk1.6.0_30

Error:
port=54310,server=slave
fuse-dfs didn't recognize /root/FreshMount/mnt1/,-2
fuse-dfs ignoring option -d
unique: 1, opcode: INIT (26), nodeid: 0, insize: 56
INIT: 7.10
flags=0x0000000b
max_readahead=0x00020000
   INIT: 7.8
   flags=0x00000001
   max_readahead=0x00020000
   max_write=0x00020000
   unique: 1, error: 0 (Success), outsize: 40
unique: 2, opcode: GETATTR (3), nodeid: 1, insize: 56
Error occurred during initialization of VM
java/lang/NoClassDefFoundError: java/lang/Object

-----Original Message-----
From: alo.alt [mailto:wget.null@googlemail.com]
Sent: Friday, January 06, 2012 10:41 PM
To: hdfs-user@hadoop.apache.org
Subject: Re: Mounting HDFS

Stuti, define in CLASSPATH="...." only the jars you really need. Exporting all the jars in a given directory via *.jar is a red flag.

- Alex


On Jan 6, 2012, at 7:23 AM, M. C. Srivas wrote:

>
>   unique: 1, error: 0 (Success), outsize: 40
> unique: 2, opcode: GETATTR (3), nodeid: 1, insize: 56 Error occurred during initialization of VM
> java/lang/NoClassDefFoundError: java/lang/Object
>
> Exported Environment Variable:
>
> CLASSPATH="/root/FreshMount/hadoop-0.20.2/lib/*.jar:/root/FreshMount/hadoop-0.20.2/*.jar:/usr/bin/java:/usr/local/lib:/usr/lib:/usr/:/usr/java/jdk1.6.0_26/jre/lib/rt.jar:/usr/java/jdk1.6.0_26/jre/lib/"
>
>
> CLASSPATH is a list of jars, not a list of directories
>
>
> I know that this is simple Java Classpath Error but I have set JAVA_HOME correctly.
>
> [root@slave ~]# which java
> /usr/bin/java
>



Re: Mounting HDFS

Posted by "alo.alt" <wg...@googlemail.com>.
Stuti, define in CLASSPATH="...." only the jars you really need. Exporting all the jars in a given directory via *.jar is a red flag.

- Alex


On Jan 6, 2012, at 7:23 AM, M. C. Srivas wrote:

> 
>   unique: 1, error: 0 (Success), outsize: 40
> unique: 2, opcode: GETATTR (3), nodeid: 1, insize: 56 Error occurred during initialization of VM
> java/lang/NoClassDefFoundError: java/lang/Object
> 
> Exported Environment Variable:
> 
> CLASSPATH="/root/FreshMount/hadoop-0.20.2/lib/*.jar:/root/FreshMount/hadoop-0.20.2/*.jar:/usr/bin/java:/usr/local/lib:/usr/lib:/usr/:/usr/java/jdk1.6.0_26/jre/lib/rt.jar:/usr/java/jdk1.6.0_26/jre/lib/"
>  
> 
> CLASSPATH is a list of jars, not a list of directories
>  
> 
> I know that this is simple Java Classpath Error but I have set JAVA_HOME correctly.
> 
> [root@slave ~]# which java
> /usr/bin/java
> 


Re: Mounting HDFS

Posted by "M. C. Srivas" <mc...@gmail.com>.
>   unique: 1, error: 0 (Success), outsize: 40
> unique: 2, opcode: GETATTR (3), nodeid: 1, insize: 56 Error occurred
> during initialization of VM
> java/lang/NoClassDefFoundError: java/lang/Object
>
> Exported Environment Variable:
>
>
> CLASSPATH="/root/FreshMount/hadoop-0.20.2/lib/*.jar:/root/FreshMount/hadoop-0.20.2/*.jar:/usr/bin/java:/usr/local/lib:/usr/lib:/usr/:/usr/java/jdk1.6.0_26/jre/lib/rt.jar:/usr/java/jdk1.6.0_26/jre/lib/"
>


CLASSPATH is a list of jars, not a list of directories
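[Editor's note] A quick way to see the problem is to check each CLASSPATH entry against the filesystem: the JVM treats every entry as a literal jar file or class directory and does not expand "*.jar" globs. A hedged sketch of such a sanity check:

```shell
#!/bin/sh
# valid_classpath CP: print any colon-separated entry of CP that does not
# exist on disk and fail if one is found. Globbing is disabled so that
# literal "*.jar" entries are flagged instead of being silently expanded.
valid_classpath() {
  set -f                      # treat entries literally, no glob expansion
  old_ifs=$IFS; IFS=:
  rc=0
  for entry in $1; do
    if [ ! -e "$entry" ]; then
      echo "missing: $entry"
      rc=1
    fi
  done
  IFS=$old_ifs
  set +f
  return $rc
}
```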


I know that this is simple Java Classpath Error but I have set JAVA_HOME
> correctly.
>
> [root@slave ~]# which java
> /usr/bin/java
>
>

RE: Mounting HDFS

Posted by Stuti Awasthi <st...@hcl.com>.
Hi Guys,
I have been badly stuck with fuse-dfs for the last 3 days. Following are the errors I am facing:

[root@slave fuse-dfs]# ./fuse_dfs dfs://slave:54310 /root/FreshMount/mnt1/ -d 

port=54310,server=slave
fuse-dfs didn't recognize /root/FreshMount/mnt1/,-2
fuse-dfs ignoring option -d
unique: 1, opcode: INIT (26), nodeid: 0, insize: 56
INIT: 7.10
flags=0x0000000b
max_readahead=0x00020000
   INIT: 7.8
   flags=0x00000001
   max_readahead=0x00020000
   max_write=0x00020000
   unique: 1, error: 0 (Success), outsize: 40
unique: 2, opcode: GETATTR (3), nodeid: 1, insize: 56
Error occurred during initialization of VM
java/lang/NoClassDefFoundError: java/lang/Object

Exported Environment Variable:

CLASSPATH="/root/FreshMount/hadoop-0.20.2/lib/*.jar:/root/FreshMount/hadoop-0.20.2/*.jar:/usr/bin/java:/usr/local/lib:/usr/lib:/usr/:/usr/java/jdk1.6.0_26/jre/lib/rt.jar:/usr/java/jdk1.6.0_26/jre/lib/"
JAVA_HOME="/usr/java/jdk1.6.0_26 "
HADOOP_HOME="/root/FreshMount/hadoop-0.20.2"
LD_LIBRARY_PATH="/usr/lib:/usr/local/lib:/root/FreshMount/hadoop-0.20.2/build/libhdfs/:/usr/java/jdk1.6.0_26/jre/lib/i386/server/:/lib/libfuse.so"
PATH="/root/FreshMount/hadoop-0.20.2/build/contrib/fuse-dfs/fuse_dfs:/usr/kerberos/sbin:/usr/kerberos/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/usr/X11R6/bin:/root/bin:/usr/java/jdk1.6.0_26/bin:/usr/local/lib:/usr/lib:/usr/:/usr/bin/java:/usr/java/jdk1.6.0_26/jre/lib/rt.jar:/usr/java/jdk1.6.0_26/jre/lib/"

I know that this is simple Java Classpath Error but I have set JAVA_HOME correctly.

[root@slave ~]# which java
/usr/bin/java

[root@slave ~]# ls -l /usr/bin/java
lrwxrwxrwx 1 root root 30 Aug  9 16:54 /usr/bin/java -> /usr/java/jdk1.6.0_26/bin/java

[root@slave ~]# java
Usage: java [-options] class [args...]
           (to execute a class)
   or  java [-options] -jar jarfile [args...]
           (to execute a jar file)

Please help me to fix this ..

Thanks in advance



-----Original Message-----
From: Stuti Awasthi 
Sent: Thursday, January 05, 2012 11:04 AM
To: hdfs-user@hadoop.apache.org
Subject: RE: Mounting HDFS

Hi all,
I fixed the previous issue, but now I am getting this:

Entry in /etc/fstab :
fuse_dfs#dfs://7071bcce81d9:54310 /home/jony/FreshHadoop/mnt fuse -oallow_other,rw,-ousetrash 0 0

$ sudo mount /home/jony/FreshHadoop/mnt
port=54310,server=7071bcce81d9
fuse-dfs didn't recognize /home/jony/FreshHadoop/mnt,-2
fuse-dfs ignoring option -oallow_other
fuse-dfs ignoring option -ousetrash
fuse-dfs ignoring option dev
fuse-dfs ignoring option suid
fuse: unknown option `-oallow_other'

I am able to mount with the "fuse_dfs_wrapper.sh" script, though. Any ideas?

Thanks
-----Original Message-----
From: Alexander Lorenz [mailto:wget.null@googlemail.com]
Sent: Wednesday, January 04, 2012 8:37 PM
To: hdfs-user@hadoop.apache.org
Subject: Re: Mounting HDFS

Hi Stuti,

Do a search for libhdfs.so* and also run ldd /path/to/fuse_dfs. It could be that only a symlink is missing. With ldd you will see which libraries the binary wants; if libhdfs.so.1 is not on the path, export the path where you found it.

- Alex

Alexander Lorenz
http://mapredit.blogspot.com

On Jan 4, 2012, at 4:08 AM, Stuti Awasthi <st...@hcl.com> wrote:

> I have already exported it in the env. Output of "export" command. 
> 
> declare -x LD_LIBRARY_PATH="/usr/lib:/usr/local/lib:/home/jony/FreshHadoop/hadoop-0.20.2/build/libhdfs:/usr/lib/jvm/java-6-openjdk/jre/lib/i386/server/:/usr/lib/libfuse.so"
> 
> Stuti
> ________________________________________
> From: Harsh J [harsh@cloudera.com]
> Sent: Wednesday, January 04, 2012 5:34 PM
> To: hdfs-user@hadoop.apache.org
> Subject: Re: Mounting HDFS
> 
> Stuti,
> 
> Your env needs to carry this:
> 
> export
> LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/path/to/dir/where/libhdfs/files/are/
> present
> 
> Otherwise the fuse_dfs binary won't be able to find and load it. The 
> wrapper script does this as part of its setup if you read it.
> 
> On Wed, Jan 4, 2012 at 5:29 PM, Stuti Awasthi <st...@hcl.com> wrote:
>> I'm able to mount using the command:
>> 
>> fuse_dfs_wrapper.sh dfs://<server>:<port> /export/hdfs
>> 
>> -----Original Message-----
>> From: Stuti Awasthi
>> Sent: Wednesday, January 04, 2012 5:24 PM
>> To: hdfs-user@hadoop.apache.org
>> Subject: RE: Mounting HDFS
>> 
>> Harsh,
>> 
>> Output of $file `which fuse_dfs`
>> 
>> /sbin/fuse_dfs: ELF 32-bit LSB executable, Intel 80386, version 1 
>> (SYSV), dynamically linked (uses shared libs), for GNU/Linux 2.6.15, 
>> not stripped
>> 
>> Same output for $ file /sbin/fuse_dfs
>> 
>> Thanks
>> ________________________________________
>> From: Harsh J [harsh@cloudera.com]
>> Sent: Wednesday, January 04, 2012 5:18 PM
>> To: hdfs-user@hadoop.apache.org
>> Subject: Re: Mounting HDFS
>> 
>> Stuti,
>> 
>> My original command was "file `which fuse_dfs`", and not just the which command.
>> 
>> Can you run "file /sbin/fuse_dfs"? You need the utility called 'file' available (it's mostly present).
>> 
>> On 04-Jan-2012, at 5:08 PM, Stuti Awasthi wrote:
>> 
>>> Hi Harsh,
>>> 
>>> Currently I am using 32 bit Ubuntu11.10, Hadoop 0.20.2
>>> 
>>> Output of : $ which fuse_dfs
>>> /sbin/fuse_dfs
>>> 
>>> I searched on the net and got this URL: "http://wiki.apache.org/hadoop/MountableHDFS"
>>> How can I get HDFS fuse deb or rpm packages? Thanks for pointing this out; can you please guide me more on this.
>>> 
>>> Thanks
>>> 
>>> -----Original Message-----
>>> From: Harsh J [mailto:harsh@cloudera.com]
>>> Sent: Wednesday, January 04, 2012 4:51 PM
>>> To: hdfs-user@hadoop.apache.org
>>> Subject: Re: Mounting HDFS
>>> 
>>> Stuti,
>>> 
>>> What's your platform - 32-bits or 64-bits? Which one have you built libhdfs for?
>>> 
>>> What's the output of the following?
>>> $ file `which fuse_dfs`
>>> 
>>> FWIW, the most hassle-free way to do these things today is to use proper packages available for your platform, instead of compiling it yourself. Just a suggestion.
>>> 
>>> On 04-Jan-2012, at 4:28 PM, Stuti Awasthi wrote:
>>> 
>>>> Hi All,
>>>> 
>>>> I am following http://wiki.apache.org/hadoop/MountableHDFS for HDFS mount.
>>>> I have successfully followed the steps till "Installing" and I am able mount it properly. After that I am trying with "Deploying" step and followed the steps:
>>>> 
>>>> 1. add the following to /etc/fstab
>>>> fuse_dfs#dfs://hadoop_server.foo.com:9000 /export/hdfs fuse 
>>>> -oallow_other,rw,-ousetrash 0 0
>>>> 
>>>> 2. added fuse_dfs to /sbin
>>>> 
>>>> 3. export
>>>> LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/build/libhdfs
>>>> 
>>>> 4. Mount using: mount /export/hdfs.
>>>> 
>>>> But getting error :
>>>> fuse_dfs: error while loading shared libraries : libhdfs.so.0: cannot open shared object file: No such file or directory.
>>>> 
>>>> How to fix this ?
>>>> 
>>>> Thanks
>>>> 
>>> 
>> 
> 
> 
> 
> --
> Harsh J

RE: Mounting HDFS

Posted by Stuti Awasthi <st...@hcl.com>.
Hi all,
I fixed the previous issue, but now I am getting this:

Entry in /etc/fstab :
fuse_dfs#dfs://7071bcce81d9:54310 /home/jony/FreshHadoop/mnt fuse -oallow_other,rw,-ousetrash 0 0

$ sudo mount /home/jony/FreshHadoop/mnt
port=54310,server=7071bcce81d9
fuse-dfs didn't recognize /home/jony/FreshHadoop/mnt,-2
fuse-dfs ignoring option -oallow_other
fuse-dfs ignoring option -ousetrash
fuse-dfs ignoring option dev
fuse-dfs ignoring option suid
fuse: unknown option `-oallow_other'

I am able to mount with the "fuse_dfs_wrapper.sh" script, though. Any ideas?
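[Editor's note] One possible cause, offered as a guess rather than a confirmed fix: the fourth field of an /etc/fstab line is already an option list that mount passes through -o, so the literal "-o" prefixes may be exactly what fuse rejects. An fstab entry written without them would look like:

```
fuse_dfs#dfs://7071bcce81d9:54310 /home/jony/FreshHadoop/mnt fuse allow_other,usetrash,rw 0 0
```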

Thanks
-----Original Message-----
From: Alexander Lorenz [mailto:wget.null@googlemail.com] 
Sent: Wednesday, January 04, 2012 8:37 PM
To: hdfs-user@hadoop.apache.org
Subject: Re: Mounting HDFS

Hi Stuti,

Do a search for libhdfs.so* and also run ldd /path/to/fuse_dfs. It could be that only a symlink is missing. With ldd you will see which libraries the binary wants; if libhdfs.so.1 is not on the path, export the path where you found it.

- Alex

Alexander Lorenz
http://mapredit.blogspot.com

On Jan 4, 2012, at 4:08 AM, Stuti Awasthi <st...@hcl.com> wrote:

> I have already exported it in the env. Output of "export" command. 
> 
> declare -x LD_LIBRARY_PATH="/usr/lib:/usr/local/lib:/home/jony/FreshHadoop/hadoop-0.20.2/build/libhdfs:/usr/lib/jvm/java-6-openjdk/jre/lib/i386/server/:/usr/lib/libfuse.so"
> 
> Stuti
> ________________________________________
> From: Harsh J [harsh@cloudera.com]
> Sent: Wednesday, January 04, 2012 5:34 PM
> To: hdfs-user@hadoop.apache.org
> Subject: Re: Mounting HDFS
> 
> Stuti,
> 
> Your env needs to carry this:
> 
> export 
> LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/path/to/dir/where/libhdfs/files/are/
> present
> 
> Otherwise the fuse_dfs binary won't be able to find and load it. The 
> wrapper script does this as part of its setup if you read it.
> 
> On Wed, Jan 4, 2012 at 5:29 PM, Stuti Awasthi <st...@hcl.com> wrote:
>> I'm able to mount using the command:
>> 
>> fuse_dfs_wrapper.sh dfs://<server>:<port> /export/hdfs
>> 
>> -----Original Message-----
>> From: Stuti Awasthi
>> Sent: Wednesday, January 04, 2012 5:24 PM
>> To: hdfs-user@hadoop.apache.org
>> Subject: RE: Mounting HDFS
>> 
>> Harsh,
>> 
>> Output of $file `which fuse_dfs`
>> 
>> /sbin/fuse_dfs: ELF 32-bit LSB executable, Intel 80386, version 1 
>> (SYSV), dynamically linked (uses shared libs), for GNU/Linux 2.6.15, 
>> not stripped
>> 
>> Same output for $ file /sbin/fuse_dfs
>> 
>> Thanks
>> ________________________________________
>> From: Harsh J [harsh@cloudera.com]
>> Sent: Wednesday, January 04, 2012 5:18 PM
>> To: hdfs-user@hadoop.apache.org
>> Subject: Re: Mounting HDFS
>> 
>> Stuti,
>> 
>> My original command was "file `which fuse_dfs`", and not just the which command.
>> 
>> Can you run "file /sbin/fuse_dfs"? You need the utility called 'file' available (it's mostly present).
>> 
>> On 04-Jan-2012, at 5:08 PM, Stuti Awasthi wrote:
>> 
>>> Hi Harsh,
>>> 
>>> Currently I am using 32 bit Ubuntu11.10, Hadoop 0.20.2
>>> 
>>> Output of : $ which fuse_dfs
>>> /sbin/fuse_dfs
>>> 
>>> I searched on the net and got this URL: "http://wiki.apache.org/hadoop/MountableHDFS"
>>> How can I get HDFS fuse deb or rpm packages? Thanks for pointing this out; can you please guide me more on this.
>>> 
>>> Thanks
>>> 
>>> -----Original Message-----
>>> From: Harsh J [mailto:harsh@cloudera.com]
>>> Sent: Wednesday, January 04, 2012 4:51 PM
>>> To: hdfs-user@hadoop.apache.org
>>> Subject: Re: Mounting HDFS
>>> 
>>> Stuti,
>>> 
>>> What's your platform - 32-bits or 64-bits? Which one have you built libhdfs for?
>>> 
>>> What's the output of the following?
>>> $ file `which fuse_dfs`
>>> 
>>> FWIW, the most hassle-free way to do these things today is to use proper packages available for your platform, instead of compiling it yourself. Just a suggestion.
>>> 
>>> On 04-Jan-2012, at 4:28 PM, Stuti Awasthi wrote:
>>> 
>>>> Hi All,
>>>> 
>>>> I am following http://wiki.apache.org/hadoop/MountableHDFS for HDFS mount.
>>>> I have successfully followed the steps till "Installing" and I am able mount it properly. After that I am trying with "Deploying" step and followed the steps:
>>>> 
>>>> 1. add the following to /etc/fstab
>>>> fuse_dfs#dfs://hadoop_server.foo.com:9000 /export/hdfs fuse 
>>>> -oallow_other,rw,-ousetrash 0 0
>>>> 
>>>> 2. added fuse_dfs to /sbin
>>>> 
>>>> 3. export 
>>>> LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/build/libhdfs
>>>> 
>>>> 4. Mount using: mount /export/hdfs.
>>>> 
>>>> But getting error :
>>>> fuse_dfs: error while loading shared libraries : libhdfs.so.0: cannot open shared object file: No such file or directory.
>>>> 
>>>> How to fix this ?
>>>> 
>>>> Thanks
>>>> 
>>> 
>> 
> 
> 
> 
> --
> Harsh J

Re: Mounting HDFS

Posted by Alexander Lorenz <wg...@googlemail.com>.
Hi Stuti,

Do a search for libhdfs.so* and also run ldd /path/to/fuse_dfs. It could be that only a symlink is missing. With ldd you will see which libraries the binary wants; if libhdfs.so.1 is not on the path, export the path where you found it.
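[Editor's note] A sketch of that diagnosis as commands (the fuse_dfs and libhdfs paths are illustrative):

```shell
#!/bin/sh
# check_libs BIN: print the shared libraries BIN needs that the dynamic
# loader currently cannot resolve ("not found" entries in ldd output).
check_libs() {
  ldd "$1" 2>/dev/null | awk '/not found/ {print $1}'
}

# Illustrative use -- anything printed must be made reachable, e.g. via
# LD_LIBRARY_PATH or a versioned symlink next to the real libhdfs.so:
#   check_libs /sbin/fuse_dfs
#   find / -name 'libhdfs.so*' 2>/dev/null
#   export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/path/where/libhdfs/lives
```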

- Alex

Alexander Lorenz
http://mapredit.blogspot.com


RE: Mounting HDFS

Posted by Stuti Awasthi <st...@hcl.com>.
I have already exported it in the environment. Here is the output of the "export" command:

declare -x LD_LIBRARY_PATH="/usr/lib:/usr/local/lib:/home/jony/FreshHadoop/hadoop-0.20.2/build/libhdfs:/usr/lib/jvm/java-6-openjdk/jre/lib/i386/server/:/usr/lib/libfuse.so"

Stuti

Re: Mounting HDFS

Posted by Harsh J <ha...@cloudera.com>.
Stuti,

Your env needs to carry this:

export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/path/to/dir/where/libhdfs/files/are/present

Otherwise the fuse_dfs binary won't be able to find and load libhdfs. The
wrapper script does this as part of its setup, as you can see if you read it.
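
A minimal sketch of that setup; the HADOOP_HOME default below is just an example, and the build/libhdfs layout assumes a 0.20.x source build:

```shell
# Put the directory that contains libhdfs.so.0 on the loader path before
# launching fuse_dfs. HADOOP_HOME here is a placeholder default.
HADOOP_HOME=${HADOOP_HOME:-/opt/hadoop-0.20.2}
export LD_LIBRARY_PATH="$LD_LIBRARY_PATH:$HADOOP_HOME/build/libhdfs"
# Verify the directory is now on the path:
echo "$LD_LIBRARY_PATH" | tr ':' '\n' | grep 'build/libhdfs'
```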

-- 
Harsh J

RE: Mounting HDFS

Posted by Stuti Awasthi <st...@hcl.com>.
I'm able to mount using the command:

fuse_dfs_wrapper.sh dfs://<server>:<port> /export/hdfs


RE: Mounting HDFS

Posted by Stuti Awasthi <st...@hcl.com>.
Harsh,

Output of $ file `which fuse_dfs`:

/sbin/fuse_dfs: ELF 32-bit LSB executable, Intel 80386, version 1 (SYSV), dynamically linked (uses shared libs), for GNU/Linux 2.6.15, not stripped

Same output for $ file /sbin/fuse_dfs

Thanks

Re: Mounting HDFS

Posted by Harsh J <ha...@cloudera.com>.
Stuti,

My original command was "file `which fuse_dfs`", and not just the which command.

Can you run "file /sbin/fuse_dfs"? You need the utility called 'file' available (it's usually present).


RE: Mounting HDFS

Posted by Stuti Awasthi <st...@hcl.com>.
Hi Harsh,

Currently I am using 32-bit Ubuntu 11.10 and Hadoop 0.20.2.

Output of : $ which fuse_dfs
/sbin/fuse_dfs

I searched the net and found this URL: "http://wiki.apache.org/hadoop/MountableHDFS"
How can I get HDFS FUSE deb or rpm packages? Thanks for pointing this out; can you please guide me further on this?

Thanks


Re: Mounting HDFS

Posted by Harsh J <ha...@cloudera.com>.
Stuti,

What's your platform - 32-bit or 64-bit? Which one have you built libhdfs for?

What's the output of the following?
$ file `which fuse_dfs`

FWIW, the most hassle-free way to do these things today is to use proper packages available for your platform, instead of compiling things yourself. Just a suggestion.
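
To spot a 32-bit/64-bit mismatch, compare the output of 'file' on the binary with the machine architecture. /bin/sh is used as a stand-in below, since fuse_dfs may not exist on the box running this:

```shell
# 'file' reports the executable format of a binary (e.g. "ELF 32-bit LSB
# executable"); compare that with the kernel architecture from uname.
# Substitute `which fuse_dfs` for /bin/sh on a machine that has fuse_dfs.
file /bin/sh
uname -m
```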
