Posted to common-user@hadoop.apache.org by YouPeng Yang <yy...@gmail.com> on 2013/04/10 16:06:12 UTC

No build.xml when building FUSE

Dear All

   I want to integrate FUSE with Hadoop.
   So I checked out the code using the command:
[root@Hadoop ~]#  svn checkout http://svn.apache.org/repos/asf/hadoop/common/trunk/ hadoop-trunk

   However, I did not find any Ant build.xml for building fuse-dfs under
hadoop-trunk/hadoop-hdfs-project/hadoop-hdfs/src/contrib.
   Did I check out the wrong code, or is there another way to build fuse-dfs?

   Please guide me.

Thanks

regards

Re: No build.xml when building FUSE

Posted by yypvsxf19870706 <yy...@gmail.com>.
Hi Jay

     The only build instructions I found are in the README.txt in the fuse-dfs directory, and that file describes building with Ant, with no mention of Maven.
     I want to integrate HDFS with FUSE right now. Are there any docs on building fuse-dfs?
     Thank you all the same.


Regards



Sent from my iPhone

On 2013-4-10, at 22:28, Jay Vyas <ja...@gmail.com> wrote:

> hadoop-hdfs builds with Maven, not Ant.
> 
> You might also need to install the serialization libraries.
> 
> See http://wiki.apache.org/hadoop/HowToContribute .
> 
> As an aside, if you simply want an HA, FUSE-mountable filesystem that is
> MapReduce compatible, you could try Gluster as a FUSE mount:
> https://github.com/gluster/hadoop-glusterfs .
> 
> 
> 
> ---------- Forwarded message ----------
> From: YouPeng Yang <yy...@gmail.com>
> Date: Wed, Apr 10, 2013 at 10:06 AM
> Subject: No build.xml when building FUSE
> To: user@hadoop.apache.org
> 
> 
> Dear All
> 
>    I want to integrate FUSE with Hadoop.
>    So I checked out the code using the command:
> [root@Hadoop ~]#  svn checkout http://svn.apache.org/repos/asf/hadoop/common/trunk/ hadoop-trunk
> 
>    However, I did not find any Ant build.xml for building fuse-dfs under
> hadoop-trunk/hadoop-hdfs-project/hadoop-hdfs/src/contrib.
>    Did I check out the wrong code, or is there another way to build fuse-dfs?
> 
>    Please guide me.
> 
> Thanks
> 
> regards
> 
> 
> 
> -- 
> Jay Vyas
> http://jayunit100.blogspot.com


Re: No build.xml when building FUSE

Posted by Jay Vyas <ja...@gmail.com>.
hadoop-hdfs builds with Maven, not Ant.

You might also need to install the serialization libraries.

See http://wiki.apache.org/hadoop/HowToContribute .

As an aside, if you simply want an HA, FUSE-mountable filesystem that is
MapReduce compatible, you could try Gluster as a FUSE mount:
https://github.com/gluster/hadoop-glusterfs .
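[Editor's note: the Maven-based build described above can be sketched as the shell commands below. This is a hedged sketch, not an authoritative recipe; the `-Pnative` profile and `-DskipTests` flag are the ones reported to work later in this thread, and the module path assumes a trunk checkout named `hadoop-trunk`. Verify against BUILDING.txt in your checkout.]

```shell
# Sketch of the Maven build for fuse-dfs (assumes a trunk checkout in
# hadoop-trunk; check BUILDING.txt for the flags your revision needs).
cd hadoop-trunk/hadoop-hdfs-project/hadoop-hdfs

# The -Pnative profile compiles the native code (libhdfs and the fuse_dfs
# C binary); -DskipTests skips the long-running unit test suite.
mvn package -Pnative -DskipTests
```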



---------- Forwarded message ----------
From: YouPeng Yang <yy...@gmail.com>
Date: Wed, Apr 10, 2013 at 10:06 AM
Subject: No build.xml when building FUSE
To: user@hadoop.apache.org


Dear All

   I want to integrate FUSE with Hadoop.
   So I checked out the code using the command:
[root@Hadoop ~]#  svn checkout http://svn.apache.org/repos/asf/hadoop/common/trunk/ hadoop-trunk

   However, I did not find any Ant build.xml for building fuse-dfs under
hadoop-trunk/hadoop-hdfs-project/hadoop-hdfs/src/contrib.
   Did I check out the wrong code, or is there another way to build fuse-dfs?

   Please guide me.

Thanks

regards



-- 
Jay Vyas
http://jayunit100.blogspot.com


Re: No build.xml when building FUSE

Posted by YouPeng Yang <yy...@gmail.com>.
Hi Harsh

   I have found the reason and the solution after checking the fuse-dfs
source code, so I am replying again to close this question.

   The error came out because the hadoop-*.jar files need to be on the
CLASSPATH. After adding them to the CLASSPATH, it works.

  Thank you


Regards.
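[Editor's note: the CLASSPATH fix described above can be sketched as the shell helper below. The function name and install path are illustrative assumptions, not from the thread; it simply collects every hadoop-*.jar under a directory into a colon-separated CLASSPATH string.]

```shell
# Illustrative helper (hypothetical name): gather every hadoop-*.jar found
# under a directory into a colon-separated CLASSPATH string, starting
# from "." so the current directory stays on the path.
build_hadoop_classpath() {
  dir="$1"
  cp="."
  for jar in $(find "$dir" -name 'hadoop-*.jar' 2>/dev/null | sort); do
    cp="$cp:$jar"
  done
  printf '%s\n' "$cp"
}

# Example use before launching fuse_dfs_wrapper.sh (the install path below
# is an assumption about your layout):
# export CLASSPATH="$(build_hadoop_classpath /usr/local/hadoop)"
```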




2013/4/12 YouPeng Yang <yy...@gmail.com>

> Hi Harsh
>
>    Sorry for the late reply.
>
>    I still get some errors.
>
>    After I ran the command under the hadoop-hdfs project:
>    mvn install -Drequire.fuse=true -DskipTests
>    I still could not find the fuse_dfs binary anywhere on my system by
> running the command:
>    find / -name fuse_dfs
>
>    Thanks to Google, I tried mvn package -Pnative -DskipTests, and the
> fuse_dfs binary appeared.
>
>     So I ran the command as user hadoop and got an error:
>     [hadoop@Hadoop fuse-dfs]$ ./fuse_dfs_wrapper.sh hdfs://Hadoop:8020/ /home/hadoop/expdfs/
>     INFO
> /usr/local/hadoop/src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/fuse_options.c:164
> Adding FUSE arg /home/hadoop/expdfs/
>     fuse: failed to exec fusermount: Permission denied
>
>     Based on that error, I switched to the root user and ran the command in
> debug mode:
>
>    [root@Hadoop fuse-dfs]# ./fuse_dfs_wrapper.sh -d hdfs://192.168.1.150:8080/ /home/hadoop/expdfs/
> INFO
> /usr/local/hadoop/src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/fuse_options.c:115
> Ignoring option -d
> INFO
> /usr/local/hadoop/src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/fuse_options.c:164
> Adding FUSE arg /home/hadoop/expdfs/
> FUSE library version: 2.8.3
> nullpath_ok: 0
> unique: 1, opcode: INIT (26), nodeid: 0, insize: 56
> INIT: 7.13
> flags=0x0000007b
> max_readahead=0x00020000
> INFO
> /usr/local/hadoop/src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/fuse_init.c:98
> Mounting with options: [ protected=(NULL), nn_uri=hdfs://
> 192.168.1.150:8080/, nn_port=0, debug=0, read_only=0, initchecks=0,
> no_permissions=0, usetrash=0, entry_timeout=60, attribute_timeout=60,
> rdbuffer_size=10485760, direct_io=0 ]
> loadFileSystems error:
> (unable to get stack trace for java.lang.NoClassDefFoundError exception:
> ExceptionUtils::getStackTrace error.)
> hdfsConfGetInt(hadoop.fuse.timer.period): new Configuration error:
> (unable to get stack trace for java.lang.NoClassDefFoundError exception:
> ExceptionUtils::getStackTrace error.)
> Unable to determine the configured value for
> hadoop.fuse.timer.period.ERROR
> /usr/local/hadoop/src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/fuse_init.c:135
> FATAL: dfs_init: fuseConnectInit failed with error -22!
> ERROR
> /usr/local/hadoop/src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/fuse_init.c:34
> LD_LIBRARY_PATH=/home/oracle/database/product/10.2.0/db_1/lib:
> ERROR
> /usr/local/hadoop/src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/fuse_init.c:35
> CLASSPATH=.:/usr/local/java/lib/dt.jar:/usr/local/java/lib/tools.jar
>
>
>    I am stuck with the FATAL error.
>    Please give me some suggestions.
>
>   Any help will be appreciated.
>
>
> Regards
>
>
>
>
>
>
>
> 2013/4/11 Harsh J <ha...@cloudera.com>
>
>> Hi,
>>
>> You need to place fuse_dfs' binary directory on your PATH if you
>> expect to use that script - it is simply looking it up as a command
>> and not finding it. I usually just invoke the fuse_dfs binary directly
>> since my environment is usually pre-setup.
>>
>> On Thu, Apr 11, 2013 at 8:19 PM, YouPeng Yang <yy...@gmail.com>
>> wrote:
>> > Hi Harsh:
>> >
>> >  I run under hadoop-hdfs project:
>> >  mvn install -Drequire.fuse=true -DskipTests
>> >
>> >  and the logs show: BUILD SUCCESS
>> >
>> >  So I go to the
>> > src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/ to run the
>> > fuse_dfs_wrapper.sh:
>> > [root@Hadoop fuse-dfs]# ./fuse_dfs_wrapper.sh  hdfs://Hadoop:8020 /mnt/
>> > ./fuse_dfs_wrapper.sh: line 46: fuse_dfs: command not found
>> >
>> > Obviously, the above error shows that something is still abnormal.
>> >
>> >  My questions are:
>> >  Is this the correct way to check fuse-dfs?
>> >  Is there any other way to check whether fuse-dfs installed
>> > successfully?
>> >
>> >
>> > Thanks
>> >
>> > Regards
>> >
>> >
>> >
>> >
>> >
>> >
>> >
>> > 2013/4/10 Harsh J <ha...@cloudera.com>
>> >>
>> >> Run under hadoop-hdfs project:
>> >>
>> >> mvn install -Drequire.fuse=true
>> >>
>> >>
>> >> On Wed, Apr 10, 2013 at 7:36 PM, YouPeng Yang <
>> yypvsxf19870706@gmail.com>
>> >> wrote:
>> >>>
>> >>> Dear All
>> >>>
>> >>>    I want to integrate FUSE with Hadoop.
>> >>>    So I checked out the code using the command:
>> >>> [root@Hadoop ~]#  svn checkout
>> >>> http://svn.apache.org/repos/asf/hadoop/common/trunk/ hadoop-trunk
>> >>>
>> >>>    However I did not find any Ant build.xml to build fuse-dfs in the
>> >>> hadoop-trunk/hadoop-hdfs-project/hadoop-hdfs/src/contrib.
>> >>>    Did I check out the wrong code, or is there another way to build
>> >>> fuse-dfs?
>> >>>
>> >>>    Please guide me.
>> >>>
>> >>>
>> >>> Thanks
>> >>>
>> >>> regards
>> >>
>> >>
>> >>
>> >>
>> >> --
>> >> Harsh J
>> >
>> >
>>
>>
>>
>> --
>> Harsh J
>>
>
>


Re: No build.xml when building FUSE

Posted by YouPeng Yang <yy...@gmail.com>.
Hi Harsh

   Sorry for the late reply.

   I still get some errors.

   After I ran the command under the hadoop-hdfs project:
   mvn install -Drequire.fuse=true -DskipTests
   I still could not find the fuse_dfs binary anywhere on my system by
running the command:
   find / -name fuse_dfs

   Thanks to Google, I tried mvn package -Pnative -DskipTests, and the
fuse_dfs binary appeared.

    So I ran the command as user hadoop and got an error:
    [hadoop@Hadoop fuse-dfs]$ ./fuse_dfs_wrapper.sh hdfs://Hadoop:8020/ /home/hadoop/expdfs/
    INFO
/usr/local/hadoop/src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/fuse_options.c:164
Adding FUSE arg /home/hadoop/expdfs/
    fuse: failed to exec fusermount: Permission denied

    Based on that error, I switched to the root user and ran the command in
debug mode:

   [root@Hadoop fuse-dfs]# ./fuse_dfs_wrapper.sh -d hdfs://192.168.1.150:8080/ /home/hadoop/expdfs/
INFO
/usr/local/hadoop/src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/fuse_options.c:115
Ignoring option -d
INFO
/usr/local/hadoop/src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/fuse_options.c:164
Adding FUSE arg /home/hadoop/expdfs/
FUSE library version: 2.8.3
nullpath_ok: 0
unique: 1, opcode: INIT (26), nodeid: 0, insize: 56
INIT: 7.13
flags=0x0000007b
max_readahead=0x00020000
INFO
/usr/local/hadoop/src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/fuse_init.c:98
Mounting with options: [ protected=(NULL), nn_uri=hdfs://192.168.1.150:8080/,
nn_port=0, debug=0, read_only=0, initchecks=0, no_permissions=0,
usetrash=0, entry_timeout=60, attribute_timeout=60, rdbuffer_size=10485760,
direct_io=0 ]
loadFileSystems error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception:
ExceptionUtils::getStackTrace error.)
hdfsConfGetInt(hadoop.fuse.timer.period): new Configuration error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception:
ExceptionUtils::getStackTrace error.)
Unable to determine the configured value for hadoop.fuse.timer.period.ERROR
/usr/local/hadoop/src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/fuse_init.c:135
FATAL: dfs_init: fuseConnectInit failed with error -22!
ERROR
/usr/local/hadoop/src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/fuse_init.c:34
LD_LIBRARY_PATH=/home/oracle/database/product/10.2.0/db_1/lib:
ERROR
/usr/local/hadoop/src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/fuse_init.c:35
CLASSPATH=.:/usr/local/java/lib/dt.jar:/usr/local/java/lib/tools.jar


   I am stuck on this FATAL error.
   Please give me some suggestions.
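One reading of the log above: the printed CLASSPATH holds only the JDK's
dt.jar and tools.jar, so the JVM that libhdfs starts cannot find any
Hadoop classes, hence the NoClassDefFoundError. fuse_dfs does not expand
wildcards itself, so each jar has to be listed explicitly. A minimal
sketch of assembling such a CLASSPATH (the demo directory below is a
stand-in so the sketch is self-contained; point HADOOP_JAR_DIR at the lib
directories of a real Hadoop install instead):

```shell
#!/bin/sh
# Stand-in jar layout, purely illustrative; replace with the jar
# directories of your actual Hadoop install.
HADOOP_JAR_DIR="$(mktemp -d)/share/hadoop"
mkdir -p "$HADOOP_JAR_DIR/common" "$HADOOP_JAR_DIR/hdfs"
touch "$HADOOP_JAR_DIR/common/hadoop-common-demo.jar" \
      "$HADOOP_JAR_DIR/hdfs/hadoop-hdfs-demo.jar"

# Append every jar explicitly; globs in CLASSPATH are not expanded.
CLASSPATH=""
for jar in "$HADOOP_JAR_DIR"/*/*.jar; do
  CLASSPATH="$CLASSPATH:$jar"
done
CLASSPATH="${CLASSPATH#:}"   # drop the leading colon
export CLASSPATH
echo "$CLASSPATH"
```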

  Any help will be appreciated.


Regards







2013/4/11 Harsh J <ha...@cloudera.com>

> Hi,
>
> You need to place fuse_dfs' binary directory on your PATH if you
> expect to use that script - it is simply looking it up as a command
> and not finding it. I usually just invoke the fuse_dfs binary directly
> since my environment is usually pre-setup.
>
> On Thu, Apr 11, 2013 at 8:19 PM, YouPeng Yang <yy...@gmail.com>
> wrote:
> > Hi Harsh:
> >
> >  I run under hadoop-hdfs project:
> >  mvn install -Drequire.fuse=true -DskipTests
> >
> >  and the logs show: BUILD SUCCESS
> >
> >  So I go to the
> > src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/ to run the
> > fuse_dfs_wrapper.sh:
> > [root@Hadoop fuse-dfs]# ./fuse_dfs_wrapper.sh  hdfs://Hadoop:8020 /mnt/
> > ./fuse_dfs_wrapper.sh: line 46: fuse_dfs: command not found
> >
> > Obviously,the above error shows that something is still abnormal.
> >
> >  My question are:
> >  Is it correct that to check the fuse?
> >  Is there any other ways to check whether the fuse-dfs installed
> > successfully?
> >
> >
> > Thanks
> >
> > Regards
> >
> >
> >
> >
> >
> >
> >
> > 2013/4/10 Harsh J <ha...@cloudera.com>
> >>
> >> Run under hadoop-hdfs project:
> >>
> >> mvn install -Drequire.fuse=true
> >>
> >>
> >> On Wed, Apr 10, 2013 at 7:36 PM, YouPeng Yang <
> yypvsxf19870706@gmail.com>
> >> wrote:
> >>>
> >>> Dear All
> >>>
> >>>    I want to integrate the FUSE with the Hadoop.
> >>>    So i checkout the code using the command:
> >>> [root@Hadoop ~]#  svn checkout
> >>> http://svn.apache.org/repos/asf/hadoop/common/trunk/ hadoop-trunk
> >>>
> >>>    However I did not find any ant build.xmls to build the fuse-dfs in
> the
> >>> hadoop-trunk/hadoop-hdfs-project/hadoop-hdfs/src/contrib.
> >>>    Did I checkout the wrong codes, Or is there any other ways to bulid
> >>> fuse-dfs?
> >>>
> >>>    Please guide me .
> >>>
> >>>
> >>> Thanks
> >>>
> >>> regards
> >>
> >>
> >>
> >>
> >> --
> >> Harsh J
> >
> >
>
>
>
> --
> Harsh J
>

Re: No bulid.xml when to build FUSE

Posted by Harsh J <ha...@cloudera.com>.
Hi,

You need to place fuse_dfs' binary directory on your PATH if you
expect to use that script - it is simply looking it up as a command
and not finding it. I usually just invoke the fuse_dfs binary directly
since my environment is usually pre-setup.
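Harsh's point can be sketched as follows; the build-output directory shown
is a guess at where a -Pnative build drops the binary and should be
checked against your own tree before use:

```shell
#!/bin/sh
# Hypothetical location of the freshly built binary; confirm it with
#   find target -name fuse_dfs
FUSE_DFS_DIR="$HOME/hadoop-trunk/hadoop-hdfs-project/hadoop-hdfs/target/native/main/native/fuse-dfs"
# Prepend it so the wrapper script can resolve `fuse_dfs` as a command.
export PATH="$FUSE_DFS_DIR:$PATH"
echo "$PATH"
```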

On Thu, Apr 11, 2013 at 8:19 PM, YouPeng Yang <yy...@gmail.com> wrote:
> Hi Harsh:
>
>  I run under hadoop-hdfs project:
>  mvn install -Drequire.fuse=true -DskipTests
>
>  and the logs show: BUILD SUCCESS
>
>  So I go to the
> src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/ to run the
> fuse_dfs_wrapper.sh:
> [root@Hadoop fuse-dfs]# ./fuse_dfs_wrapper.sh  hdfs://Hadoop:8020 /mnt/
> ./fuse_dfs_wrapper.sh: line 46: fuse_dfs: command not found
>
> Obviously,the above error shows that something is still abnormal.
>
>  My question are:
>  Is it correct that to check the fuse?
>  Is there any other ways to check whether the fuse-dfs installed
> successfully?
>
>
> Thanks
>
> Regards
>
>
>
>
>
>
>
> 2013/4/10 Harsh J <ha...@cloudera.com>
>>
>> Run under hadoop-hdfs project:
>>
>> mvn install -Drequire.fuse=true
>>
>>
>> On Wed, Apr 10, 2013 at 7:36 PM, YouPeng Yang <yy...@gmail.com>
>> wrote:
>>>
>>> Dear All
>>>
>>>    I want to integrate the FUSE with the Hadoop.
>>>    So i checkout the code using the command:
>>> [root@Hadoop ~]#  svn checkout
>>> http://svn.apache.org/repos/asf/hadoop/common/trunk/ hadoop-trunk
>>>
>>>    However I did not find any ant build.xmls to build the fuse-dfs in the
>>> hadoop-trunk/hadoop-hdfs-project/hadoop-hdfs/src/contrib.
>>>    Did I checkout the wrong codes, Or is there any other ways to bulid
>>> fuse-dfs?
>>>
>>>    Please guide me .
>>>
>>>
>>> Thanks
>>>
>>> regards
>>
>>
>>
>>
>> --
>> Harsh J
>
>



-- 
Harsh J

Re: No bulid.xml when to build FUSE

Posted by YouPeng Yang <yy...@gmail.com>.
Hi Harsh:

 I run under hadoop-hdfs project:
 mvn install -Drequire.fuse=true -DskipTests

 and the logs show: BUILD SUCCESS

 So I go to the
src/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/ to run the
fuse_dfs_wrapper.sh:
[root@Hadoop fuse-dfs]# ./fuse_dfs_wrapper.sh  hdfs://Hadoop:8020 /mnt/
./fuse_dfs_wrapper.sh: line 46: fuse_dfs: command not found

Obviously, the above error shows that something is still wrong.

 My questions are:
 Is this the correct way to test fuse?
 Is there any other way to check whether fuse-dfs installed
successfully?
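One end-to-end way to answer these questions is a small smoke test: mount,
look for the mount in the kernel mount table, then round-trip a file. The
URI and mount point below are the ones from this thread and purely
illustrative; the script skips itself when fuse_dfs is not on the PATH:

```shell
#!/bin/sh
# Illustrative smoke test; hdfs://Hadoop:8020 and /mnt come from this
# thread and must be adapted to your cluster.
MOUNT_POINT=/mnt
if command -v fuse_dfs >/dev/null 2>&1; then
  fuse_dfs hdfs://Hadoop:8020 "$MOUNT_POINT"
  # A live FUSE mount appears in /proc/mounts.
  grep "$MOUNT_POINT" /proc/mounts
  # Write and delete a file through the mount.
  touch "$MOUNT_POINT/fuse_smoke_test" && rm "$MOUNT_POINT/fuse_smoke_test"
else
  echo "fuse_dfs not on PATH; skipping smoke test"
fi
```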


Thanks

Regards







2013/4/10 Harsh J <ha...@cloudera.com>

> Run under hadoop-hdfs project:
>
> mvn install -Drequire.fuse=true
>
>
> On Wed, Apr 10, 2013 at 7:36 PM, YouPeng Yang <yy...@gmail.com>wrote:
>
>> Dear All
>>
>>    I want to integrate the FUSE with the Hadoop.
>>     So i checkout the code using the command:
>> *[root@Hadoop ~]#  svn checkout
>> http://svn.apache.org/repos/asf/hadoop/common/trunk/ hadoop-trunk*
>>
>>    However I did not find any ant build.xmls to build the fuse-dfs in the
>>
>> *hadoop-trunk/hadoop-hdfs-project/hadoop-hdfs/src/contrib.*
>> * *  Did I checkout the wrong codes, Or is there any other ways to bulid
>> fuse-dfs?
>>
>> *  * Please guide me .
>> *   *
>> *
>> *
>> *Thanks *
>>
>> regards
>>
>
>
>
> --
> Harsh J
>

Re: No bulid.xml when to build FUSE

Posted by Harsh J <ha...@cloudera.com>.
Run under hadoop-hdfs project:

mvn install -Drequire.fuse=true


On Wed, Apr 10, 2013 at 7:36 PM, YouPeng Yang <yy...@gmail.com>wrote:

> Dear All
>
>    I want to integrate the FUSE with the Hadoop.
>     So i checkout the code using the command:
> *[root@Hadoop ~]#  svn checkout
> http://svn.apache.org/repos/asf/hadoop/common/trunk/ hadoop-trunk*
>
>    However I did not find any ant build.xmls to build the fuse-dfs in the
> *hadoop-trunk/hadoop-hdfs-project/hadoop-hdfs/src/contrib.*
> * *  Did I checkout the wrong codes, Or is there any other ways to bulid
> fuse-dfs?
>
> *  * Please guide me .
> *   *
> *
> *
> *Thanks *
>
> regards
>



-- 
Harsh J

Re: No bulid.xml when to build FUSE

Posted by Jay Vyas <ja...@gmail.com>.
hadoop-hdfs builds with maven, not ant.

You might also need to install the serialization libraries.

See http://wiki.apache.org/hadoop/HowToContribute .

As an aside, you could try Gluster as a FUSE mount if you simply want an
HA, FUSE-mountable filesystem that is MapReduce-compatible:
https://github.com/gluster/hadoop-glusterfs .



---------- Forwarded message ----------
From: YouPeng Yang <yy...@gmail.com>
Date: Wed, Apr 10, 2013 at 10:06 AM
Subject: No bulid.xml when to build FUSE
To: user@hadoop.apache.org


Dear All

   I want to integrate FUSE with Hadoop.
   So I checked out the code using the command:
[root@Hadoop ~]#  svn checkout
http://svn.apache.org/repos/asf/hadoop/common/trunk/ hadoop-trunk

   However, I did not find any ant build.xml to build fuse-dfs in
hadoop-trunk/hadoop-hdfs-project/hadoop-hdfs/src/contrib.
   Did I check out the wrong code, or is there another way to build
fuse-dfs?

   Please guide me.

Thanks

regards



-- 
Jay Vyas
http://jayunit100.blogspot.com
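
[Editor's note] For anyone following up: once fuse_dfs is built, mounting usually looks roughly like the sketch below. The wrapper-script name, NameNode address, and mount point are placeholders of mine, not details from the thread; adjust them to your cluster.

```shell
# Mount HDFS at /mnt/hdfs through the freshly built fuse_dfs binary.
# dfs://namenode-host:8020 and /mnt/hdfs are placeholders.
mkdir -p /mnt/hdfs
./fuse_dfs_wrapper.sh dfs://namenode-host:8020 /mnt/hdfs

# Check the mount, then detach when done
ls /mnt/hdfs
fusermount -u /mnt/hdfs
```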
