Posted to hdfs-user@hadoop.apache.org by Charles Earl <ch...@me.com> on 2011/09/01 18:07:43 UTC

Problems compiling HDFS FUSE on hadoop-0.20.203.0

Hi,
I noticed a similar thread; it does not look as though it was resolved, and I am still getting the errors described there.
Is there any patch that will address this?
I followed the instructions at http://wiki.apache.org/hadoop/MountableHDFS:
$ ant compile-contrib -Dlibhdfs=1 -Dfusedfs=1
Buildfile: hadoop-0.20.203.0/build.xml

clover.setup:
…
check-libhdfs-exists:

compile:
     [echo] contrib: fuse-dfs
     [exec] checking build system type... x86_64-apple-darwin11.1.0
     [exec] checking host system type... x86_64-apple-darwin11.1.0
     [exec] checking target system type... x86_64-apple-darwin11.1.0
     [exec] checking for a BSD-compatible install... /usr/bin/install -c
     [exec] checking whether build environment is sane... yes
     [exec] checking for a thread-safe mkdir -p... ./install-sh -c -d
     [exec] checking for gawk... gawk
     [exec] checking whether make sets $(MAKE)... yes
     [exec] checking for style of include used by make... GNU
     [exec] 
     [exec] checking for gcc... gcc
     [exec] checking whether the C compiler works... yes
     [exec] checking for C compiler default output file name... a.out
     [exec] checking for suffix of executables... 
     [exec] checking whether we are cross compiling... no
     [exec] checking for suffix of object files... o
     [exec] checking whether we are using the GNU C compiler... yes
     [exec] checking whether gcc accepts -g... yes
     [exec] checking for gcc option to accept ISO C89... none needed
     [exec] checking dependency style of gcc... gcc3
     [exec] checking for g++... g++
     [exec] checking whether we are using the GNU C++ compiler... yes
     [exec] checking whether g++ accepts -g... yes
     [exec] checking dependency style of g++... gcc3
     [exec] checking for ranlib... ranlib
     [exec] checking for bash... /bin/sh
     [exec] checking for perl... /opt/local/bin/perl
     [exec] checking for python... /usr/bin/python
     [exec] checking for ar... /usr/bin/ar
     [exec] checking for ant... /usr/bin/ant
     [exec] checking how to run the C preprocessor... gcc -E
     [exec] checking for grep that handles long lines and -e... /usr/bin/grep
     [exec] checking for egrep... /usr/bin/grep -E
     [exec] checking for uid_t in sys/types.h... yes
     [exec] checking for ANSI C header files... yes
     [exec] checking for sys/types.h... yes
     [exec] checking for sys/stat.h... yes
     [exec] checking for stdlib.h... yes
     [exec] checking for string.h... yes
     [exec] checking for memory.h... yes
     [exec] checking for strings.h... yes
     [exec] checking for inttypes.h... yes
     [exec] checking for stdint.h... yes
     [exec] checking for unistd.h... yes
     [exec] checking type of array argument to getgroups... gid_t
     [exec] checking for size_t... yes
     [exec] checking for getgroups... yes
     [exec] checking for working getgroups... yes
     [exec] checking type of array argument to getgroups... (cached) gid_t
     [exec] checking Checking EXTERNAL_PATH set to... /Users/charlescearl/Development/hadoop-0.20.203.0/src/contrib/fuse-dfs
     [exec] checking whether to enable optimized build... yes
     [exec] checking whether to enable static mode... yes
     [exec] configure: creating ./config.status
     [exec] config.status: creating Makefile
     [exec] config.status: creating src/Makefile
     [exec] config.status: executing depfiles commands
     [exec] Making all in .
     [exec] make[1]: Nothing to be done for `all-am'.
     [exec] Making all in src
     [exec] gcc -DPACKAGE_NAME=\"fuse_dfs\" -DPACKAGE_TARNAME=\"fuse_dfs\" -DPACKAGE_VERSION=\"0.1.0\" -DPACKAGE_STRING=\"fuse_dfs\ 0.1.0\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE_URL=\"\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DGETGROUPS_T=gid_t -DHAVE_GETGROUPS=1 -DGETGROUPS_T=gid_t -I.  -DPERMS=1 -D_FILE_OFFSET_BITS=64 -I/include -I/Users/charlescearl/Development/hadoop-0.20.203.0/src/c++/libhdfs/ -I/include/linux/ -D_FUSE_DFS_VERSION=\"0.1.0\" -DPROTECTED_PATHS=\"\" -I/include   -Wall -O3 -MT fuse_dfs.o -MD -MP -MF .deps/fuse_dfs.Tpo -c -o fuse_dfs.o fuse_dfs.c
     [exec] mv -f .deps/fuse_dfs.Tpo .deps/fuse_dfs.Po
     [exec] gcc -DPACKAGE_NAME=\"fuse_dfs\" -DPACKAGE_TARNAME=\"fuse_dfs\" -DPACKAGE_VERSION=\"0.1.0\" -DPACKAGE_STRING=\"fuse_dfs\ 0.1.0\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE_URL=\"\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DGETGROUPS_T=gid_t -DHAVE_GETGROUPS=1 -DGETGROUPS_T=gid_t -I.  -DPERMS=1 -D_FILE_OFFSET_BITS=64 -I/include -I/Users/charlescearl/Development/hadoop-0.20.203.0/src/c++/libhdfs/ -I/include/linux/ -D_FUSE_DFS_VERSION=\"0.1.0\" -DPROTECTED_PATHS=\"\" -I/include   -Wall -O3 -MT fuse_options.o -MD -MP -MF .deps/fuse_options.Tpo -c -o fuse_options.o fuse_options.c
     [exec] mv -f .deps/fuse_options.Tpo .deps/fuse_options.Po
     [exec] gcc -DPACKAGE_NAME=\"fuse_dfs\" -DPACKAGE_TARNAME=\"fuse_dfs\" -DPACKAGE_VERSION=\"0.1.0\" -DPACKAGE_STRING=\"fuse_dfs\ 0.1.0\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE_URL=\"\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DGETGROUPS_T=gid_t -DHAVE_GETGROUPS=1 -DGETGROUPS_T=gid_t -I.  -DPERMS=1 -D_FILE_OFFSET_BITS=64 -I/include -I/Users/charlescearl/Development/hadoop-0.20.203.0/src/c++/libhdfs/ -I/include/linux/ -D_FUSE_DFS_VERSION=\"0.1.0\" -DPROTECTED_PATHS=\"\" -I/include   -Wall -O3 -MT fuse_trash.o -MD -MP -MF .deps/fuse_trash.Tpo -c -o fuse_trash.o fuse_trash.c
     [exec] mv -f .deps/fuse_trash.Tpo .deps/fuse_trash.Po
     [exec] gcc -DPACKAGE_NAME=\"fuse_dfs\" -DPACKAGE_TARNAME=\"fuse_dfs\" -DPACKAGE_VERSION=\"0.1.0\" -DPACKAGE_STRING=\"fuse_dfs\ 0.1.0\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE_URL=\"\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DGETGROUPS_T=gid_t -DHAVE_GETGROUPS=1 -DGETGROUPS_T=gid_t -I.  -DPERMS=1 -D_FILE_OFFSET_BITS=64 -I/include -I/Users/charlescearl/Development/hadoop-0.20.203.0/src/c++/libhdfs/ -I/include/linux/ -D_FUSE_DFS_VERSION=\"0.1.0\" -DPROTECTED_PATHS=\"\" -I/include   -Wall -O3 -MT fuse_stat_struct.o -MD -MP -MF .deps/fuse_stat_struct.Tpo -c -o fuse_stat_struct.o fuse_stat_struct.c
     [exec] mv -f .deps/fuse_stat_struct.Tpo .deps/fuse_stat_struct.Po
     [exec] gcc -DPACKAGE_NAME=\"fuse_dfs\" -DPACKAGE_TARNAME=\"fuse_dfs\" -DPACKAGE_VERSION=\"0.1.0\" -DPACKAGE_STRING=\"fuse_dfs\ 0.1.0\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE_URL=\"\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DGETGROUPS_T=gid_t -DHAVE_GETGROUPS=1 -DGETGROUPS_T=gid_t -I.  -DPERMS=1 -D_FILE_OFFSET_BITS=64 -I/include -I/Users/charlescearl/Development/hadoop-0.20.203.0/src/c++/libhdfs/ -I/include/linux/ -D_FUSE_DFS_VERSION=\"0.1.0\" -DPROTECTED_PATHS=\"\" -I/include   -Wall -O3 -MT fuse_users.o -MD -MP -MF .deps/fuse_users.Tpo -c -o fuse_users.o fuse_users.c
     [exec] fuse_users.c: In function ‘getGroups’:
     [exec] fuse_users.c:183: warning: implicit declaration of function ‘getgrouplist’
     [exec] mv -f .deps/fuse_users.Tpo .deps/fuse_users.Po
     [exec] gcc -DPACKAGE_NAME=\"fuse_dfs\" -DPACKAGE_TARNAME=\"fuse_dfs\" -DPACKAGE_VERSION=\"0.1.0\" -DPACKAGE_STRING=\"fuse_dfs\ 0.1.0\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE_URL=\"\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DGETGROUPS_T=gid_t -DHAVE_GETGROUPS=1 -DGETGROUPS_T=gid_t -I.  -DPERMS=1 -D_FILE_OFFSET_BITS=64 -I/include -I/Users/charlescearl/Development/hadoop-0.20.203.0/src/c++/libhdfs/ -I/include/linux/ -D_FUSE_DFS_VERSION=\"0.1.0\" -DPROTECTED_PATHS=\"\" -I/include   -Wall -O3 -MT fuse_init.o -MD -MP -MF .deps/fuse_init.Tpo -c -o fuse_init.o fuse_init.c
     [exec] fuse_connect.c: In function ‘doConnectAsUser’:
     [exec] fuse_connect.c:40: error: too many arguments to function ‘hdfsConnectAsUser’
     [exec] make[1]: *** [fuse_connect.o] Error 1
     [exec] make: *** [all-recursive] Error 1
     [exec] mv -f .deps/fuse_init.Tpo .deps/fuse_init.Po
     [exec] gcc -DPACKAGE_NAME=\"fuse_dfs\" -DPACKAGE_TARNAME=\"fuse_dfs\" -DPACKAGE_VERSION=\"0.1.0\" -DPACKAGE_STRING=\"fuse_dfs\ 0.1.0\" -DPACKAGE_BUGREPORT=\"\" -DPACKAGE_URL=\"\" -DSTDC_HEADERS=1 -DHAVE_SYS_TYPES_H=1 -DHAVE_SYS_STAT_H=1 -DHAVE_STDLIB_H=1 -DHAVE_STRING_H=1 -DHAVE_MEMORY_H=1 -DHAVE_STRINGS_H=1 -DHAVE_INTTYPES_H=1 -DHAVE_STDINT_H=1 -DHAVE_UNISTD_H=1 -DGETGROUPS_T=gid_t -DHAVE_GETGROUPS=1 -DGETGROUPS_T=gid_t -I.  -DPERMS=1 -D_FILE_OFFSET_BITS=64 -I/include -I/Users/charlescearl/Development/hadoop-0.20.203.0/src/c++/libhdfs/ -I/include/linux/ -D_FUSE_DFS_VERSION=\"0.1.0\" -DPROTECTED_PATHS=\"\" -I/include   -Wall -O3 -MT fuse_connect.o -MD -MP -MF .deps/fuse_connect.Tpo -c -o fuse_connect.o fuse_connect.c

BUILD FAILED
/Users/charlescearl/Development/hadoop-0.20.203.0/build.xml:614: The following error occurred while executing this line:
/Users/charlescearl/Development/hadoop-0.20.203.0/src/contrib/build.xml:30: The following error occurred while executing this line:
/Users/charlescearl/Development/hadoop-0.20.203.0/src/contrib/fuse-dfs/build.xml:57: exec returned: 2



Re: Problems compiling HDFS FUSE on hadoop-0.20.203.0

Posted by Charles Earl <ch...@me.com>.
Finally entered the JIRA today. It's my first one, so please excuse any departures from best practices.
   https://issues.apache.org/jira/browse/HDFS-2325
C
On Sep 3, 2011, at 11:24 AM, Arun Murthy wrote:

> Thanks Charles!
> 
> Sent from my iPhone
> 
> On Sep 3, 2011, at 5:41 AM, Charles Earl <ch...@me.com> wrote:
> 
>> Arun,
>> I have been able to get fuse-dfs to compile by changing the signature of hdfsConnectAsUser in src/c++/libhdfs/ from
>>    hdfsFS hdfsConnectAsUser(const char* host, tPort port, const char *user);
>> to
>>    hdfsFS hdfsConnectAsUser(const char* host, tPort port, const char *user, const char** groups, int numgroups);
>> to match the signature in
>>    fuse_connect.c
>> I have not run any of the tests with this, but it now works for mounting HDFS.
>> I will create a JIRA for it this weekend.
>> Charles
>> On Sep 1, 2011, at 3:23 PM, Charles Earl wrote:
>> 
>>> I was afraid you would say that.
>>> Yes, I will open jira.
>>> Charles
>>> 
>>> On Sep 1, 2011, at 2:58 PM, Arun C Murthy <ac...@hortonworks.com> wrote:
>>> 
>>>> Charles, can you please open a jira?
>>>> 
>>>> http://wiki.apache.org/hadoop/HowToContribute
>>>> 
>>>> thanks,
>>>> Arun
>>>> 
>>>> PS: We'd love a patch too! :)
>>>> 
>>>> 
>>>> On Sep 1, 2011, at 9:07 AM, Charles Earl wrote:
>>>> 
>>>>> Hi,
>>>>> I noted there was a similar thread, it does not look as though it was resolved, but I am still getting the errors that were described. 
>>>>> Is there any patch that will address this?
>>>>> Instructions at  http://wiki.apache.org/hadoop/MountableHDFS followed


Re: Problems compiling HDFS FUSE on hadoop-0.20.203.0

Posted by Arun Murthy <ac...@hortonworks.com>.
Thanks Charles!

Sent from my iPhone

On Sep 3, 2011, at 5:41 AM, Charles Earl <ch...@me.com> wrote:


Re: Problems compiling HDFS FUSE on hadoop-0.20.203.0

Posted by Charles Earl <ch...@me.com>.
Arun,
I have been able to get fuse-dfs to compile by changing the signature of hdfsConnectAsUser in src/c++/libhdfs/hdfs.h from
   hdfsFS hdfsConnectAsUser(const char* host, tPort port, const char *user);
to
   hdfsFS hdfsConnectAsUser(const char* host, tPort port, const char *user, const char** groups, int numgroups);
so that it matches the call in fuse_connect.c.
I have not run any of the tests with this, but it now works for mounting HDFS.
I will create a jira for it this weekend.
Charles
On Sep 1, 2011, at 3:23 PM, Charles Earl wrote:

> I was afraid you would say that.
> Yes, I will open jira.
> Charles
> 
> On Sep 1, 2011, at 2:58 PM, Arun C Murthy <ac...@hortonworks.com> wrote:
> 
>> Charles, can you please open a jira?
>> 
>> http://wiki.apache.org/hadoop/HowToContribute
>> 
>> thanks,
>> Arun
>> 
>> PS: We'd love a patch too! :)

Re: Problems compiling HDFS FUSE on hadoop-0.20.203.0

Posted by Charles Earl <ch...@me.com>.
I was afraid you would say that.
Yes, I will open a jira.
Charles

On Sep 1, 2011, at 2:58 PM, Arun C Murthy <ac...@hortonworks.com> wrote:

> Charles, can you please open a jira?
> 
> http://wiki.apache.org/hadoop/HowToContribute
> 
> thanks,
> Arun
> 
> PS: We'd love a patch too! :)
> 

Re: Problems compiling HDFS FUSE on hadoop-0.20.203.0

Posted by Arun C Murthy <ac...@hortonworks.com>.
Charles, can you please open a jira?

http://wiki.apache.org/hadoop/HowToContribute

thanks,
Arun

PS: We'd love a patch too! :)

