Posted to common-user@hadoop.apache.org by Shashank Agarwal <sh...@gmail.com> on 2009/09/16 06:51:15 UTC

Reg: Error with "ant compile-contrib -Dlibhdfs=1 -Dfusedfs=1"

Hi,

I am trying to run this build target with the libhdfs and fusedfs flags, but
the build fails with
----------------------
/home/shashank/Desktop/LabWork/hadoop-0.20.0/build.xml:497: The following
error occurred while executing this line:
/home/shashank/Desktop/LabWork/hadoop-0.20.0/src/contrib/build.xml:30: The
following error occurred while executing this line:
/home/shashank/Desktop/LabWork/hadoop-0.20.0/src/contrib/fuse-dfs/build.xml:54:
exec returned: 1
---------------------------------

I visited several threads for a fix, but nothing worked.
Based on suggestions, I ran
"ant compile-c++-libhdfs -Dislibhdfs=1" followed by "ant package",
and then "ant compile-contrib -Dlibhdfs=1 -Dfusedfs=1".
Everything worked fine until I hit the above-mentioned error.
Some threads suggested adding /usr/local/lib to
/etc/ld.so.conf. I did that as well, but no luck.

I tried on hadoop 0.20.0 and 0.20.1.

Can anyone help me with this?

Thanks

Re: Reg: Error with "ant compile-contrib -Dlibhdfs=1 -Dfusedfs=1"

Posted by Ted Yu <yu...@gmail.com>.
In the version of fuse.h I see:
    int (*statfs) (const char *, struct statfs *);

In hadoop/src/contrib/fuse-dfs/src/fuse_impls.h, I see:
int dfs_statfs(const char *path, struct statvfs *st);

If someone can provide a version of fuse.h that matches how fuse_impls.h
uses it, that would help.

On Wed, Sep 16, 2009 at 4:51 PM, Matt Massie <ma...@cloudera.com> wrote:


Re: Reg: Error with "ant compile-contrib -Dlibhdfs=1 -Dfusedfs=1"

Posted by Matt Massie <ma...@cloudera.com>.
This is C.  This is a common way to set callbacks in a struct.  Linux source
code is full of syntax like this.

-Matt

On Wed, Sep 16, 2009 at 4:44 PM, Ted Yu <yu...@gmail.com> wrote:


Re: Reg: Error with "ant compile-contrib -Dlibhdfs=1 -Dfusedfs=1"

Posted by Ted Yu <yu...@gmail.com>.
I found this in fuse_dfs.c:

static struct fuse_operations dfs_oper = {
  .getattr    = dfs_getattr,
  .access    = dfs_access,
  .readdir    = dfs_readdir,
  .destroy       = dfs_destroy,
  .init         = dfs_init,

I am wondering what syntax this is.

On Tue, Sep 15, 2009 at 9:51 PM, Shashank Agarwal <
shashankagarwal1706@gmail.com> wrote:
