Posted to common-user@hadoop.apache.org by hadooprecruit <sh...@gmail.com> on 2011/02/22 20:52:03 UTC

Re: fuse-dfs

Hadoop Expertise needed in Santa Clara, CA

Position: Principal Software Engineer, Hadoop / VM

Primary Location: Santa Clara, CA
Company: global enterprise storage company

Job Description: 

Join a dynamic and innovative team and lead the design, development, and
support of next-generation enterprise products for Big Data. This new
initiative will include Hadoop-powered big-data analytics and management
capabilities for business and IT, using virtual machine technologies and
cloud infrastructure as building blocks. We are looking for leaders to bring
their unique expertise to build and expand this key initiative, which will
help both customers and end users gain the benefits of data mining over
unstructured data sets. This is a ground-floor opportunity to work on
cutting-edge technology with a large, relevant addressable market.

Job Functions: 

- Design and develop key software product components

- Ensure the quality of contributions through peer design reviews, automated
unit tests, and coordination of feature and system testing with the QA team

- Troubleshoot problems from system test and the field

Qualifications: 

- Proven track record of 5 or more years of designing and implementing
large, scalable systems

- Proven track record of 3 or more years of leading the architecture,
design, and development of enterprise software

- Understanding of distributed systems, MapReduce algorithms, Hadoop,
object-oriented programming, and performance-optimization techniques.
Hadoop development experience is a plus.

- Familiarity with the Hadoop ecosystem. Knowledge of HBase, Pig, and Hive
a plus.

- Understanding of and hands-on experience with virtual machine technology
and hypervisors. Experience with KVM and Xen is a plus.

- Database server development experience

- Web application development experience

- Data warehouse and analytics experience

- Ability to work with customers, understand customer business
requirements, and communicate them to a development organization

- Strong Java development and object-oriented programming skills

- Strong C++ development skills

Please email if interested: shealeneh@gmail.com

Thanks!!



Sebastian Vieira-2 wrote:
> 
> Hi,
> 
> I have installed Hadoop on 20 nodes (data storage) and one master
> (namenode) to which I want to add data. I have learned that this is
> possible through a Java API or via the Hadoop shell. However, I would
> like to mount the HDFS using FUSE, and I discovered that there's a
> contrib/fuse-dfs within the Hadoop tar.gz package. I read the README
> file but was unable to compile using this command:
> 
> ant compile-contrib -Dcompile.c++=1 -Dfusedfs=1
> 
> If I change the line to:
> 
> ant compile-contrib -Dcompile.c++=1 -Dlibhdfs-fuse=1
> 
> It goes a little bit further. It will now start the configure script, but
> it still fails. I've tried a lot of different things, but I'm unable to
> compile fuse-dfs. This is a piece of the error I get from ant:
> 
> compile:
>      [echo] contrib: fuse-dfs
> -snip-
>      [exec] Making all in src
>      [exec] make[1]: Entering directory
> `/usr/local/src/hadoop-core-trunk/src/contrib/fuse-dfs/src'
>      [exec] gcc  -Wall -O3
> -L/usr/local/src/hadoop-core-trunk/build/libhdfs
> -lhdfs -L/usr/lib -lfuse -L/usr/java/jdk1.6.0_07/jre/lib/i386/server -ljvm
> -o fuse_dfs  fuse_dfs.o
>      [exec] /usr/bin/ld: cannot find -lhdfs
>      [exec] collect2: ld returned 1 exit status
>      [exec] make[1]: *** [fuse_dfs] Error 1
>      [exec] make[1]: Leaving directory
> `/usr/local/src/hadoop-core-trunk/src/contrib/fuse-dfs/src'
>      [exec] make: *** [all-recursive] Error 1
> 
> BUILD FAILED
> /usr/local/src/hadoop-core-trunk/build.xml:413: The following error
> occurred
> while executing this line:
> /usr/local/src/hadoop-core-trunk/src/contrib/build.xml:30: The following
> error occurred while executing this line:
> /usr/local/src/hadoop-core-trunk/src/contrib/fuse-dfs/build.xml:40: exec
> returned: 2
> 
> 
> Could somebody shed some light on this?
> 
> 
> thanks,
> 
> Sebastian.
> 
> 
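For what it's worth, the `/usr/bin/ld: cannot find -lhdfs` failure quoted
above usually means the linker could not find a built libhdfs.so in the
`build/libhdfs` directory the Makefile passes via `-L` — i.e. libhdfs was
never compiled before fuse-dfs was linked. A rough sketch of the usual
workaround is below; note that the exact ant target and property names
varied between Hadoop releases, so treat these as assumptions to verify
against the build.xml in your own source tree:

```shell
# Sketch only: target/property names are assumptions -- check your build.xml.
# 1. Build libhdfs first, so build/libhdfs/libhdfs.so exists:
ant compile-c++-libhdfs -Dlibhdfs=1

# 2. Then build the fuse-dfs contrib, which links against -lhdfs:
ant compile-contrib -Dlibhdfs=1 -Dfusedfs=1

# 3. Sanity check that the library the linker wants is actually there:
ls build/libhdfs/libhdfs.so
```

If step 1 succeeds, the `-L.../build/libhdfs -lhdfs` flags already present in
the failing gcc line should then resolve without further changes.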

-- 
View this message in context: http://old.nabble.com/fuse-dfs-tp18849722p30989045.html
Sent from the Hadoop core-user mailing list archive at Nabble.com.