Posted to common-user@hadoop.apache.org by Rahul Sood <rs...@yahoo-inc.com> on 2008/02/20 14:22:56 UTC

deploying Pipes API app

Hi,

We're trying to deploy a C++ MapReduce app developed with the Hadoop
Pipes API on a large cluster. The app fails to start because its
shared libs are not present on the cluster nodes. For local testing we
set up a 2-node cluster on our dev boxes, where all libs are in a
standard location, /home/myapp/lib, and LD_LIBRARY_PATH is set to this
path. The app runs without any problems there.

Are there any general procedures for deploying a C++ app on a Hadoop
cluster? Ideally I'd like to just copy the libs to HDFS and let the
framework move them to the nodes where the map/reduce tasks run. The
libs should also be removed from the nodes after the tasks have
completed.

Thanks,

Rahul Sood
Advanced Tech Group
Yahoo, Bangalore


Re: deploying Pipes API app

Posted by Arun C Murthy <ac...@yahoo-inc.com>.
On Feb 20, 2008, at 5:22 AM, Rahul Sood wrote:

>
> Are there any general procedures for deploying a C++ app on a Hadoop
> cluster? Ideally I'd like to just copy the libs to HDFS and let the
> framework move them to the nodes where the map/reduce tasks run. The
> libs should also be removed from the nodes after the tasks have
> completed.
>

Unfortunately, no. If you were writing Java apps you could use
System.load or System.loadLibrary...
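
For context, a minimal sketch of that Java-side loading (the class
name and library file below are illustrative only; it assumes the .so
is already at a known local path on each node):

    // Illustrative sketch: loading a native shared library from Java.
    public class NativeLoader {
        static {
            // System.load takes an absolute path to the library file:
            System.load("/home/myapp/lib/libmyapp.so");

            // Alternatively, System.loadLibrary takes the bare name
            // and resolves "libmyapp.so" via java.library.path:
            // System.loadLibrary("myapp");
        }
    }

Pipes tasks have no equivalent hook, so the C++ side can't pull its
shared libs in the same way.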

I've opened https://issues.apache.org/jira/browse/HADOOP-2867 to track
this enhancement.

Arun

> Thanks,
>
> Rahul Sood
> Advanced Tech Group
> Yahoo, Bangalore
>