Posted to common-user@hadoop.apache.org by Adarsh Sharma <ad...@orkash.com> on 2011/03/14 13:11:00 UTC
Not able to Run C++ code in Hadoop Cluster
Dear all,
I am puzzled by an error that occurs when running a C++ program through
Hadoop Pipes. The exception below is thrown during the reduce phase:
[hadoop@ws37-mah-lin hadoop-0.20.2]$ bin/hadoop pipes -D hadoop.pipes.java.recordreader=true -D hadoop.pipes.java.recordwriter=true -input gutenberg -output gutenberg_cuda_output_final -program bin/wordcount1
11/03/14 17:27:29 WARN mapred.JobClient: No job jar file set. User classes may not be found. See JobConf(Class) or JobConf#setJar(String).
11/03/14 17:27:29 INFO mapred.FileInputFormat: Total input paths to process : 3
11/03/14 17:27:30 INFO mapred.JobClient: Running job: job_201103141407_0003
11/03/14 17:27:31 INFO mapred.JobClient: map 0% reduce 0%
11/03/14 17:27:46 INFO mapred.JobClient: map 100% reduce 0%
11/03/14 17:27:54 INFO mapred.JobClient: map 100% reduce 33%
11/03/14 17:27:56 INFO mapred.JobClient: Task Id : attempt_201103141407_0003_r_000000_0, Status : FAILED
java.net.SocketException: Broken pipe
    at java.net.SocketOutputStream.socketWrite0(Native Method)
    at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:92)
    at java.net.SocketOutputStream.write(SocketOutputStream.java:136)
    at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:65)
    at java.io.BufferedOutputStream.write(BufferedOutputStream.java:109)
    at java.io.DataOutputStream.write(DataOutputStream.java:90)
    at org.apache.hadoop.mapred.pipes.BinaryProtocol.writeObject(BinaryProtocol.java:333)
    at org.apache.hadoop.mapred.pipes.BinaryProtocol.reduceValue(BinaryProtocol.java:302)
    at org.apache.hadoop.mapred.pipes.PipesReducer.reduce(PipesReducer.java:66)
    at org.apache.hadoop.mapred.pipes.PipesReducer.reduce(PipesReducer.java:37)
    at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:463)
    at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:411)
    at org.apache.hadoop.mapred.Child.main(Child.java:170)
11/03/14 17:27:57 INFO mapred.JobClient: map 100% reduce 0%
11/03/14 17:28:07 INFO mapred.JobClient: map 100% reduce 33%
11/03/14 17:28:09 INFO mapred.JobClient: Task Id : attempt_201103141407_0003_r_000000_1, Status : FAILED
java.net.SocketException: Broken pipe
    at java.net.SocketOutputStream.socketWrite0(Native Method)
    at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:92)
    at java.net.SocketOutputStream.write(SocketOutputStream.java:136)
    at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:65)
    at java.io.BufferedOutputStream.write(BufferedOutputStream.java:109)
    at java.io.DataOutputStream.write(DataOutputStream.java:90)
    at org.apache.hadoop.mapred.pipes.BinaryProtocol.writeObject(BinaryProtocol.java:333)
    at org.apache.hadoop.mapred.pipes.BinaryProtocol.reduceValue(BinaryProtocol.java:302)
    at org.apache.hadoop.mapred.pipes.PipesReducer.reduce(PipesReducer.java:66)
    at org.apache.hadoop.mapred.pipes.PipesReducer.reduce(PipesReducer.java:37)
    at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:463)
    at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:411)
    at org.apache.hadoop.mapred.Child.main(Child.java:170)
I have attached the code; please see the attachment.
Thanks & Best Regards,
Adarsh Sharma
Re: Not able to Run C++ code in Hadoop Cluster
Posted by Adarsh Sharma <ad...@orkash.com>.
Is it possible to run C++/GPU code in the MapReduce framework through
Hadoop Streaming? If there is a simple example, please let me know.
Thanks & Best Regards,
Adarsh Sharma
He Chen wrote:
> Agree with Keith Wiley, we use streaming also.
>
> On Mon, Mar 14, 2011 at 11:40 AM, Keith Wiley <kw...@keithwiley.com> wrote:
>
>
>> Not to speak against pipes because I don't have much experience with it,
>> but I eventually abandoned my pipes efforts and went with streaming. If you
>> don't get pipes to work, you might take a look at streaming as an
>> alternative.
>>
>> Cheers!
>>
>>
>> ________________________________________________________________________________
>> Keith Wiley kwiley@keithwiley.com keithwiley.com
>> music.keithwiley.com
>>
>> "I used to be with it, but then they changed what it was. Now, what I'm
>> with
>> isn't it, and what's it seems weird and scary to me."
>> -- Abe (Grandpa) Simpson
>>
>> ________________________________________________________________________________
>>
>>
>>
>
>
Re: Not able to Run C++ code in Hadoop Cluster
Posted by He Chen <ai...@gmail.com>.
Agree with Keith Wiley, we use streaming also.
On Mon, Mar 14, 2011 at 11:40 AM, Keith Wiley <kw...@keithwiley.com> wrote:
> Not to speak against pipes because I don't have much experience with it,
> but I eventually abandoned my pipes efforts and went with streaming. If you
> don't get pipes to work, you might take a look at streaming as an
> alternative.
>
> Cheers!
>
>
> ________________________________________________________________________________
> Keith Wiley kwiley@keithwiley.com keithwiley.com
> music.keithwiley.com
>
> "I used to be with it, but then they changed what it was. Now, what I'm
> with
> isn't it, and what's it seems weird and scary to me."
> -- Abe (Grandpa) Simpson
>
> ________________________________________________________________________________
>
>
Re: Not able to Run C++ code in Hadoop Cluster
Posted by Keith Wiley <kw...@keithwiley.com>.
Not to speak against pipes because I don't have much experience with it, but I eventually abandoned my pipes efforts and went with streaming. If you don't get pipes to work, you might take a look at streaming as an alternative.
Cheers!
________________________________________________________________________________
Keith Wiley kwiley@keithwiley.com keithwiley.com music.keithwiley.com
"I used to be with it, but then they changed what it was. Now, what I'm with
isn't it, and what's it seems weird and scary to me."
-- Abe (Grandpa) Simpson
________________________________________________________________________________