Posted to common-user@hadoop.apache.org by Peng Du <im...@gmail.com> on 2010/12/09 18:03:04 UTC

EOFException thrown by a Hadoop pipes program

Hello,

I have a small Hadoop Pipes program that throws a java.io.EOFException. 
The program takes a small text file as input and uses 
hadoop.pipes.java.recordreader and hadoop.pipes.java.recordwriter. The 
input is very simple, like:

    1  262144  42.8084  15.9157  4.1324  0.06  0.1
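For context, the map side parses each such line roughly as follows. This is a simplified, self-contained sketch with plain iostreams; `Record`, `parseLine`, and the field meanings are illustrative names I chose here, not the actual code:

```cpp
#include <sstream>
#include <string>
#include <vector>

// One input line: an id, a count, then a variable number of
// floating-point fields (the field meanings are guesses).
struct Record {
    long id;
    long count;
    std::vector<double> values;
};

// Split a whitespace-separated line into a Record using stream extraction.
Record parseLine(const std::string& line) {
    std::istringstream in(line);
    Record r;
    in >> r.id >> r.count;       // first two integer fields
    double v;
    while (in >> v)              // remaining floating-point fields
        r.values.push_back(v);
    return r;
}
```

In a real Pipes mapper this parsing would happen inside `map()`, but the extraction logic is the same.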

However, Hadoop throws an EOFException, and I can't see the 
reason. Below is the stack trace:

    10/12/08 23:04:04 INFO mapred.JobClient: Running job: job_201012081252_0016
    10/12/08 23:04:05 INFO mapred.JobClient:  map 0% reduce 0%
    10/12/08 23:04:16 INFO mapred.JobClient: Task Id : attempt_201012081252_0016_m_000000_0, Status : FAILED
    java.io.IOException: pipe child exception
         at org.apache.hadoop.mapred.pipes.Application.abort(Application.java:151)
         at org.apache.hadoop.mapred.pipes.PipesMapRunner.run(PipesMapRunner.java:101)
         at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:358)
         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
         at org.apache.hadoop.mapred.Child.main(Child.java:170)
    Caused by: java.io.EOFException
         at java.io.DataInputStream.readByte(DataInputStream.java:267)
         at org.apache.hadoop.io.WritableUtils.readVLong(WritableUtils.java:298)
         at org.apache.hadoop.io.WritableUtils.readVInt(WritableUtils.java:319)
         at org.apache.hadoop.mapred.pipes.BinaryProtocol$UplinkReaderThread.run(BinaryProtocol.java:114)

BTW, I ran this in fully-distributed mode (a cluster with 3 worker nodes).
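In case it matters, the job is submitted the standard way for Pipes, along these lines (the HDFS paths and program name below are placeholders, not the actual ones):

```shell
# Submit a Pipes job; the -D properties tell Hadoop to use the
# Java-side record reader/writer (placeholder paths and binary name).
hadoop pipes \
  -D hadoop.pipes.java.recordreader=true \
  -D hadoop.pipes.java.recordwriter=true \
  -input /user/peng/input \
  -output /user/peng/output \
  -program /user/peng/bin/myprogram
```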

I am stuck, and any help is appreciated. Thanks!


- Peng
