Posted to user@spark.apache.org by '癫、砜' <29...@qq.com> on 2014/03/27 13:07:45 UTC

Spark Pipe wrapException

When I use RDD.pipe("program") to analyze data, Spark throws a wrapped exception. What is strange is that the native program only does "scanf" and "printf". When the data volume is small everything works, but when the data volume increases we get these exceptions.
We tried to analyze the cause: the stack trace reports a socket timeout, and the timeout is about 60 seconds, so we added "dfs.socket.timeout" to "dfs.xml", but it doesn't help.
Here is the error stack; maybe someone has hit the same problem as me. Looking forward to a reply.
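For context on what RDD.pipe does under the hood: Spark writes each element of a partition as one line to the external program's stdin and turns each line the program prints on stdout into an element of the result RDD. This standalone Python sketch (using subprocess, not Spark itself) illustrates that line-per-element contract; "cat" stands in for the native scanf/printf program, and the function name is just for illustration.

```python
import subprocess

def pipe_partition(elements, command):
    """Sketch of RDD.pipe's contract for one partition:
    feed each element as a line on stdin, collect each stdout
    line as an output element."""
    proc = subprocess.Popen(
        command,
        stdin=subprocess.PIPE,
        stdout=subprocess.PIPE,
        text=True,
    )
    # communicate() writes all input, closes stdin, and reads all
    # output; a slow external program that stops consuming stdin is
    # exactly where large inputs can stall and hit timeouts.
    out, _ = proc.communicate("\n".join(elements) + "\n")
    return out.splitlines()

result = pipe_partition(["1", "2", "3"], ["cat"])
print(result)  # "cat" echoes every input line back
```

If the native program buffers its output or blocks on scanf while Spark is still writing, the pipe can back up on large inputs, which is consistent with the problem only appearing at scale.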