Posted to user@spark.apache.org by Mkal <di...@hotmail.com> on 2019/03/05 11:25:32 UTC

C++ script on Spark Cluster throws exit status 132

I'm trying to run a C++ program on a Spark cluster using the rdd.pipe()
operation, but the executors throw: java.lang.IllegalStateException:
Subprocess exited with status 132.
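
For reference, the pipe call is roughly like the sketch below (simplified;
the binary path and input data are placeholders, not my actual code):

    import org.apache.spark.{SparkConf, SparkContext}

    object PipeExample {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("PipeExample"))
        // Each RDD element is written to the subprocess's stdin as one line;
        // each stdout line from the subprocess becomes an output element.
        val input = sc.parallelize(Seq("1", "2", "3"))
        // Placeholder path: the binary must exist and be executable at this
        // path on every worker node, since pipe() forks it per partition.
        val piped = input.pipe("/path/to/my_cpp_binary")
        piped.collect().foreach(println)
        sc.stop()
      }
    }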

The Spark jar runs fine in standalone mode, and the C++ program runs fine
on its own as well. I've also tried a different, simple C++ program, and it
runs on the cluster without any problem.

As I understand it, exit status 132 means the program hit an Illegal
Instruction, but I don't know how to use this fact to pinpoint the source
of the error.
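
If it helps, my reading of 132 follows the usual Unix convention: a
subprocess killed by a signal is reported as 128 + the signal number, and
signal 4 is SIGILL (illegal instruction), so 132 = 128 + 4. A tiny
hypothetical helper (my own naming, not a Spark API) to illustrate:

    // Decode a waitFor()-style exit status under the 128 + signal convention.
    def decodeExit(status: Int): String =
      if (status > 128) s"killed by signal ${status - 128}" // 132 -> 4 (SIGILL)
      else s"exited normally with code $status"

    // decodeExit(132) == "killed by signal 4"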

Checking the executor logs gives no further information. I'm posting here
hoping someone has a suggestion; I've tried other forums but no luck yet.



