Posted to user@spark.apache.org by "Balachandar R.A." <ba...@gmail.com> on 2016/07/11 12:07:40 UTC

Spark job state is EXITED but does not return

Hello,

I have a simple Apache Spark based use case that processes two datasets.
Each dataset takes about 5-7 minutes to process. I am doing this processing
inside the sc.parallelize(datasets){ } block. While the first dataset is
processed successfully, the processing of the second dataset is never
started by Spark. The application state is RUNNING, but in the executor
summary I notice that the state there is EXITED. Can someone tell me where
things are going wrong?
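
[For context: the original code was not posted, so the following is only a
hypothetical reconstruction of the pattern described above. The dataset
paths, the `process` function, and the app name are placeholder assumptions,
not the poster's actual code.]

    import org.apache.spark.{SparkConf, SparkContext}

    object ProcessDatasets {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("ProcessDatasets"))
        val datasets = Seq("hdfs:///data/set1", "hdfs:///data/set2")  // placeholders

        // Parallelizing the dataset identifiers distributes one task per
        // dataset to the executors; each task runs the (5-7 minute)
        // processing step for its dataset.
        sc.parallelize(datasets).foreach { path =>
          process(path)  // placeholder for the per-dataset processing
        }
        sc.stop()
      }

      def process(path: String): Unit = {
        // ... dataset-specific processing ...
      }
    }

Note that `process` here must not use the SparkContext itself: the closure
passed to `foreach` is serialized to the executors, and `SparkContext` is
not serializable and only exists on the driver.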

Regards
Bala