Posted to user@spark.apache.org by Shixiong Zhu <zs...@gmail.com> on 2015/10/01 08:28:23 UTC

Re: Spark Streaming Standalone 1.5 - Stage cancelled because SparkContext was shut down

Do you have the log? It looks like some exceptions in your code caused the
SparkContext to stop.
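
For reference, the setup described in the quoted message below (standalone cluster mode with supervised restart and dynamic allocation) would typically be launched with something like the following sketch. The master URL, class name, and jar path are placeholders, not taken from the original thread:

```shell
# Hypothetical launch command; master URL, class, and jar are placeholders.
# --supervise asks the standalone master to restart the driver if it exits,
# matching the automatic-restart behavior described below.
# Dynamic allocation also requires the external shuffle service.
spark-submit \
  --master spark://master:7077 \
  --deploy-mode cluster \
  --supervise \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.shuffle.service.enabled=true \
  --class com.example.MyStreamingApp \
  my-streaming-app.jar
```

With this setup, the driver's stdout/stderr (and the cause of any SparkContext shutdown) end up in the worker's work directory on the node that ran the driver, which is the log worth checking here.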

Best Regards,
Shixiong Zhu

2015-09-30 17:30 GMT+08:00 tranan <tr...@gmail.com>:

> Hello All,
>
> I have several Spark Streaming applications running in standalone mode on
> Spark 1.5.  Spark is currently set up for dynamic resource allocation.  I
> can have about 12 Spark Streaming jobs running concurrently, and
> occasionally I see more than half of them fail with "Stage cancelled
> because SparkContext was shut down".  They restart automatically because
> they run in supervised mode.  Attached is a screenshot of one of the jobs
> that failed.  Does anyone have any insight into what is going on?
> <http://apache-spark-user-list.1001560.n3.nabble.com/file/n24885/Screen_Shot_2015-09-29_at_8.png>
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Streaming-Standalone-1-5-Stage-cancelled-because-SparkContext-was-shut-down-tp24885.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
>