Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2016/01/19 10:12:39 UTC

[jira] [Resolved] (SPARK-12876) Race condition when driver is rapidly shut down after starting.

     [ https://issues.apache.org/jira/browse/SPARK-12876?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-12876.
-------------------------------
    Resolution: Duplicate

> Race condition when driver is rapidly shut down after starting.
> ---------------------------------------------------------------
>
>                 Key: SPARK-12876
>                 URL: https://issues.apache.org/jira/browse/SPARK-12876
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.4.0
>            Reporter: jeffonia Tung
>            Priority: Minor
>
> This is similar to SPARK-4300, but this time it occasionally happens on the driver side, as the log excerpt and the sketch below show.
> [INFO 2016-01-18 17:12:35 (Logging.scala:59)] Asked to launch driver driver-20160118171237-0009
> [INFO 2016-01-18 17:12:35 (Logging.scala:59)] Copying user jar file:/data/dbcenter/cdh5/spark-1.4.0-bin-hadoop2.4/mylib/spark-ly-streaming-v2-201601141018.jar to /data/dbcenter/cdh5/spark-1.4.0-bin-hadoop2.4/work/driver-20160118171237-0009/spark-ly-streaming-v2-201601141018.jar
> [INFO 2016-01-18 17:12:35 (Logging.scala:59)] Copying /data/dbcenter/cdh5/spark-1.4.0-bin-hadoop2.4/mylib/spark-ly-streaming-v2-201601141018.jar to /data/dbcenter/cdh5/spark-1.4.0-bin-hadoop2.4/work/driver-20160118171237-0009/spark-ly-streaming-v2-201601141018.jar
> [INFO 2016-01-18 17:12:35 (Logging.scala:59)] Launch Command: "/data/dbcenter/jdk1.7.0_79/bin/java" "-cp" ....."org.apache.spark.deploy.worker.DriverWrapper"......
> [INFO 2016-01-18 17:12:39 (Logging.scala:59)] Asked to launch executor app-20160118171240-0256/15 for DirectKafkaStreamingV2
> [INFO 2016-01-18 17:12:39 (Logging.scala:59)] Launch command: "/data/dbcenter/jdk1.7.0_79/bin/java" "-cp"  ....."org.apache.spark.executor.CoarseGrainedExecutorBackend"......
> [INFO 2016-01-18 17:12:49 (Logging.scala:59)] Asked to kill driver driver-20160118164724-0008
> [INFO 2016-01-18 17:12:49 (Logging.scala:59)] Redirection to /data/dbcenter/cdh5/spark-1.4.0-bin-hadoop2.4/work/driver-20160118164724-0008/stdout closed: Stream closed
> [INFO 2016-01-18 17:12:49 (Logging.scala:59)] Asked to kill executor app-20160118164728-0250/15
> [INFO 2016-01-18 17:12:49 (Logging.scala:59)] Runner thread for executor app-20160118164728-0250/15 interrupted
> [INFO 2016-01-18 17:12:49 (Logging.scala:59)] Killing process!
> [ERROR 2016-01-18 17:12:49 (Logging.scala:96)] Error writing stream to file /data/dbcenter/cdh5/spark-1.4.0-bin-hadoop2.4/work/app-20160118164728-0250/15/stdout
> java.io.IOException: Stream closed
>         at java.io.BufferedInputStream.getBufIfOpen(BufferedInputStream.java:162)
>         at java.io.BufferedInputStream.read1(BufferedInputStream.java:272)
>         at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
>         at java.io.FilterInputStream.read(FilterInputStream.java:107)
>         at org.apache.spark.util.logging.FileAppender.appendStreamToFile(FileAppender.scala:70)
>         at org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply$mcV$sp(FileAppender.scala:39)
>         at org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply(FileAppender.scala:39)
>         at org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply(FileAppender.scala:39)
>         at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1772)
>         at org.apache.spark.util.logging.FileAppender$$anon$1.run(FileAppender.scala:38)
> [INFO 2016-01-18 17:12:49 (Logging.scala:59)] Executor app-20160118164728-0250/15 finished with state KILLED exitStatus 143
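
The trace shows the worker's FileAppender thread blocked in appendStreamToFile, reading the killed process's stdout: once the process is destroyed, the stream closes underneath it and the pending read surfaces as "java.io.IOException: Stream closed", which the worker then logs at ERROR even though the kill was deliberate. Below is a minimal sketch of the defensive pattern the SPARK-4300 fix applied on the executor side: flag the appender as stopped before tearing the stream down, and treat an IOException that arrives after the flag is set as expected shutdown noise rather than an error. This is illustrative only, not the actual Spark code; the names TolerantAppender and markedForStop are assumptions for the example.

    import java.io.{FileOutputStream, IOException, InputStream}

    // Illustrative sketch only: TolerantAppender is a hypothetical class, not
    // part of Spark. It copies a process's output stream to a file and ignores
    // the IOException raised when the stream is closed during stop().
    class TolerantAppender(in: InputStream, path: String) {
      @volatile private var markedForStop = false

      private val thread = new Thread(s"appender-$path") {
        override def run(): Unit = {
          val out = new FileOutputStream(path, true)
          val buf = new Array[Byte](4096)
          try {
            var n = in.read(buf)
            while (n != -1) {
              out.write(buf, 0, n)
              n = in.read(buf)
            }
          } catch {
            // If stop() raced with a blocked read(), the closed stream throws
            // IOException; once markedForStop is set this is expected, so it
            // is swallowed instead of logged as "Error writing stream to file".
            case _: IOException if markedForStop => ()
          } finally {
            out.close()
          }
        }
      }
      thread.setDaemon(true)
      thread.start()

      def stop(): Unit = {
        markedForStop = true // set the flag before the stream is torn down
        in.close()           // unblock any pending read()
        thread.join()
      }
    }

With a guard like this, a benign shutdown race produces no output at all; whether the remaining driver-side window needs the same treatment is what the issue this one duplicates tracks.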



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org