Posted to user@spark.apache.org by Hu...@Dell.com on 2013/11/11 19:04:29 UTC

Worker: Executor ... finished with state FAILED message Command exited with code 1 exitStatus 1

Hi,
Any suggestions on how to get more info/tracing in the Spark worker logs? I set DEBUG in my /opt/spark-0.8.0/conf/log4j.properties, but the logs still show only INFO, and I need to find out why an executor job fails:
log4j.rootCategory=DEBUG, console
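For context, a minimal conf/log4j.properties with DEBUG enabled might look like the sketch below. The appender lines follow the log4j.properties.template that Spark ships in conf/; the notes in the comments (that the file must exist on each worker node, and that the worker JVM must be restarted to pick up changes, since log4j reads the file once at startup) are my assumptions about why only INFO still appears:

```properties
# Sketch of conf/log4j.properties with DEBUG enabled, modeled on
# Spark's shipped log4j.properties.template.
# Assumption: this file must be present under conf/ on EACH worker
# node, and the worker must be restarted after editing, because the
# JVM loads log4j.properties only once at startup.
log4j.rootCategory=DEBUG, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Optional: quiet especially chatty packages back down to INFO.
# log4j.logger.org.apache.hadoop=INFO
```

Separately from the worker's own log, in standalone mode each executor writes its stdout/stderr under the worker's work directory (e.g. SPARK_HOME/work/app-20131108164655-0000/8/stderr); the actual failure reason for "Command exited with code 1" is often visible there even when the worker log says nothing more.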
From the worker logs, it just shows
13/11/08 16:56:11 INFO Worker: Executor app-20131108164655-0000/8 finished with state FAILED message Command exited with code 1 exitStatus 1

From the main driver, my console shows
org.apache.spark.SparkException: Job failed: Error: Disconnected from Spark cluster
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:760)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:758)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:60)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
        at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:758)
        at org.apache.spark.scheduler.DAGScheduler.processEvent(DAGScheduler.scala:379)
        at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$run(DAGScheduler.scala:441)
        at org.apache.spark.scheduler.DAGScheduler$$anon$1.run(DAGScheduler.scala:149)

Thanks,
Hussam