Posted to user@spark.apache.org by ryaminal <ta...@gmail.com> on 2014/11/07 21:17:08 UTC

Multiple Applications (Spark Contexts) Concurrently Fail With Broadcast Error

We are unable to run more than one application at a time using Spark 1.0.0 on
CDH5. We submit two applications, each with its own SparkContext, to the same
Spark master. The master is running in standalone mode and was started with
the following command and parameters:

> /usr/java/jdk1.7.0_55-cloudera/bin/java -XX:MaxPermSize=128m
> -Djava.net.preferIPv4Stack=true -Dspark.akka.logLifecycleEvents=true
> -Xms8589934592 -Xmx8589934592 org.apache.spark.deploy.master.Master
> --ip ip-10-186-155-45.ec2.internal
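
For reference, each application's driver looks roughly like the sketch below.
The app name, data, and broadcast payload are placeholders, not our real job,
and the master URL assumes the default standalone port 7077:

    import org.apache.spark.{SparkConf, SparkContext}

    object AppOne {
      def main(args: Array[String]): Unit = {
        // Each application builds its own SparkContext against the same
        // standalone master (assumes the default master port 7077).
        val conf = new SparkConf()
          .setMaster("spark://ip-10-186-155-45.ec2.internal:7077")
          .setAppName("app-one") // placeholder name
        val sc = new SparkContext(conf)
        try {
          // Representative work only: a broadcast variable plus a simple
          // job, since the errors we see are broadcast errors.
          val lookup = sc.broadcast(Map(0 -> "even", 1 -> "odd"))
          val counts = sc.parallelize(1 to 1000000)
            .map(i => lookup.value(i % 2))
            .countByValue()
          println(counts)
        } finally {
          sc.stop()
        }
      }
    }

Both drivers are submitted independently (e.g., via spark-submit), so each
gets its own application ID on the master.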

When submitted by itself, either application finishes and all of the data
comes out happy. The problem occurs when we try to run a second application
while an existing one is still processing: we get an error stating that the
SparkContext was shut down prematurely. The errors can be viewed in the
following pastebins. All IP addresses have been changed to 1.1.1.1 for
security reasons. Note that at the top of each log we have printed the Spark
config for reference.

The working logs: Working Pastebin <http://pastebin.com/CnitnMhy>
The broken logs: Broken Pastebin <http://pastebin.com/VGs87bBZ>

We have also included the worker logs. For the second app, we see seven
additional directories in the work/app/ directory: `0/ 1/ 2/ 3/ 4/ 5/ 6/`,
which we read as the worker launching seven executors for that one app. The
errors fall into two groups: the logs from the first three directories form
one group and the other four form the second.

Worker log for broken app, group 1: Broken App Group 1
<http://pastebin.com/7VwZ1Gwu>
Worker log for broken app, group 2: Broken App Group 2
<http://pastebin.com/shs4d8T4>
Worker log for the working app: available upon request.
The two errors, each appearing as the last line of its group, are:

> Received LaunchTask command but executor was null

> Slave registration failed: Duplicate executor ID: 4
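
On the driver side, the failure surfaces as a SparkException thrown out of
whatever job is running when the context dies. A sketch of the second driver,
again with placeholder names and payload, showing where we catch it:

    import org.apache.spark.{SparkConf, SparkContext, SparkException}

    object AppTwo {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(
          new SparkConf()
            .setMaster("spark://ip-10-186-155-45.ec2.internal:7077")
            .setAppName("app-two")) // placeholder name
        try {
          // Same pattern as the first app: broadcast a payload, then run
          // a job that reads it on the executors.
          val payload = sc.broadcast(Array.fill(1 << 20)(0.toByte)) // ~1 MB
          val total = sc.parallelize(1 to 100)
            .map(_ => payload.value.length)
            .reduce(_ + _)
          println(total)
        } catch {
          case e: SparkException =>
            // In the broken runs, this is where the broadcast and
            // "SparkContext shut down" errors reach the driver.
            e.printStackTrace()
        } finally {
          sc.stop()
        }
      }
    }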

tl;dr: We are unable to run more than one application at a time on the same
Spark master using different SparkContexts. The only errors we see are
broadcast errors.



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Multiple-Applications-Spark-Contexts-Concurrently-Fail-With-Broadcast-Error-tp18374.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.