Posted to user@spark.apache.org by jeremycod <zo...@gmail.com> on 2017/02/17 00:46:19 UTC

Spark Worker can't find jar submitted programmatically

Hi,

I'm trying to create an application that programmatically submits a jar
file to a Spark standalone cluster running on my local PC. However, I
always get this error:

  WARN TaskSetManager:66 - Lost task 1.0 in stage 0.0 (TID 1, 192.168.2.68,
  executor 0): java.lang.RuntimeException: Stream
  '/jars/sample-spark-maven-one-jar.jar' was not found.

I'm creating the SparkContext in the following way:

  val sparkConf = new SparkConf()
  sparkConf.setMaster("spark://zoran-Latitude-E5420:7077")
  sparkConf.set("spark.cores.max", "2")
  sparkConf.set("spark.executor.memory", "2g")
  sparkConf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  sparkConf.setAppName("Test application")
  sparkConf.set("spark.ui.port", "4041")
  sparkConf.set("spark.local.ip", "192.168.2.68")
  val oneJar = "/samplesparkmaven/target/sample-spark-maven-one-jar.jar"
  sparkConf.setJars(List(oneJar))
  val sc = new SparkContext(sparkConf)

I'm using Spark 2.1.0 in standalone mode with a master and one worker.
Does anyone have an idea where the problem might be, or how to
investigate it further?

Thanks,
Zoran
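
P.S. My understanding of the error is that the executors try to fetch
'/jars/sample-spark-maven-one-jar.jar' from the driver, so the path given
to setJars has to resolve to a real file on the driver machine. The
quickest check I can think of (just a sketch, reusing the path from the
snippet above):

  import java.io.File

  // Same path that is passed to setJars above; if this prints false, the
  // driver has nothing to serve and the executors cannot fetch the jar.
  val oneJar = "/samplesparkmaven/target/sample-spark-maven-one-jar.jar"
  val jarFile = new File(oneJar)
  println(s"exists=${jarFile.exists()}  absolutePath=${jarFile.getAbsolutePath}")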



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Worker-can-t-find-jar-submitted-programmatically-tp28398.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

Re: Spark Worker can't find jar submitted programmatically

Posted by Cosmin Posteuca <co...@gmail.com>.
Hi Zoran,

I think you are looking for the --jars argument to spark-submit:

> When using spark-submit, the application jar along with any jars included
> with the --jars option will be automatically transferred to the cluster.
> URLs supplied after --jars must be separated by commas. (
> http://spark.apache.org/docs/latest/submitting-applications.html)


I don't know if this works in standalone mode, but for me it works in YARN mode.
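
If you want to stay programmatic rather than shell out to spark-submit
yourself, the spark-launcher module (SparkLauncher) wraps spark-submit,
including the --jars behaviour via addJar. A rough sketch, with a
placeholder Spark home, placeholder jar paths and a made-up main class:

  import org.apache.spark.launcher.SparkLauncher

  val handle = new SparkLauncher()
    .setSparkHome("/opt/spark")                      // assumption: local Spark install
    .setMaster("spark://zoran-Latitude-E5420:7077")
    .setAppResource("/samplesparkmaven/target/sample-spark-maven-one-jar.jar")
    .setMainClass("com.example.Main")                // hypothetical main class
    .setConf("spark.executor.memory", "2g")
    .addJar("/path/to/extra-dependency.jar")         // programmatic equivalent of --jars
    .startApplication()

  // The handle reports state transitions (SUBMITTED, RUNNING, FINISHED, ...).
  println(s"Launcher state: ${handle.getState}")

As far as I know it runs a real spark-submit under the hood, so whatever
works on the command line should behave the same way here.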

Thanks,
Cosmin
