Posted to user@spark.apache.org by lbustelo <gi...@bustelos.com> on 2014/06/13 21:41:35 UTC

spark-submit fails to get jar from http source

I'm running a 1.0.0 standalone cluster based on amplab/dockerscripts with 3
workers. I'm testing out spark-submit, and when I use *--deploy-mode
cluster* with an http:// URL to my JAR, I get the following error back.

Sending launch command to spark://master:7077
Driver successfully submitted as driver-20140613191831-0009
... waiting before polling master for driver state
... polling master for driver state
State of driver-20140613191831-0009 is ERROR
Exception from cluster was: java.io.IOException: No FileSystem for scheme:
http

I verified that my jar URL is accessible from within the Spark nodes
(workers and master). I also ran with the same URL and *--deploy-mode
client*, and that worked.
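For reference, the two invocations I compared look roughly like this. This
is only a sketch: the master URL matches my setup, but the class name, the
file server host, and the jar name are placeholders, not my real values.

```shell
# Sketch only: com.example.MyApp, fileserver, and myapp.jar are
# placeholders. The commands are wrapped in echo so the comparison
# prints without needing a live cluster.

# Cluster mode with an http:// jar URL -- the driver fails with
# "java.io.IOException: No FileSystem for scheme: http":
CLUSTER_CMD='spark-submit --master spark://master:7077 --deploy-mode cluster --class com.example.MyApp http://fileserver:8000/myapp.jar'
echo "$CLUSTER_CMD"

# Client mode with the exact same URL -- works:
CLIENT_CMD='spark-submit --master spark://master:7077 --deploy-mode client --class com.example.MyApp http://fileserver:8000/myapp.jar'
echo "$CLIENT_CMD"
```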

Documentation at
http://spark.apache.org/docs/latest/submitting-applications.html, in the
section /Advanced Dependency Management/, suggests that this should work.

Is this a known issue, or are my expectations wrong?
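In the meantime, the workaround I'm considering is staging the jar under a
scheme the cluster's file-system layer does understand, such as hdfs://. A
minimal sketch, assuming an HDFS namenode is reachable from the cluster;
the host, port, and paths below are placeholders, not my real setup.

```shell
# Hypothetical workaround sketch: namenode host/port and paths are
# placeholders. echo is used so the commands print instead of running.
HDFS_JAR='hdfs://namenode:9000/jars/myapp.jar'

# Stage the jar in HDFS, then submit with the hdfs:// URL instead
# of the http:// one:
echo "hadoop fs -put myapp.jar $HDFS_JAR"
echo "spark-submit --master spark://master:7077 --deploy-mode cluster --class com.example.MyApp $HDFS_JAR"
```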

PS - I find it very limiting that spark-submit does not take care of
uploading my jar to the cluster. This is a fundamental capability that most
frameworks support (e.g. Storm, Hadoop). I do not consider this a
requirement specific to the JobServer work, but rather part of the master's
API.



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/spark-submit-fails-to-get-jar-from-http-source-tp7592.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.