Posted to user@spark.apache.org by YaoPau <jo...@gmail.com> on 2015/08/24 22:06:50 UTC

Run Spark job from within IPython+Spark?

I set up IPython Notebook to work with the pyspark shell, and now I'd like
to use %run to effectively 'spark-submit' another Python Spark file while
leaving its objects accessible within the notebook.
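
For concreteness, the file I'm trying to %run looks roughly like this
(the file name, app name, and input path are just placeholders):

    # my_spark_job.py
    from pyspark import SparkContext

    sc = SparkContext(appName="my_job")

    # Simple word count, just so the file defines an object I care about.
    counts = (sc.textFile("data.txt")
                .flatMap(lambda line: line.split())
                .map(lambda w: (w, 1))
                .reduceByKey(lambda a, b: a + b))

and then, from a notebook cell:

    %run my_spark_job.py
    counts.take(5)    # I want 'counts' to stay usable here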

I tried this, but got a "ValueError: Cannot run multiple SparkContexts at
once" error.  I then tried removing the 'sc = SparkContext()' line from the
.py file, but then the file couldn't access sc.
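
The only workarounds I can think of are running the file in the notebook's
own namespace with "%run -i my_spark_job.py", so the file can see the
existing sc, or guarding the context creation in the file itself, roughly
like this (assuming my Spark version has SparkContext.getOrCreate()):

    from pyspark import SparkContext

    # Reuse the notebook's existing SparkContext if one is already
    # running; otherwise create a fresh one, so the same file still
    # works under a plain spark-submit.
    sc = SparkContext.getOrCreate()

But I'm not sure either of those is the intended pattern.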

How can I %run another Python Spark file within IPython Notebook?



