Posted to issues@spark.apache.org by "Matthew Farrellee (JIRA)" <ji...@apache.org> on 2014/09/21 17:35:33 UTC
[jira] [Commented] (SPARK-550) Hiding the default spark context in the spark shell creates serialization issues
[ https://issues.apache.org/jira/browse/SPARK-550?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14142477#comment-14142477 ]
Matthew Farrellee commented on SPARK-550:
-----------------------------------------
a lot of code has changed in this space over the past 2 years. i'm going to close this, but feel free to re-open if you feel it's still an issue.
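for anyone landing here later, a minimal sketch of the failure mode in spark-shell (illustrative only: the path is made up, num_splits is pinned to 4, and iirc the shell's own sc is defined @transient, which is why the built-in one doesn't hit this):

    // spark-shell already provides sc, bound to a live SparkContext
    // (and SparkContext is already imported). pasting code that rebinds
    // sc shadows the built-in, transient one with a plain val:
    val sc = new SparkContext("local[4]", "myframework")
    val myRdd = sc.textFile("README.md")
    myRdd.count() // java.io.NotSerializableException: spark.SparkContext
    // the repl wraps each pasted line in an object; when a task closure
    // captures that wrapper, the new non-transient context gets dragged
    // into serialization along with it

and the workaround in a fresh shell, using the predefined context:

    // drop the "new SparkContext" line and use the shell's sc as-is
    sc.textFile("README.md").count()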
> Hiding the default spark context in the spark shell creates serialization issues
> --------------------------------------------------------------------------------
>
> Key: SPARK-550
> URL: https://issues.apache.org/jira/browse/SPARK-550
> Project: Spark
> Issue Type: Bug
> Reporter: tjhunter
>
> I copy-pasted a piece of code along these lines in the spark shell:
> ...
> val sc = new SparkContext("local[%d]" format num_splits, "myframework")
> val my_rdd = sc.textFile(...)
> my_rdd.count()
> This leads to the shell crashing with a java.io.NotSerializableException: spark.SparkContext
> It took me a while to realize it was due to the newly created SparkContext. Maybe a warning/error should be triggered if the user tries to change the definition of sc?
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org