Posted to issues@spark.apache.org by "Tijo Thomas (JIRA)" <ji...@apache.org> on 2015/04/22 13:56:59 UTC
[jira] [Closed] (SPARK-6928) spark-shell stops working after the replay command
[ https://issues.apache.org/jira/browse/SPARK-6928?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Tijo Thomas closed SPARK-6928.
------------------------------
Resolution: Not A Problem
> spark-shell stops working after the replay command
> --------------------------------------------------
>
> Key: SPARK-6928
> URL: https://issues.apache.org/jira/browse/SPARK-6928
> Project: Spark
> Issue Type: Bug
> Components: Spark Shell
> Affects Versions: 1.3.0
> Environment: Scala Version :Scala-2.10
> Reporter: Tijo Thomas
>
> Steps to reproduce this issue:
> Step 1 :
> scala> sc.parallelize(1 to 10).map(_+"2").count();
> res0: Long = 10
> Step 2 :
> scala> :replay
> Replaying: sc.parallelize(1 to 10).map(_+"2").count();
> <console>:8: error: not found: value sc
> sc.parallelize(1 to 10).map(_+"2").count();
> ^
> // Note: after the :replay command, none of the Spark APIs work, as the SparkContext has gone out of scope.
> // e.g., the following exception is thrown on exit:
> scala> exit
> error:
> while compiling: <console>
> during phase: jvm
> library version: version 2.10.4
> compiler version: version 2.10.4
> reconstructed args:
> last tree to typer: Apply(constructor $read)
> symbol: constructor $read in class $read (flags: <method> <triedcooking>)
> symbol definition: def <init>(): $line20.$read
> tpe: $line20.$read
> symbol owners: constructor $read -> class $read -> package $line20
> context owners: class iwC -> package $line20
> ............
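For context, the expression being replayed is a plain transformation over the range 1 to 10; only the `sc.parallelize` call depends on the SparkContext that `:replay` loses. A minimal sketch of the same computation in plain Scala (no Spark required, so the expected count of 10 is independent of the shell bug) is:

```scala
// Plain-Scala equivalent of the replayed expression: append "2" to each
// of the 10 integers as a string, then count the results. This mirrors
// sc.parallelize(1 to 10).map(_+"2").count() without needing a SparkContext.
object ReplayCountSketch {
  def main(args: Array[String]): Unit = {
    val count: Long = (1 to 10).map(_ + "2").size.toLong
    println(count) // prints 10, matching res0 in the transcript above
  }
}
```

This makes it clear the failure is not in the user's pipeline: the same expression counts to 10 anywhere the `sc` binding exists, and fails under `:replay` only because the shell does not re-create `sc`.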
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org