Posted to issues@spark.apache.org by "ASF GitHub Bot (JIRA)" <ji...@apache.org> on 2015/08/23 22:57:45 UTC
[jira] [Commented] (SPARK-10039) Resetting REPL state does not work
[ https://issues.apache.org/jira/browse/SPARK-10039?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14708537#comment-14708537 ]
ASF GitHub Bot commented on SPARK-10039:
----------------------------------------
Github user felixcheung commented on the pull request:
https://github.com/apache/incubator-zeppelin/pull/228#issuecomment-133934888
We could call `sc.stop()` or `SparkIMain.reset()`?
Though apparently SparkIMain's reset() has an issue: https://issues.apache.org/jira/browse/SPARK-10039
> Resetting REPL state does not work
> -----------------------------------
>
> Key: SPARK-10039
> URL: https://issues.apache.org/jira/browse/SPARK-10039
> Project: Spark
> Issue Type: Bug
> Components: Spark Shell
> Affects Versions: 1.4.1
> Reporter: Kevin Jung
> Priority: Minor
>
> Spark shell can't find the base directory of the class server after running the ":reset" command.
> {quote}
> scala> :reset
> scala> 1
> uncaught exception during compilation: java.lang.AssertionError
> java.lang.AssertionError: assertion failed: Tried to find '$line33' in '/tmp/spark-f47f3917-ac31-4138-bf1a-a8cefd094ac3' but it is not a directory
> (after this, no further commands work, not even 'exit')
> {quote}
> I found that the reset() method in SparkIMain tries to delete virtualDirectory and then recreate it. However, virtualDirectory.create() creates a file, not a directory. Details below.
> {quote}
> drwxrwxr-x. 3 root root 0 2015-08-17 09:09 spark-9cfc6b06-c902-4caf-8712-9ea63f17d017
> (After :reset)
> \-rw-rw-r--. 1 root root 0 2015-08-17 09:09 spark-9cfc6b06-c902-4caf-8712-9ea63f17d017
> {quote}
> "vd.delete; vd.givenPath.createDirectory(true);" will temporarily solve the problem.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org