Posted to issues@spark.apache.org by "Andrew Or (JIRA)" <ji...@apache.org> on 2014/06/17 01:24:01 UTC
[jira] [Created] (SPARK-2159) Spark shell exit() does not stop SparkContext
Andrew Or created SPARK-2159:
--------------------------------
Summary: Spark shell exit() does not stop SparkContext
Key: SPARK-2159
URL: https://issues.apache.org/jira/browse/SPARK-2159
Project: Spark
Issue Type: Bug
Components: Spark Core
Affects Versions: 1.0.0
Reporter: Andrew Or
Priority: Minor
Fix For: 1.1.0
Typing "exit()" in the Spark shell is currently equivalent to pressing Ctrl+C: the shell terminates, but the SparkContext is not stopped. Since exit() is the most common way to leave a shell, it should instead behave like Ctrl+D, which does stop the SparkContext.
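
Until this is fixed, one workaround is to stop the context explicitly before exiting. A minimal spark-shell session sketch, using the shell's pre-created SparkContext handle `sc`:

scala> sc.stop()   // shut down the SparkContext cleanly, releasing executors
scala> exit()      // now leaving the shell does not orphan the context

Pressing Ctrl+D (end-of-input) achieves the same clean shutdown without the explicit sc.stop() call.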
--
This message was sent by Atlassian JIRA
(v6.2#6252)