Posted to issues@spark.apache.org by "Cheolsoo Park (JIRA)" <ji...@apache.org> on 2015/04/20 09:01:15 UTC
[jira] [Created] (SPARK-7006) Inconsistent behavior for ctrl-c in Spark shells
Cheolsoo Park created SPARK-7006:
------------------------------------
Summary: Inconsistent behavior for ctrl-c in Spark shells
Key: SPARK-7006
URL: https://issues.apache.org/jira/browse/SPARK-7006
Project: Spark
Issue Type: Wish
Components: Spark Shell, YARN
Affects Versions: 1.3.1
Environment: YARN
Reporter: Cheolsoo Park
Priority: Minor
When ctrl-c is pressed in a shell, the behavior is not consistent across spark-sql, spark-shell, and pyspark, which confuses users. Here is a summary-
||shell||after ctrl-c||
|spark-sql|cancels the running job|
|spark-shell|exits the shell|
|pyspark|throws an error \[1\] and doesn't cancel the job|
pyspark is particularly bad because it gives the wrong impression that the job is cancelled when it is not.
Ideally, every shell should act like {{spark-sql}} because it allows users to cancel the running job while staying in shell. (Pressing ctrl-c twice exits the shell.)
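A minimal sketch of the requested spark-sql-like behavior: install a SIGINT handler that cancels the running job instead of letting KeyboardInterrupt propagate out of the blocking py4j socket read. Here {{cancel_running_jobs()}} is a hypothetical stand-in for a call like {{sc.cancelAllJobs()}} on the live SparkContext, and the "press ctrl-c twice to exit" behavior is omitted for brevity.

```python
import signal

# Hypothetical stand-in for sc.cancelAllJobs(); in a real shell this
# would call into the live SparkContext.
cancelled_jobs = []

def cancel_running_jobs():
    cancelled_jobs.append("all")

def handle_sigint(signum, frame):
    # Cancel the running job but do not raise KeyboardInterrupt,
    # so the user stays in the shell (spark-sql style).
    cancel_running_jobs()
    print("Job cancelled.")

signal.signal(signal.SIGINT, handle_sigint)

# Simulate the user pressing ctrl-c once: the job is cancelled and
# the interpreter keeps running instead of unwinding with a traceback.
signal.raise_signal(signal.SIGINT)
```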
\[1\] pyspark error for ctrl-c
{code}
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/cheolsoop/spark/jars/spark-1.3.1/python/pyspark/sql/dataframe.py", line 284, in count
return self._jdf.count()
File "/home/cheolsoop/spark/jars/spark-1.3.1/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", line 536, in __call__
File "/home/cheolsoop/spark/jars/spark-1.3.1/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", line 364, in send_command
File "/home/cheolsoop/spark/jars/spark-1.3.1/python/lib/py4j-0.8.2.1-src.zip/py4j/java_gateway.py", line 473, in send_command
File "/usr/lib/python2.7/socket.py", line 430, in readline
data = recv(1)
KeyboardInterrupt
{code}
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)