Posted to issues@spark.apache.org by "Yasir Arfat (JIRA)" <ji...@apache.org> on 2016/06/22 23:10:16 UTC
[jira] [Commented] (SPARK-11744) bin/pyspark --version doesn't return version and exit
[ https://issues.apache.org/jira/browse/SPARK-11744?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15345354#comment-15345354 ]
Yasir Arfat commented on SPARK-11744:
-------------------------------------
I have a problem running Python code with PySpark. When I run the following code, it gives me the error shown below:
>>> text_file=sc.textFile("/Users/spark/Python/word")
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
NameError: name 'sc' is not defined
Can anybody help me?
> bin/pyspark --version doesn't return version and exit
> -----------------------------------------------------
>
> Key: SPARK-11744
> URL: https://issues.apache.org/jira/browse/SPARK-11744
> Project: Spark
> Issue Type: Bug
> Components: PySpark
> Affects Versions: 1.5.2
> Reporter: Nicholas Chammas
> Assignee: Saisai Shao
> Priority: Minor
> Fix For: 1.6.0
>
>
> {{bin/pyspark \-\-help}} offers a {{\-\-version}} option:
> {code}
> $ ./spark/bin/pyspark --help
> Usage: ./bin/pyspark [options]
> Options:
> ...
> --version, Print the version of current Spark
> ...
> {code}
> However, trying to get the version in this way doesn't yield the expected results.
> Instead of printing the version and exiting, we get the version, a stack trace, and then get dropped into a broken PySpark shell.
> {code}
> $ ./spark/bin/pyspark --version
> Python 2.7.10 (default, Aug 11 2015, 23:39:10)
> [GCC 4.8.3 20140911 (Red Hat 4.8.3-9)] on linux2
> Type "help", "copyright", "credits" or "license" for more information.
> Welcome to
>       ____              __
>      / __/__  ___ _____/ /__
>     _\ \/ _ \/ _ `/ __/  '_/
>    /___/ .__/\_,_/_/ /_/\_\   version 1.5.2
>       /_/
>
> Type --help for more information.
> Traceback (most recent call last):
>   File "/home/ec2-user/spark/python/pyspark/shell.py", line 43, in <module>
>     sc = SparkContext(pyFiles=add_files)
>   File "/home/ec2-user/spark/python/pyspark/context.py", line 110, in __init__
>     SparkContext._ensure_initialized(self, gateway=gateway)
>   File "/home/ec2-user/spark/python/pyspark/context.py", line 234, in _ensure_initialized
>     SparkContext._gateway = gateway or launch_gateway()
>   File "/home/ec2-user/spark/python/pyspark/java_gateway.py", line 94, in launch_gateway
>     raise Exception("Java gateway process exited before sending the driver its port number")
> Exception: Java gateway process exited before sending the driver its port number
> >>>
> >>> sc
> Traceback (most recent call last):
>   File "<stdin>", line 1, in <module>
> NameError: name 'sc' is not defined
> {code}
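The expected behavior described above can be sketched as a simple argument check performed before the launcher ever starts the JVM gateway or the interactive shell. This is only a minimal illustration of the desired control flow, not Spark's actual launcher code; the names `SPARK_VERSION` and `handle_args` are hypothetical.

```python
import sys

# Hypothetical constant for illustration; in a real launcher the
# version string would come from the Spark build, not Python.
SPARK_VERSION = "1.5.2"

def handle_args(argv):
    """Sketch of the fixed behavior: if --version is requested,
    print the version and signal a clean exit BEFORE attempting
    to launch the JVM gateway or drop into the REPL."""
    if "--version" in argv:
        print("Spark version " + SPARK_VERSION)
        return 0  # clean exit code; no shell is started
    # ... otherwise the launcher would proceed to launch_gateway()
    # and start the interactive PySpark shell ...
    return None

if __name__ == "__main__":
    status = handle_args(sys.argv[1:])
    if status is not None:
        sys.exit(status)
```

The buggy behavior shown in the traceback above corresponds to skipping this check entirely: the shell startup runs unconditionally, the JVM side prints the version and exits, and `launch_gateway()` then fails because the gateway process is already gone.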
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org