Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2014/11/24 09:29:12 UTC
[jira] [Commented] (SPARK-4475) PySpark failed to initialize if localhost can not be resolved
[ https://issues.apache.org/jira/browse/SPARK-4475?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14222779#comment-14222779 ]
Apache Spark commented on SPARK-4475:
-------------------------------------
User 'lvsoft' has created a pull request for this issue:
https://github.com/apache/spark/pull/3425
> PySpark failed to initialize if localhost can not be resolved
> -------------------------------------------------------------
>
> Key: SPARK-4475
> URL: https://issues.apache.org/jira/browse/SPARK-4475
> Project: Spark
> Issue Type: Bug
> Components: PySpark
> Affects Versions: 1.0.2, 1.1.0, 1.2.0
> Reporter: Davies Liu
>
> {code}
> Traceback (most recent call last):
>   File "/home/hduser/Downloads/spark-1.1.0/python/pyspark/shell.py", line 44, in <module>
>     sc = SparkContext(appName="PySparkShell", pyFiles=add_files)
>   File "/home/hduser/Downloads/spark-1.1.0/python/pyspark/context.py", line 107, in __init__
>     conf)
>   File "/home/hduser/Downloads/spark-1.1.0/python/pyspark/context.py", line 159, in _do_init
>     self._accumulatorServer = accumulators._start_update_server()
>   File "/home/hduser/Downloads/spark-1.1.0/python/pyspark/accumulators.py", line 251, in _start_update_server
>     server = AccumulatorServer(("localhost", 0), _UpdateRequestHandler)
>   File "/usr/lib/python2.7/SocketServer.py", line 408, in __init__
>     self.server_bind()
>   File "/usr/lib/python2.7/SocketServer.py", line 419, in server_bind
>     self.socket.bind(self.server_address)
>   File "/usr/lib/python2.7/socket.py", line 224, in meth
>     return getattr(self._sock,name)(*args)
> socket.gaierror: [Errno -5] No address associated with hostname
> {code}
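The traceback above dies inside AccumulatorServer's bind to ("localhost", 0): passing the name "localhost" forces a hostname lookup (via /etc/hosts or DNS), and bind() raises socket.gaierror when the name cannot be resolved. A minimal sketch of the failure mode and a workaround, binding to the loopback IP directly so no resolution is needed (the helper name start_loopback_server is illustrative, not part of PySpark):

```python
import socket

def start_loopback_server(host="127.0.0.1"):
    # Binding to a numeric loopback address skips hostname resolution
    # entirely, so this works even on hosts where "localhost" is missing
    # from /etc/hosts. Port 0 asks the kernel for any free port, which is
    # what the accumulator server does as well.
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind((host, 0))
    server.listen(1)
    return server

srv = start_loopback_server()
addr, port = srv.getsockname()
srv.close()
```

Binding to "localhost" instead of "127.0.0.1" would reproduce the reported gaierror on a machine with a broken resolver; whether the linked pull request takes exactly this approach is not stated in this comment.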
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org