Posted to issues@spark.apache.org by "Masayoshi TSUZUKI (JIRA)" <ji...@apache.org> on 2014/10/06 09:36:33 UTC

[jira] [Created] (SPARK-3808) PySpark fails to start in Windows

Masayoshi TSUZUKI created SPARK-3808:
----------------------------------------

             Summary: PySpark fails to start in Windows
                 Key: SPARK-3808
                 URL: https://issues.apache.org/jira/browse/SPARK-3808
             Project: Spark
          Issue Type: Bug
          Components: PySpark, Windows
    Affects Versions: 1.1.0
         Environment: Windows
            Reporter: Masayoshi TSUZUKI
            Priority: Blocker


When we execute bin\pyspark.cmd on Windows, it fails to start.
We get the following output.
{noformat}
C:\XXXX>bin\pyspark.cmd
Running C:\XXXX\python.exe with PYTHONPATH=C:\XXXX\bin\..\python\lib\py4j-0.8.2.1-src.zip;C:\XXXX\bin\..\python;
Python 2.7.8 (default, Jun 30 2014, 16:03:49) [MSC v.1500 32 bit (Intel)] on win32
Type "help", "copyright", "credits" or "license" for more information.
="x" was unexpected at this time.
Traceback (most recent call last):
  File "C:\XXXX\bin\..\python\pyspark\shell.py", line 45, in <module>
    sc = SparkContext(appName="PySparkShell", pyFiles=add_files)
  File "C:\XXXX\python\pyspark\context.py", line 103, in __init__
    SparkContext._ensure_initialized(self, gateway=gateway)
  File "C:\XXXX\python\pyspark\context.py", line 212, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway()
  File "C:\XXXX\python\pyspark\java_gateway.py", line 71, in launch_gateway
    raise Exception(error_msg)
Exception: Launching GatewayServer failed with exit code 255!
Warning: Expected GatewayServer to output a port, but found no output.

>>>
{noformat}
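
For context, the line {{="x" was unexpected at this time.}} in the output above is a cmd.exe parse error, not a Python one. It typically appears when the left-hand side of a batch {{if ... == ...}} comparison expands to nothing, so cmd.exe sees a comparison that starts with {{==}}. The variable name below is hypothetical, chosen only to illustrate the error class; it is not taken from Spark's actual launch scripts:
{noformat}
rem Hypothetical fragment. If SOME_VAR is undefined, the next line
rem expands to:   if =="x" echo match
rem and cmd.exe reports: ="x" was unexpected at this time.
if %SOME_VAR%=="x" echo match

rem The usual defensive pattern quotes both sides, so an empty
rem expansion still leaves a syntactically valid comparison:
if "%SOME_VAR%"=="x" echo match
{noformat}
This suggests the failure happens while one of the launch scripts (bin\pyspark.cmd or the scripts it calls) evaluates an {{if}} comparison against a variable that is empty in this environment, before the Py4J GatewayServer can start.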



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
