Posted to issues@spark.apache.org by "Masayoshi TSUZUKI (JIRA)" <ji...@apache.org> on 2016/03/01 09:28:18 UTC

[jira] [Created] (SPARK-13592) pyspark failed to launch on Windows client

Masayoshi TSUZUKI created SPARK-13592:
-----------------------------------------

             Summary: pyspark failed to launch on Windows client
                 Key: SPARK-13592
                 URL: https://issues.apache.org/jira/browse/SPARK-13592
             Project: Spark
          Issue Type: Bug
          Components: PySpark, Windows
    Affects Versions: 1.6.0
            Reporter: Masayoshi TSUZUKI


When I executed pyspark on Windows, it failed to launch with the following error:

{quote}
> bin\pyspark
"C:\Users\tsudukim\Documents\workspace\spark-dev3\bin\"
Python 2.7.8 (default, Jun 30 2014, 16:03:49) [MSC v.1500 32 bit (Intel)] on win32
Type "help", "copyright", "credits" or "license" for more information.
'spark-submit2.cmd' is not recognized as an internal or external command,
operable program or batch file.
Traceback (most recent call last):
  File "C:\Users\tsudukim\Documents\workspace\spark-dev3\bin\..\python\pyspark\shell.py", line 38, in <module>
    sc = SparkContext()
  File "C:\Users\tsudukim\Documents\workspace\spark-dev3\python\pyspark\context.py", line 112, in __init__
    SparkContext._ensure_initialized(self, gateway=gateway)
  File "C:\Users\tsudukim\Documents\workspace\spark-dev3\python\pyspark\context.py", line 245, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway()
  File "C:\Users\tsudukim\Documents\workspace\spark-dev3\python\pyspark\java_gateway.py", line 94, in launch_gateway
    raise Exception("Java gateway process exited before sending the driver its port number")
Exception: Java gateway process exited before sending the driver its port number
>>>
{quote}
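The traceback shows the failure path: {{launch_gateway}} spawns the JVM launcher (via {{spark-submit2.cmd}}, which the shell cannot find here) and then waits for the child to report its port; when the child exits first, the "Java gateway process exited before sending the driver its port number" exception is raised. A simplified sketch of that failure mode (illustrative only, not the actual {{python/pyspark/java_gateway.py}} code; the function name and the {{CONN_INFO}} env var are hypothetical):

{code}
import os
import subprocess
import tempfile
import time


def launch_gateway_sketch(cmd, timeout=10.0):
    """Sketch: spawn the gateway launcher and wait for it to write its
    port to a temp file; raise if the child exits before doing so."""
    fd, conn_info = tempfile.mkstemp()
    os.close(fd)
    os.unlink(conn_info)  # the child is expected to (re)create this file
    env = dict(os.environ)
    env["CONN_INFO"] = conn_info  # hypothetical handshake mechanism
    proc = subprocess.Popen(cmd, env=env)
    deadline = time.time() + timeout
    while not os.path.isfile(conn_info):
        if proc.poll() is not None:
            # Child died without reporting a port -- the error seen above.
            raise Exception("Java gateway process exited before sending "
                            "the driver its port number")
        if time.time() > deadline:
            proc.kill()
            raise Exception("timed out waiting for gateway port")
        time.sleep(0.1)
    with open(conn_info) as f:
        return int(f.read())
{code}

Under this model, any problem that makes the launcher script exit immediately (such as {{spark-submit2.cmd}} not being resolvable from {{bin\pyspark2.cmd}}) surfaces as this generic gateway exception rather than the underlying script error.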



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org