Posted to issues@spark.apache.org by "Parker Xiao (JIRA)" <ji...@apache.org> on 2017/05/14 04:07:04 UTC

[jira] [Created] (SPARK-20733) Permission Error: Access Denied

Parker Xiao  created SPARK-20733:
------------------------------------

             Summary: Permission Error: Access Denied
                 Key: SPARK-20733
                 URL: https://issues.apache.org/jira/browse/SPARK-20733
             Project: Spark
          Issue Type: Bug
          Components: PySpark
    Affects Versions: 2.1.1
         Environment: Windows 64-bit, Scala Version 2.11.8, Java 1.8.0_131, Python 3.6, Anaconda 4.3.1
            Reporter: Parker Xiao 
            Priority: Critical


I am experiencing the following issue when I try to launch pyspark.

    c:\spark>pyspark
    Python 3.6.0 |Anaconda 4.3.1 (64-bit)| (default, Dec 23 2016, 11:57:41) [MSC v.1900 64 bit (AMD64)] on win32
    Type "help", "copyright", "credits" or "license" for more information.
    Traceback (most recent call last):
      File "C:\spark\bin\..\python\pyspark\shell.py", line 38, in <module>
        SparkContext._ensure_initialized()
      File "C:\spark\python\pyspark\context.py", line 259, in _ensure_initialized
        SparkContext._gateway = gateway or launch_gateway(conf)
      File "C:\spark\python\pyspark\java_gateway.py", line 80, in launch_gateway
        proc = Popen(command, stdin=PIPE, env=env)
      File "C:\Users\shuzhe\Anaconda3\lib\subprocess.py", line 707, in __init__
        restore_signals, start_new_session)
      File "C:\Users\shuzhe\Anaconda3\lib\subprocess.py", line 990, in _execute_child
        startupinfo)
    PermissionError: [WinError 5] Access is denied

I know that problems like this are often caused by insufficient administrator permissions. However, even when I go to the folder, right-click, and choose 'Run as administrator', the problem still exists. Could anyone help me figure out what the problem is?
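For reference, the traceback shows the failure occurs when pyspark's launch_gateway spawns the Spark launcher script via subprocess.Popen (java_gateway.py line 80 above). Below is a minimal sketch to reproduce that call outside of pyspark; it assumes the default C:\spark layout and that the Windows wrapper script is bin\spark-submit2.cmd (the exact command pyspark builds differs slightly):

    import os
    import subprocess

    # Assumption: SPARK_HOME points at the Spark install (C:\spark here).
    spark_home = os.environ.get("SPARK_HOME", r"C:\spark")

    # On Windows, the bin\*.cmd wrappers chain to spark-submit2.cmd;
    # pyspark's launch_gateway ends up Popen-ing an equivalent command.
    script = os.path.join(spark_home, "bin", "spark-submit2.cmd")

    try:
        proc = subprocess.Popen([script, "--version"], stdin=subprocess.PIPE)
        proc.wait()
        print("launcher exited with", proc.returncode)
    except PermissionError as exc:
        # WinError 5 here means Windows refused to execute the script (or the
        # java.exe it invokes), e.g. due to file ACLs or antivirus policy.
        print("Access denied:", exc)

If this standalone call also raises WinError 5, the problem is with executing the Spark scripts or java.exe themselves (file permissions, antivirus blocking, or a Spark install under a protected directory) rather than with pyspark itself.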



