Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2015/04/03 22:19:54 UTC

[jira] [Commented] (SPARK-6699) PySpark Access Denied error in Windows seen only in ver 1.3

    [ https://issues.apache.org/jira/browse/SPARK-6699?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14395030#comment-14395030 ] 

Sean Owen commented on SPARK-6699:
----------------------------------

Isn't this your problem?

{code}
No module named numpy
{code}

You haven't installed numpy, maybe? This isn't a Spark issue.
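
A quick way to confirm is to run the import with the same Python interpreter that pyspark picks up on PATH. This is only an illustrative check, not part of Spark:

{code}
# Run this with the python.exe that pyspark launches.
# If the import fails, installing numpy for that interpreter
# (for example with "pip install numpy") should clear the message.
try:
    import numpy
    print("numpy %s is available" % numpy.__version__)
except ImportError:
    print("numpy is not installed for this interpreter")
{code}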

> PySpark Access Denied error in Windows seen only in ver 1.3
> -----------------------------------------------------------
>
>                 Key: SPARK-6699
>                 URL: https://issues.apache.org/jira/browse/SPARK-6699
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 1.3.0
>         Environment: Windows 8.1 x64
> Windows 7 SP1 x64
>            Reporter: RoCm
>
> Downloaded version 1.3 and tried to run pyspark.
> I hit this error and am unable to proceed (versions 1.2 and 1.1 work fine).
> The error logs are pasted below:
> C:\Users\roXYZ\.babun\cygwin\home\roXYZ\spark-1.3.0-bin-hadoop2.4\bin>pyspark
> Running python with PYTHONPATH=C:\Users\roXYZ\.babun\cygwin\home\roXYZ\spark-1.3.0-bin-hadoop2.4\bin\..\python\lib\py4j-0.8.2.1-src.zip;
> C:\Users\roXYZ\.babun\cygwin\home\roXYZ\spark-1.3.0-bin-hadoop2.4\bin\..\python;
> Python 2.7.8 (default, Jun 30 2014, 16:03:49) [MSC v.1500 32 bit (Intel)] on win32
> Type "help", "copyright", "credits" or "license" for more information.
> No module named numpy
> Traceback (most recent call last):
>   File "C:\Users\roXYZ\.babun\cygwin\home\roXYZ\spark-1.3.0-bin-hadoop2.4\bin\..\python\pyspark\shell.py", line 50, in <module>
>     sc = SparkContext(appName="PySparkShell", pyFiles=add_files)
>   File "C:\Users\roXYZ\.babun\cygwin\home\roXYZ\spark-1.3.0-bin-hadoop2.4\python\pyspark\context.py", line 108, in __init__
>     SparkContext._ensure_initialized(self, gateway=gateway)
>   File "C:\Users\roXYZ\.babun\cygwin\home\roXYZ\spark-1.3.0-bin-hadoop2.4\python\pyspark\context.py", line 222, in _ensure_initialized
>     SparkContext._gateway = gateway or launch_gateway()
>   File "C:\Users\roXYZ\.babun\cygwin\home\roXYZ\spark-1.3.0-bin-hadoop2.4\python\pyspark\java_gateway.py", line 65, in launch_gateway
>     proc = Popen(command, stdin=PIPE, env=env)
>   File "C:\Python27\lib\subprocess.py", line 710, in __init__
>     errread, errwrite)
>   File "C:\Python27\lib\subprocess.py", line 958, in _execute_child
>     startupinfo)
> WindowsError: [Error 5] Access is denied
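
For reference, the quoted trace dies at the Popen call in java_gateway.py, which is where PySpark spawns the JVM gateway process. A minimal sketch of that step is below; the command list shown is hypothetical (the real command is assembled by Spark and is not visible in the log above):

{code}
# Illustrative sketch only: roughly what launch_gateway does at the failing
# line. Windows raises WindowsError [Error 5] when the process it is asked
# to start cannot be executed by the current user.
import os
from subprocess import Popen, PIPE

command = [os.path.join("bin", "spark-submit.cmd"), "pyspark-shell"]  # assumed path, for illustration
env = dict(os.environ)

try:
    proc = Popen(command, stdin=PIPE, env=env)
except OSError as e:  # WindowsError subclasses OSError
    print("failed to launch the gateway: %s" % e)
{code}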


