Posted to user@spark.apache.org by "Pagliari, Roberto" <rp...@appcomsci.com> on 2014/11/06 01:21:45 UTC
SparkContext._lock Error
I'm using this system:
Hadoop 1.0.4
Scala 2.9.3
Hive 0.9.0
Spark 1.1.0
When importing pyspark, I get this error:
>>> from pyspark.sql import *
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
  File "/<path>/spark-1.1.0/python/pyspark/__init__.py", line 63, in ?
    from pyspark.context import SparkContext
  File "/<path>/spark-1.1.0/python/pyspark/context.py", line 209
    with SparkContext._lock:
       ^
SyntaxError: invalid syntax
How do I fix it?
Thank you,
Re: SparkContext._lock Error
Posted by Davies Liu <da...@databricks.com>.
PySpark requires Python 2.6 or 2.7. The traceback points at a `with` statement, which is a SyntaxError on Python 2.4 and earlier (Python 2.5 only accepts it with `from __future__ import with_statement`), so the interpreter on your cluster is almost certainly older than 2.6.
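A quick sanity check you could run on the cluster before importing pyspark. This is a minimal sketch, not part of PySpark itself; the helper name is made up here, and the version bound reflects the 2.6/2.7 requirement stated above:

```python
import sys

def check_python_for_pyspark(version_info=None):
    """Raise if the interpreter is too old for PySpark 1.1.0.

    pyspark/context.py uses the `with` statement, which is invalid
    syntax on Python 2.4 (and needs a __future__ import on 2.5),
    hence the lower bound of 2.6.
    """
    vi = version_info if version_info is not None else sys.version_info
    if tuple(vi[:2]) < (2, 6):
        raise RuntimeError(
            "PySpark requires Python 2.6 or 2.7, found %d.%d" % tuple(vi[:2])
        )

check_python_for_pyspark()
```

Note this only checks the lower bound; it does not guard against interpreters newer than what Spark 1.1.0 was tested with.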
On Wed, Nov 5, 2014 at 5:32 PM, Pagliari, Roberto
<rp...@appcomsci.com> wrote:
> I'm not on the cluster now so I cannot check. What is the minimum requirement for Python?
---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org
RE: SparkContext._lock Error
Posted by "Pagliari, Roberto" <rp...@appcomsci.com>.
I'm not on the cluster now so I cannot check. What is the minimum requirement for Python?
Thanks,
________________________________________
From: Davies Liu [davies@databricks.com]
Sent: Wednesday, November 05, 2014 7:41 PM
To: Pagliari, Roberto
Cc: user@spark.apache.org
Subject: Re: SparkContext._lock Error
What's the version of Python? 2.4?
Davies
Re: SparkContext._lock Error
Posted by Davies Liu <da...@databricks.com>.
What's the version of Python? 2.4?
Davies