Posted to issues@spark.apache.org by "Diana Carroll (JIRA)" <ji...@apache.org> on 2014/06/03 21:09:02 UTC

[jira] [Created] (SPARK-2003) SparkContext(SparkConf) doesn't work in pyspark

Diana Carroll created SPARK-2003:
------------------------------------

             Summary: SparkContext(SparkConf) doesn't work in pyspark
                 Key: SPARK-2003
                 URL: https://issues.apache.org/jira/browse/SPARK-2003
             Project: Spark
          Issue Type: Bug
          Components: Documentation, PySpark
    Affects Versions: 1.0.0
            Reporter: Diana Carroll


Using SparkConf with SparkContext as described in the Programming Guide does NOT work in Python:

conf = SparkConf().setAppName("blah")
sc = SparkContext(conf)

When I tried it, I got:

AttributeError: 'SparkConf' object has no attribute '_get_object_id'

The equivalent code in Scala works fine:

val conf = new SparkConf().setAppName("blah")
val sc = new SparkContext(conf)

I think this is because there's no Python equivalent of the Scala constructor SparkContext(SparkConf): in PySpark, the first positional parameter of SparkContext is the master URL, so a SparkConf passed positionally is bound to master instead of to the conf parameter.
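For illustration, here is a minimal runnable sketch of the mix-up. FakeSparkContext is a hypothetical stand-in; the real PySpark constructor takes more parameters, but master does come first and conf comes later:

class FakeSparkContext(object):
    # Stand-in for pyspark.SparkContext: master is the first
    # positional parameter, conf sits further down the list.
    def __init__(self, master=None, appName=None, conf=None):
        self.master = master
        self.conf = conf

cfg = object()               # stand-in for a SparkConf instance
ctx = FakeSparkContext(cfg)  # positional call: cfg is bound to master
assert ctx.master is cfg and ctx.conf is None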

Workaround:
If I explicitly set the conf parameter in the Python call, it does work:

sconf = SparkConf().setAppName("blah")
sc = SparkContext(conf=sconf)
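For completeness, a standalone version of the workaround (a sketch assuming pyspark is importable; note the parentheses on SparkConf() and the conf= keyword):

from pyspark import SparkConf, SparkContext

# setAppName returns the SparkConf itself, so calls can be chained
sconf = SparkConf().setAppName("blah")

# conf must be passed by keyword; the first positional parameter
# of SparkContext is the master URL, not the configuration
sc = SparkContext(conf=sconf)

sc.stop()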



--
This message was sent by Atlassian JIRA
(v6.2#6252)