Posted to user@spark.apache.org by takao <ta...@focaldata.com> on 2018/10/17 16:38:13 UTC

[PySpark SQL]: SparkConf does not exist in the JVM

Hi,

`pyspark.sql.SparkSession.builder.getOrCreate()` gives me an error, and I
wonder if anyone can help me with this.

The line of code that gives me an error is

```
with spark_session(master, app_name) as session:
```

where `spark_session` is a context manager:

```
import contextlib
import os

import pyspark.sql


@contextlib.contextmanager
def spark_session(master, app_name):
    # Build (or reuse) a session, forwarding the driver's PYTHONPATH to the
    # executors. Default to "" in case PYTHONPATH is unset, so the config
    # value never becomes the string "None".
    session = pyspark.sql.SparkSession.builder\
        .master(master).appName(app_name)\
        .config("spark.executorEnv.PYTHONPATH", os.getenv("PYTHONPATH", ""))\
        .getOrCreate()
    try:
        yield session
    finally:
        # Stop the session even if the body raises.
        session.stop()
```
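
For completeness, here is an illustrative call site (the master URL and app
name below are placeholder values, not my real ones):

```
# Hypothetical call site: "local[*]" and "example-app" are example values.
with spark_session("local[*]", "example-app") as session:
    # Execution never reaches this point: getOrCreate() raises inside
    # spark_session() before the session is yielded.
    session.sql("SELECT 1").show()
```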

The error message is

```
/usr/local/lib/python3.6/site-packages/pyspark/sql/session.py:170: in getOrCreate
    sparkConf = SparkConf()
/usr/local/lib/python3.6/site-packages/pyspark/conf.py:116: in __init__
    self._jconf = _jvm.SparkConf(loadDefaults)
py4j.protocol.Py4JError: SparkConf does not exist in the JVM
```

I am running a local Spark cluster; both Spark and PySpark are version 2.3.2.
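
For reference, here is a minimal sketch of how I am checking the versions on
the driver side (the note about SPARK_HOME is my guess at a possible cause,
not something I have confirmed):

```
import os
import pyspark

# Version of the pip-installed PySpark package.
print(pyspark.__version__)      # prints 2.3.2 here

# If SPARK_HOME points at a separate Spark install, PySpark can end up
# talking to a JVM from a different version, which I understand is one
# known cause of "X does not exist in the JVM" errors.
print(os.getenv("SPARK_HOME"))
```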

Thanks in advance.
Takao



