Posted to issues@spark.apache.org by "Dongjoon Hyun (Jira)" <ji...@apache.org> on 2020/10/14 03:19:00 UTC
[jira] [Updated] (SPARK-29536) PySpark does not work with Python 3.8.0
[ https://issues.apache.org/jira/browse/SPARK-29536?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dongjoon Hyun updated SPARK-29536:
----------------------------------
Affects Version/s: 2.4.7
> PySpark does not work with Python 3.8.0
> ---------------------------------------
>
> Key: SPARK-29536
> URL: https://issues.apache.org/jira/browse/SPARK-29536
> Project: Spark
> Issue Type: Test
> Components: PySpark
> Affects Versions: 2.4.7, 3.0.0
> Reporter: Hyukjin Kwon
> Assignee: Hyukjin Kwon
> Priority: Critical
> Fix For: 3.0.0
>
>
> You open a shell and run arbitrary code:
> {code}
> File "/.../3.8/lib/python3.8/runpy.py", line 183, in _run_module_as_main
> mod_name, mod_spec, code = _get_module_details(mod_name, _Error)
> File "/.../3.8/lib/python3.8/runpy.py", line 109, in _get_module_details
> __import__(pkg_name)
> File "/.../workspace/forked/spark/python/pyspark/__init__.py", line 51, in <module>
> from pyspark.context import SparkContext
> File "/.../spark/python/pyspark/context.py", line 31, in <module>
> from pyspark import accumulators
> File "/.../python/pyspark/accumulators.py", line 97, in <module>
> from pyspark.serializers import read_int, PickleSerializer
> File "/.../python/pyspark/serializers.py", line 71, in <module>
> from pyspark import cloudpickle
> File "/.../python/pyspark/cloudpickle.py", line 152, in <module>
> _cell_set_template_code = _make_cell_set_template_code()
> File "/.../spark/python/pyspark/cloudpickle.py", line 133, in _make_cell_set_template_code
> return types.CodeType(
> TypeError: an integer is required (got type bytes)
> {code}
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org