Posted to dev@spark.apache.org by yogita bhardwaj <yo...@iktara.ai> on 2022/09/20 21:01:42 UTC

Pyspark SparkContext issue

I am getting a py4j.protocol.Py4JJavaError while running the code below with SparkContext. Can you please help me resolve this issue?
from pyspark import SparkContext

sc = SparkContext()
a = sc.parallelize([1, 2, 3, 4])
print(f"a_take:{a.take(2)}")


py4j.protocol.Py4JJavaError: An error occurred while calling z:org.apache.spark.api.python.PythonRDD.runJob.

