Posted to user@spark.apache.org by Ben Roling <be...@gmail.com> on 2020/03/12 21:21:19 UTC
Scala vs PySpark Inconsistency: SQLContext/SparkSession access from DataFrame/DataSet
I've noticed that Dataset.sqlContext is public in Scala, but the equivalent
in PySpark (DataFrame._sc) is named as if it should be treated as private.
Is this intentional? If so, what's the rationale? If not, then it feels
like a bug and DataFrame should have some form of public access back to the
context/session. I'm happy to log the bug but thought I would ask here
first. Thanks!
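
To illustrate the convention at issue, here is a minimal sketch in plain Python (no Spark required, and the class names are stand-ins, not real PySpark API): a leading underscore marks an attribute as private by convention, so a public property is the usual way to give callers supported access back to the session.

```python
class SparkSessionStub:
    """Hypothetical stand-in for a SparkSession, for illustration only."""


class DataFrameStub:
    """Hypothetical stand-in for a PySpark DataFrame."""

    def __init__(self, session):
        # Underscore prefix: by Python convention, callers should not
        # rely on this attribute -- it may change without notice.
        self._sc = session

    @property
    def spark_session(self):
        # A public accessor like this is the kind of supported access
        # the post suggests PySpark's DataFrame should expose.
        return self._sc


session = SparkSessionStub()
df = DataFrameStub(session)
assert df.spark_session is session  # public access, no underscore attribute
```

Nothing here is PySpark itself; it only shows why an underscore-prefixed attribute reads as "do not depend on me" to Python users, which is the inconsistency with Scala's public accessor.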