Posted to user@spark.apache.org by Sergey Zhemzhitsky <sz...@gmail.com> on 2017/06/27 12:47:11 UTC
What is the purpose of having RDD.context and RDD.sparkContext at the same time?
Hello spark gurus,
Could you please shed some light on why RDD has two identical functions,
RDD.context [1] and RDD.sparkContext [2]?
RDD.context seems to be used more frequently across the source code.
[1]
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/rdd/RDD.scala#L1693
[2]
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/rdd/RDD.scala#L146
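For illustration, here is a minimal Scala sketch (not Spark's actual source; the
class and field names are made up) of the pattern in question: two public methods
that both simply return the same private reference, so one is effectively an alias
of the other.

```scala
// Hypothetical stand-in for SparkContext, just to make the sketch self-contained.
class Context(val name: String)

// Hypothetical stand-in for RDD: both accessors expose the same private field,
// analogous to how RDD.context and RDD.sparkContext both return the RDD's context.
class MyRDD(private val sc: Context) {
  def sparkContext: Context = sc
  def context: Context = sc // alias of sparkContext
}

val rdd = new MyRDD(new Context("local"))
// Both methods return the very same instance.
assert(rdd.context eq rdd.sparkContext)
```

If that is indeed all the two methods do, keeping both would only be a matter of
backward compatibility for callers of the older name.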
Kind Regards,
Sergey