Posted to user@spark.apache.org by Jestin Ma <je...@gmail.com> on 2016/07/27 13:02:43 UTC

Spark 2.0 SparkSession, SparkConf, SparkContext

I know that SparkSession is replacing SQLContext and HiveContext, but what
about SparkConf and SparkContext? Are those still relevant in our programs?

Thank you!
Jestin

Re: Spark 2.0 SparkSession, SparkConf, SparkContext

Posted by Sun Rui <su...@163.com>.
If you want to keep using the RDD API, then you still need to create a SparkContext first.

If you want to use just the Dataset/DataFrame/SQL API, then you can directly create a SparkSession. The SparkContext is generally hidden: it is created internally and held within the SparkSession. Any time you need the SparkContext, you can get it from SparkSession.sparkContext. While a SparkConf is accepted when creating a SparkSession, the formal way to set/get configurations for a SparkSession is through SparkSession.conf.set()/get().
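A minimal Scala sketch of the above, assuming a local run (the master setting and app name are placeholders for illustration):

```scala
import org.apache.spark.sql.SparkSession

// Dataset/DataFrame/SQL path: build a SparkSession directly.
val spark = SparkSession.builder()
  .master("local[*]")        // for local testing; omit when submitting to a cluster
  .appName("SessionDemo")
  .getOrCreate()

// The SparkContext is created and held internally; fetch it when you need the RDD API.
val sc = spark.sparkContext
val rdd = sc.parallelize(1 to 10)

// Configuration for the session goes through spark.conf, not a SparkConf object.
spark.conf.set("spark.sql.shuffle.partitions", "8")
println(spark.conf.get("spark.sql.shuffle.partitions"))
```

Passing a SparkConf to the builder still works, but spark.conf is the session-scoped interface for runtime settings.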
> On Jul 27, 2016, at 21:02, Jestin Ma <je...@gmail.com> wrote:
> 
> I know that SparkSession is replacing SQLContext and HiveContext, but what about SparkConf and SparkContext? Are those still relevant in our programs?
> 
> Thank you!
> Jestin



---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org

