Posted to user@spark.apache.org by "☼ R Nair (रविशंकर नायर)" <ra...@gmail.com> on 2017/02/03 18:48:25 UTC
sqlContext vs spark.
All,
In Spark 1.6.0, we used
val jdbcDF = sqlContext.read.format(-----)
to create a DataFrame through JDBC.
In Spark 2.1.x, we have seen this has become
val jdbcDF = *spark*.read.format(-----)
Does that mean we should not be using sqlContext going forward? Also, we
see that sqlContext is no longer auto-initialized when running spark-shell.
Please advise, thanks
Best, Ravion
Re: sqlContext vs spark.
Posted by Jacek Laskowski <ja...@japila.pl>.
Hi,
Yes. Forget about SQLContext. It was merged into SparkSession as of
Spark 2.0 (the same goes for HiveContext).
Long live SparkSession! :-)
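A minimal sketch of the 2.x style, assuming a running Spark 2.x environment and a reachable JDBC database; the URL, table name, and credentials below are placeholders, not values from the original question:

```scala
import org.apache.spark.sql.SparkSession

// SparkSession is the single entry point in Spark 2.x; it subsumes both
// SQLContext and HiveContext. In spark-shell it is pre-created as `spark`.
val spark = SparkSession.builder()
  .appName("jdbc-example")
  .enableHiveSupport() // optional: replaces the old HiveContext
  .getOrCreate()

// Spark 2.x equivalent of the 1.6 sqlContext.read pattern
val jdbcDF = spark.read
  .format("jdbc")
  .option("url", "jdbc:postgresql://localhost:5432/mydb") // placeholder URL
  .option("dbtable", "my_table")                          // placeholder table
  .option("user", "user")                                 // placeholder creds
  .option("password", "password")
  .load()

// Legacy code can still reach a SQLContext through the session:
val sqlContext = spark.sqlContext
```

Note that `spark.sqlContext` exists only for backward compatibility; new code should call the read/write and SQL APIs directly on the SparkSession.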
Jacek