Posted to dev@spark.apache.org by Chetan Khatri <ch...@gmail.com> on 2017/02/15 06:44:24 UTC

Update Public Documentation - SparkSession instead of SparkContext

Hello Spark Dev Team,

While working with my team, we were confused about why the public
documentation has not been updated to use SparkSession, given that
SparkSession is the current recommended entry point and best practice
rather than creating a SparkContext directly.
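For reference, a minimal sketch of the pattern I mean (my own example, assuming
Spark 2.x; the app name and local master are placeholders, not taken from the docs):

    import org.apache.spark.sql.SparkSession

    object SparkSessionExample {
      def main(args: Array[String]): Unit = {
        // Spark 2.x entry point: SparkSession subsumes SQLContext/HiveContext
        // and wraps a SparkContext internally.
        val spark = SparkSession.builder()
          .appName("SparkSessionExample")
          .master("local[*]")   // for local testing; normally set via spark-submit
          .getOrCreate()

        // DataFrame work goes through the session directly.
        val df = spark.range(10).toDF("id")
        df.show()

        spark.stop()
      }
    }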

Thanks.

Re: Update Public Documentation - SparkSession instead of SparkContext

Posted by Reynold Xin <rx...@databricks.com>.
There is an existing pull request to update it:
https://github.com/apache/spark/pull/16856

But it is a little bit tricky.



On Wed, Feb 15, 2017 at 7:44 AM, Chetan Khatri <ch...@gmail.com>
wrote:

> Hello Spark Dev Team,
>
> While working with my team, we were confused about why the public
> documentation has not been updated to use SparkSession, given that
> SparkSession is the current recommended entry point and best practice
> rather than creating a SparkContext directly.
>
> Thanks.
>

Re: Update Public Documentation - SparkSession instead of SparkContext

Posted by Chetan Khatri <ch...@gmail.com>.
Sorry, the context I am referring to is the URL below:
http://spark.apache.org/docs/2.0.1/programming-guide.html
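
To illustrate the gap, a rough sketch of how that guide's RDD examples could be
written against a SparkSession (my own example, assuming Spark 2.x; names are
placeholders):

    import org.apache.spark.sql.SparkSession

    // The 2.0.1 programming guide still builds a SparkContext directly
    // ("new SparkContext(conf)"); with a SparkSession the same context is
    // available without constructing one yourself.
    val spark = SparkSession.builder()
      .appName("RddFromSession")
      .master("local[*]")   // local testing only
      .getOrCreate()

    val rdd = spark.sparkContext.parallelize(Seq(1, 2, 3)).map(_ * 2)
    println(rdd.collect().mkString(", "))

    spark.stop()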



On Wed, Feb 15, 2017 at 1:12 PM, Sean Owen <so...@cloudera.com> wrote:

> When asking a question like this, please actually link to what you are
> referring to. Some of it is intentional.
>
>
> On Wed, Feb 15, 2017, 06:44 Chetan Khatri <ch...@gmail.com>
> wrote:
>
>> Hello Spark Dev Team,
>>
>> While working with my team, we were confused about why the public
>> documentation has not been updated to use SparkSession, given that
>> SparkSession is the current recommended entry point and best practice
>> rather than creating a SparkContext directly.
>>
>> Thanks.
>>
>