Posted to user@spark.apache.org by "Kun Huang (COSMOS)" <ku...@microsoft.com.INVALID> on 2020/05/26 16:20:50 UTC

How to enable hive support on an existing Spark session?

Hi Spark experts,

I am seeking an approach to enable Hive support manually on an existing Spark session.

Currently, HiveContext seems like the best fit for my scenario. However, that class has been marked as deprecated, and the recommendation is to use SparkSession.builder.enableHiveSupport() instead, which must be called before the Spark session is created.
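
For context, the recommended pattern looks roughly like this (a minimal PySpark sketch; the app name is a placeholder, and it assumes a working Spark installation):

```python
from pyspark.sql import SparkSession

# Hive support has to be requested at build time; it cannot be
# toggled on after getOrCreate() has already returned a session.
spark = (SparkSession.builder
         .appName("example")   # placeholder app name
         .enableHiveSupport()  # must be set before session creation
         .getOrCreate())
```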

I wonder if there is another workaround?

Thanks,
Kun

Re: How to enable hive support on an existing Spark session?

Posted by HARSH TAKKAR <ta...@gmail.com>.
Hi Kun,

You can use the following Spark property when launching the app
instead of enabling it manually in the code:

spark.sql.catalogImplementation=hive
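
For example, when launching with spark-submit, the property can be passed via --conf (the script name below is a placeholder):

```shell
spark-submit \
  --conf spark.sql.catalogImplementation=hive \
  my_app.py
```

The same property can also be set once in conf/spark-defaults.conf so that every submitted application picks it up.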


Kind Regards
Harsh

On Tue, May 26, 2020 at 9:55 PM Kun Huang (COSMOS)
<ku...@microsoft.com.invalid> wrote:

>
> Hi Spark experts,
>
> I am seeking an approach to enable Hive support manually on an
> existing Spark session.
>
> Currently, HiveContext seems like the best fit for my scenario. However,
> that class has been marked as deprecated, and the recommendation is to use
> SparkSession.builder.enableHiveSupport() instead, which must be called
> before the Spark session is created.
>
> I wonder if there is another workaround?
>
> Thanks,
> Kun
>