Posted to user@spark.apache.org by An Qin <aq...@qilinsoft.com> on 2017/12/11 06:43:56 UTC

Why Spark 2.2.1 still bundles old Hive jars?

Hi, all,

I want to include Sentry 2.0.0 in my Spark project, but it bundles
Hive 2.3.2. I see that even the newest Spark, 2.2.1, still ships old Hive
jars, for example hive-exec-1.2.1.spark2.jar. Why hasn't Spark upgraded to
the newer Hive? Are the two compatible?
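
(For others who land on this thread: the sketch below only illustrates the
documented spark.sql.hive.metastore.version / spark.sql.hive.metastore.jars
options, which swap the jars Spark uses to talk to the Hive metastore. Spark
still executes queries with its bundled hive-exec 1.2.1 fork, the metastore
versions a given release accepts are release-specific, and the version string
and paths below are placeholders, not a confirmed fix for the Sentry 2.0.0
case.)

import org.apache.spark.sql.SparkSession

// Sketch only: point the Hive *metastore client* at separately installed Hive
// jars. This does not replace the bundled hive-exec fork used for execution.
// The version string and classpath are placeholders; check which metastore
// versions your Spark release supports before relying on this.
val spark = SparkSession.builder()
  .appName("hive-metastore-client-sketch")
  .enableHiveSupport()
  .config("spark.sql.hive.metastore.version", "2.1.1")      // placeholder version
  .config("spark.sql.hive.metastore.jars",
    "/opt/hive/lib/*:/opt/hadoop/share/hadoop/common/*")    // or "builtin" / "maven"
  .getOrCreate()

// Quick sanity check that the session can reach the metastore.
spark.sql("SHOW DATABASES").show()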

 

Regards,

Qin An.


Re: Why Spark 2.2.1 still bundles old Hive jars?

Posted by Jacek Laskowski <ja...@japila.pl>.
Hi,

https://issues.apache.org/jira/browse/SPARK-19076

Regards,
Jacek Laskowski
----
https://about.me/JacekLaskowski
Spark Structured Streaming https://bit.ly/spark-structured-streaming
Mastering Apache Spark 2 https://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski
