Posted to issues@spark.apache.org by "Pratik Malani (Jira)" <ji...@apache.org> on 2022/10/18 14:00:00 UTC

[jira] [Comment Edited] (SPARK-40736) Spark 3.3.0 doesn't works with Hive 3.1.2

    [ https://issues.apache.org/jira/browse/SPARK-40736?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17619571#comment-17619571 ] 

Pratik Malani edited comment on SPARK-40736 at 10/18/22 1:59 PM:
-----------------------------------------------------------------

Hi [~yumwang] 
The jars are present in SPARK_HOME as well as in the path set under spark.sql.hive.metastore.jars.
The same setup works well with Hive 2.3.9, and even now it works fine with the customized Hive 3.1.2 jars.
So in my opinion, SPARK_HOME is not the concern here.
The only concern is that, with the customized Hive, the Spark jobs do not complete after execution; they get stuck and stay in the Running state forever.
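For reference, below is a minimal sketch of how these properties are typically wired together in spark-defaults.conf; the /opt/hive-3.1.2/lib path is only a placeholder, not our exact value.

{noformat}
# spark-defaults.conf -- sketch only; the jar path is a placeholder
# Use a Hive 3.1.2 metastore client instead of the built-in one
spark.sql.hive.metastore.version    3.1.2
# "path" makes Spark load the metastore client jars from the location below
spark.sql.hive.metastore.jars       path
spark.sql.hive.metastore.jars.path  file:///opt/hive-3.1.2/lib/*.jar
{noformat}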


> Spark 3.3.0 doesn't works with Hive 3.1.2
> -----------------------------------------
>
>                 Key: SPARK-40736
>                 URL: https://issues.apache.org/jira/browse/SPARK-40736
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core, SQL
>    Affects Versions: 3.3.0
>            Reporter: Pratik Malani
>            Priority: Major
>              Labels: Hive, spark
>         Attachments: image-2022-10-12-18-19-24-455.png
>
>
> Hive 2.3.9 is impacted by CVE-2021-34538, so we are trying to use Hive 3.1.2.
> Using Spark 3.3.0 with Hadoop 3.3.4 and Hive 3.1.2, we get the below error when starting the Thriftserver:
>  
> {noformat}
> Exception in thread "main" java.lang.IllegalAccessError: tried to access class org.apache.hive.service.server.HiveServer2$ServerOptionsProcessor from class org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$
>         at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:92)
>         at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:498)
>         at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
>         at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:958)
>         at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
>         at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
>         at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
>         at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1046)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1055)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala){noformat}
> Using the below command to start the Thriftserver:
>  
> *spark-class org.apache.spark.deploy.SparkSubmit --class org.apache.spark.sql.hive.thriftserver.HiveThriftServer2 spark-internal*
>  
> SPARK_HOME has been set correctly.
>  
> The same works well with Hive 2.3.9, but fails when we upgrade to Hive 3.1.2.
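
For completeness, a sketch of how the Thriftserver could be started with the Hive 3.1.2 metastore client configured explicitly via --conf; the jar path is a placeholder, and this alone is not claimed to resolve the IllegalAccessError above.

{noformat}
# Sketch only -- equivalent to the spark-class invocation above,
# with the metastore client pointed at Hive 3.1.2 (placeholder path)
$SPARK_HOME/sbin/start-thriftserver.sh \
  --conf spark.sql.hive.metastore.version=3.1.2 \
  --conf spark.sql.hive.metastore.jars=path \
  --conf spark.sql.hive.metastore.jars.path=file:///opt/hive-3.1.2/lib/*.jar
{noformat}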



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
