Posted to issues@spark.apache.org by "jung bak (Jira)" <ji...@apache.org> on 2020/07/23 04:50:00 UTC

[jira] [Commented] (SPARK-32152) ./bin/spark-sql got error with reading hive metastore

    [ https://issues.apache.org/jira/browse/SPARK-32152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17163214#comment-17163214 ] 

jung bak commented on SPARK-32152:
----------------------------------

I found the hint in the manual below.

[https://spark.apache.org/docs/latest/sql-data-sources-hive-tables.html#interacting-with-different-versions-of-hive-metastore]
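
For reference, the key point in that section of the docs is that spark.sql.hive.metastore.jars accepts "builtin", "maven", or a classpath in the standard JVM format, and that classpath must include all of Hive's dependencies (including the correct version of Hadoop). So a directory of jars has to be referenced with a wildcard rather than a bare directory path. A rough sketch of what that looks like for my layout (the exact path is only an example, not a confirmed fix):

spark.sql.hive.metastore.version    2.1.1
spark.sql.hive.metastore.jars       ${SPARK_HOME}/lib/*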

You can close this issue.

> ./bin/spark-sql got error with reading hive metastore
> -----------------------------------------------------
>
>                 Key: SPARK-32152
>                 URL: https://issues.apache.org/jira/browse/SPARK-32152
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.0
>         Environment: Spark 3.0.0
> Hive 2.1.1
>            Reporter: jung bak
>            Priority: Major
>
> 1. First of all, I built Spark 3.0.0 from source with the command below.
> {quote}{{./build/mvn -Pyarn -Phive -Phive-thriftserver -DskipTests clean package}}
> {quote}
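> A quick way to double-check which Hive version the build bundled (a hypothetical check, not one of my original steps; the jars directory location depends on how the build output or distribution is laid out):
> {quote}{{ls ${SPARK_HOME}/jars | grep hive-exec}}
> {quote}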
> 2. I set ${SPARK_HOME}/conf/spark-defaults.conf as below.
> {quote}spark.sql.hive.metastore.version    2.1.1
> spark.sql.hive.metastore.jars    {color:#ff0000}maven{color}
> {quote}
> 3. There was no problem running "${SPARK_HOME}/bin/spark-sql".
> 4. For the production environment, I copied all the jar files downloaded from Maven to ${SPARK_HOME}/lib/
> 5. I changed ${SPARK_HOME}/conf/spark-defaults.conf as below.
> {quote}spark.sql.hive.metastore.jars   {color:#ff0000}${SPARK_HOME}/lib/{color}
> {quote}
> 6. Then I got the error below when running ./bin/spark-sql.
> {quote}Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hive/ql/metadata/HiveException
> {quote}
> I found out that the HiveException class is in hive-exec-XXX.jar.
> Spark 3.0.0 is built with Hive 2.3.7 by default, and I could find "hive-exec-2.3.7-core.jar" after the build finished. I could also find hive-exec-2.1.1.jar downloaded from Maven when I used "spark.sql.hive.metastore.jars maven" in spark-defaults.conf.
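> One way to confirm where that class lives (a hypothetical check, not something from the original steps):
> {quote}{{jar tf hive-exec-2.1.1.jar | grep HiveException}}
> {quote}
> which should list org/apache/hadoop/hive/ql/metadata/HiveException.class.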
>  
> I think there is some conflict between Hive 2.1.1 and Hive 2.3.7 when I set {color:#7a869a}spark.sql.hive.metastore.jars   ${SPARK_HOME}/lib/{color}.
>  


