Posted to issues@spark.apache.org by "kevinshin (Jira)" <ji...@apache.org> on 2023/02/14 08:49:00 UTC
[jira] [Resolved] (SPARK-41727) ClassCastException when config spark.sql.hive.metastore* properties under jdk17
[ https://issues.apache.org/jira/browse/SPARK-41727?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
kevinshin resolved SPARK-41727.
-------------------------------
Resolution: Not A Bug
This is not an issue in Spark, but in Hive.
> ClassCastException when config spark.sql.hive.metastore* properties under jdk17
> -------------------------------------------------------------------------------
>
> Key: SPARK-41727
> URL: https://issues.apache.org/jira/browse/SPARK-41727
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 3.3.1
> Environment: Apache Spark 3.3.1 / HDP 3.1.5 with Hive 3.1.0
> Reporter: kevinshin
> Priority: Major
> Attachments: hms-init-error.txt
>
>
> Apache Spark 3.3.1 / HDP 3.1.5 with Hive 3.1.0
> When configuring the spark.sql.hive.metastore.* properties to use metastore version 3.1.2:
> *spark.sql.hive.metastore.jars /data/soft/spark3/standalone-metastore/**
> *spark.sql.hive.metastore.version 3.1.2*
> and then starting spark-shell with master = local[*] under JDK 17,
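The reported setup can be sketched as a single spark-shell invocation. This is a hedged sketch: the jars path and versions are taken from the report, while the JAVA_HOME location and SPARK_HOME variable are illustrative assumptions.

```shell
# Launch spark-shell against a Hive 3.1.2 standalone-metastore client
# under JDK 17. Adjust JAVA_HOME and the jars path to your install.
export JAVA_HOME=/usr/lib/jvm/java-17
"$SPARK_HOME"/bin/spark-shell \
  --master 'local[*]' \
  --conf spark.sql.hive.metastore.version=3.1.2 \
  --conf 'spark.sql.hive.metastore.jars=/data/soft/spark3/standalone-metastore/*'
```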
> selecting a Hive table fails with the following error:
> 13:44:52.428 [main] ERROR org.apache.hadoop.hive.metastore.utils.MetaStoreUtils - Got exception: java.lang.ClassCastException class [Ljava.lang.Object; cannot be cast to class [Ljava.net.URI; ([Ljava.lang.Object; and [Ljava.net.URI; are in module java.base of loader 'bootstrap')
> java.lang.ClassCastException: class [Ljava.lang.Object; cannot be cast to class [Ljava.net.URI; ([Ljava.lang.Object; and [Ljava.net.URI; are in module java.base of loader 'bootstrap')
> at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.resolveUris(HiveMetaStoreClient.java:262) ~[hive-standalone-metastore-3.1.2.jar:3.1.2]
>
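The error message matches a classic Java pitfall rather than anything Spark-specific: the no-argument `Collection.toArray()` always returns an `Object[]`, and casting that result to a typed array such as `URI[]` throws `ClassCastException` at runtime. The sketch below reproduces the same exception pattern with hypothetical names; it is an illustration of the pitfall, not Hive's actual `resolveUris` code.

```java
import java.net.URI;
import java.util.ArrayList;
import java.util.List;

public class ToArrayPitfall {
    public static void main(String[] args) {
        List<URI> uris = new ArrayList<>();
        uris.add(URI.create("thrift://metastore-host:9083"));

        // Broken: toArray() with no argument returns Object[] regardless of
        // the list's element type, so this downcast fails at runtime.
        try {
            URI[] bad = (URI[]) uris.toArray();
            System.out.println("unreachable: " + bad.length);
        } catch (ClassCastException e) {
            System.out.println("ClassCastException: " + e.getMessage());
        }

        // Fixed: pass a typed array so toArray allocates and returns URI[].
        URI[] good = uris.toArray(new URI[0]);
        System.out.println(good[0]);
    }
}
```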
--
This message was sent by Atlassian Jira
(v8.20.10#820010)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org