Posted to issues@spark.apache.org by "Yuming Wang (JIRA)" <ji...@apache.org> on 2019/04/05 09:36:00 UTC
[jira] [Commented] (SPARK-27176) Upgrade hadoop-3's built-in Hive maven dependencies to 2.3.4
[ https://issues.apache.org/jira/browse/SPARK-27176?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16810652#comment-16810652 ]
Yuming Wang commented on SPARK-27176:
-------------------------------------
For Hive 2.3.4, we also need {{hive-llap-common}} and {{hive-llap-client}}:
{{hive-llap-common}} is required for function registration; without it, writing a table fails:
{noformat}
scala> spark.range(10).write.saveAsTable("test_hadoop3")
java.lang.NoClassDefFoundError: org/apache/hadoop/hive/llap/security/LlapSigner$Signable
at java.lang.Class.getDeclaredConstructors0(Native Method)
at java.lang.Class.privateGetDeclaredConstructors(Class.java:2671)
at java.lang.Class.getConstructor0(Class.java:3075)
at java.lang.Class.getDeclaredConstructor(Class.java:2178)
at org.apache.hive.common.util.ReflectionUtil.newInstance(ReflectionUtil.java:79)
at org.apache.hadoop.hive.ql.exec.Registry.registerGenericUDTF(Registry.java:208)
at org.apache.hadoop.hive.ql.exec.Registry.registerGenericUDTF(Registry.java:201)
at org.apache.hadoop.hive.ql.exec.FunctionRegistry.<clinit>(FunctionRegistry.java:500)
at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:247)
at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:231)
at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:388)
at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:332)
at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:312)
at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:288)
at org.apache.spark.sql.hive.client.HiveClientImpl.client(HiveClientImpl.scala:250)
at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$withHiveState$1(HiveClientImpl.scala:272)
...
{noformat}
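A minimal sketch of the corresponding pom.xml addition, assuming the dependency goes into Spark's hadoop-3 profile (the exact module and profile placement are not specified here; the coordinates follow Hive's published 2.3.4 artifacts):
{code:xml}
<!-- Sketch only: pulls in org.apache.hadoop.hive.llap.security.LlapSigner
     and the other LLAP classes that FunctionRegistry needs at class-init time. -->
<dependency>
  <groupId>org.apache.hive</groupId>
  <artifactId>hive-llap-common</artifactId>
  <version>2.3.4</version>
</dependency>
{code}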
{{hive-llap-client}} is required when running Hive queries in tests:
{noformat}
spark.sharedState.externalCatalog.unwrapped.asInstanceOf[HiveExternalCatalog]
.client.runSqlHive("SELECT COUNT(*) FROM test_hadoop3")
...
java.lang.NoClassDefFoundError: org/apache/hadoop/hive/llap/io/api/LlapProxy
at org.apache.hadoop.hive.ql.exec.GlobalWorkMapFactory.get(GlobalWorkMapFactory.java:102)
at org.apache.hadoop.hive.ql.exec.Utilities.clearWorkMapForConf(Utilities.java:3435)
at org.apache.hadoop.hive.ql.exec.Utilities.clearWork(Utilities.java:290)
at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:443)
at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:151)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:199)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2183)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1839)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1526)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1237)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1227)
at org.apache.spark.sql.hive.client.HiveClientImpl.$anonfun$runHive$1(HiveClientImpl.scala:733)
...
{noformat}
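The matching sketch for the second dependency, under the same assumptions as above (coordinates from Hive's published 2.3.4 artifacts; module placement not specified here):
{code:xml}
<!-- Sketch only: provides org.apache.hadoop.hive.llap.io.api.LlapProxy,
     which GlobalWorkMapFactory references on the MR execution path. -->
<dependency>
  <groupId>org.apache.hive</groupId>
  <artifactId>hive-llap-client</artifactId>
  <version>2.3.4</version>
</dependency>
{code}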
We can exclude {{org.apache.curator:curator-framework:jar}} and {{org.apache.curator:apache-curator.jar}}, as they are only used to add consistent node replacement to LLAP for splits; see HIVE-14589.
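A hedged sketch of how those exclusions might look in pom.xml (shown on {{hive-llap-client}} for illustration; whichever Hive artifact actually transitively pulls in Curator would carry them):
{code:xml}
<dependency>
  <groupId>org.apache.hive</groupId>
  <artifactId>hive-llap-client</artifactId>
  <version>2.3.4</version>
  <exclusions>
    <!-- Curator is only needed for LLAP's consistent node replacement
         for splits (HIVE-14589), which Spark does not use. -->
    <exclusion>
      <groupId>org.apache.curator</groupId>
      <artifactId>curator-framework</artifactId>
    </exclusion>
    <exclusion>
      <groupId>org.apache.curator</groupId>
      <artifactId>apache-curator</artifactId>
    </exclusion>
  </exclusions>
</dependency>
{code}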
> Upgrade hadoop-3's built-in Hive maven dependencies to 2.3.4
> ------------------------------------------------------------
>
> Key: SPARK-27176
> URL: https://issues.apache.org/jira/browse/SPARK-27176
> Project: Spark
> Issue Type: Sub-task
> Components: Build, SQL
> Affects Versions: 3.0.0
> Reporter: Yuming Wang
> Priority: Major
>
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)