Posted to issues@hive.apache.org by "Michael (Jira)" <ji...@apache.org> on 2020/02/14 14:04:00 UTC

[jira] [Commented] (HIVE-22668) ClassNotFoundException:HiveHBaseTableInputFormat when tez include reduce operation

    [ https://issues.apache.org/jira/browse/HIVE-22668?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17036994#comment-17036994 ] 

Michael commented on HIVE-22668:
--------------------------------

Hi [~zhaoyim],

I am sure that the command (ADD JAR /usr/hdp/3.1.4.0-315/hive/lib/hive-hbase-handler-3.1.0.3.1.4.0-315.jar) executed successfully with no error in the terminal, but the problem still exists.
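
For reference, the session-level check on the Beeline side (using the same HDP jar path as in the description) is just the ADD JAR followed by LIST JARS, to confirm the jar is registered in the session before the insert runs:

{code:sql}
-- Register the HBase storage handler jar for this Beeline session
ADD JAR /usr/hdp/3.1.4.0-315/hive/lib/hive-hbase-handler-3.1.0.3.1.4.0-315.jar;

-- Show the jars currently registered in the session; the handler jar should appear here
LIST JARS;
{code}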

> ClassNotFoundException:HiveHBaseTableInputFormat when tez include reduce operation
> ----------------------------------------------------------------------------------
>
>                 Key: HIVE-22668
>                 URL: https://issues.apache.org/jira/browse/HIVE-22668
>             Project: Hive
>          Issue Type: Bug
>          Components: Beeline, Hive
>    Affects Versions: 3.1.0
>            Reporter: Michael
>            Priority: Blocker
>
> When I use Beeline to execute a script that inserts data from Hive into an HBase-backed table, this exception appears whenever the operation includes a reduce step.
> I tried adding the jars in Beeline like this:
> {code:sql}
> ADD JAR /usr/hdp/3.1.4.0-315/hive/lib/hive-hbase-handler-3.1.0.3.1.4.0-315.jar;
> ADD JAR /usr/hdp/3.1.4.0-315/hive/lib/guava-28.0-jre.jar;
> ADD JAR /usr/hdp/3.1.4.0-315/hive/lib/zookeeper-3.4.6.3.1.4.0-315.jar;{code}
> but the problem still occurs.
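> A simplified sketch of the kind of statement that hits this (table names below are placeholders, not the real ones from my job) is shown here; the aggregation is what introduces the reduce stage in the Tez DAG:
> {code:sql}
> -- hbase_target: a Hive table created with the HBase storage handler (placeholder name)
> -- src: a plain Hive table (placeholder name)
> INSERT INTO TABLE hbase_target
> SELECT key, count(*) AS cnt
> FROM src
> GROUP BY key;
> {code}
> The resulting failure from the Tez reducer is: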
> {code:java}
> Serialization trace:
> inputFileFormatClass (org.apache.hadoop.hive.ql.plan.TableDesc)
> tableInfo (org.apache.hadoop.hive.ql.plan.FileSinkDesc)
> conf (org.apache.hadoop.hive.ql.exec.FileSinkOperator)
> childOperators (org.apache.hadoop.hive.ql.exec.SelectOperator)
> childOperators (org.apache.hadoop.hive.ql.exec.CommonMergeJoinOperator)
> reducer (org.apache.hadoop.hive.ql.plan.ReduceWork)
>         at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:156)
>         at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:133)
>         at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:670)
>         at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readClass(SerializationUtilities.java:185)
>         at org.apache.hive.com.esotericsoftware.kryo.serializers.DefaultSerializers$ClassSerializer.read(DefaultSerializers.java:326)
>         at org.apache.hive.com.esotericsoftware.kryo.serializers.DefaultSerializers$ClassSerializer.read(DefaultSerializers.java:314)
>         at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObjectOrNull(Kryo.java:759)
>         at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObjectOrNull(SerializationUtilities.java:203)
>         at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:132)
>         at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
>         at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:708)
>         at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:218)
>         at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)
>         at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
>         at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:708)
>         at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:218)
>         at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)
>         at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
>         at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:790)
>         at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readClassAndObject(SerializationUtilities.java:180)
>         at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:134)
>         at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:40)
>         at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:708)
>         at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:218)
>         at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)
>         at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
>         at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:790)
>         at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readClassAndObject(SerializationUtilities.java:180)
>         at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:134)
>         at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:40)
>         at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:708)
>         at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:218)
>         at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)
>         at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
>         at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:708)
>         at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:218)
>         at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)
>         at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
>         at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:686)
>         at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:210)
>         at org.apache.hadoop.hive.ql.exec.SerializationUtilities.deserializeObjectByKryo(SerializationUtilities.java:707)
>         at org.apache.hadoop.hive.ql.exec.SerializationUtilities.deserializePlan(SerializationUtilities.java:613)
>         at org.apache.hadoop.hive.ql.exec.SerializationUtilities.deserializePlan(SerializationUtilities.java:590)
>         at org.apache.hadoop.hive.ql.exec.Utilities.getBaseWork(Utilities.java:481)
>         ... 21 more
> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:355)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
>         at java.lang.Class.forName0(Native Method)
>         at java.lang.Class.forName(Class.java:348)
>         at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:154)
>         ... 64 more
> ], TaskAttempt 3 failed, info=[Error: Error while running task ( failure ) : attempt_1577155991246_0014_1_09_000000_3:org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Failed to load plan: hdfs://192.168.1.100:8020/tmp/hive/root/038acecf-b1fc-44fb-865a-b7e7c39ecd6d/hive_2019-12-24_15-30-51_385_91017415271371349-10/root/_tez_scratch_dir/8eff6524-1dd4-493c-9cc7-cd08cd2545f3/reduce.xml
>         at org.apache.hadoop.hive.ql.exec.tez.ObjectCache.retrieve(ObjectCache.java:103)
>         at org.apache.hadoop.hive.ql.exec.ObjectCacheWrapper.retrieve(ObjectCacheWrapper.java:45)
>         at org.apache.hadoop.hive.ql.exec.tez.ReduceRecordProcessor.<init>(ReduceRecordProcessor.java:98)
>         at org.apache.hadoop.hive.ql.exec.tez.TezProcessor.run(TezProcessor.java:244)
>         at org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:374)
>         at org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:73)
>         at org.apache.tez.runtime.task.TaskRunner2Callable$1.run(TaskRunner2Callable.java:61)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:422)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
>         at org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:61)
>         at org.apache.tez.runtime.task.TaskRunner2Callable.callInternal(TaskRunner2Callable.java:37)
>         at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36)
>         at com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
>         at com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:69)
>         at com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
>         at java.lang.Thread.run(Thread.java:748)
> Caused by: java.lang.RuntimeException: Failed to load plan: hdfs://192.168.1.100:8020/tmp/hive/root/038acecf-b1fc-44fb-865a-b7e7c39ecd6d/hive_2019-12-24_15-30-51_385_91017415271371349-10/root/_tez_scratch_dir/8eff6524-1dd4-493c-9cc7-cd08cd2545f3/reduce.xml
>         at org.apache.hadoop.hive.ql.exec.Utilities.getBaseWork(Utilities.java:509)
>         at org.apache.hadoop.hive.ql.exec.Utilities.getReduceWork(Utilities.java:353)
>         at org.apache.hadoop.hive.ql.exec.tez.ReduceRecordProcessor$1.call(ReduceRecordProcessor.java:101)
>         at org.apache.hadoop.hive.ql.exec.tez.ObjectCache.retrieve(ObjectCache.java:96)
>         ... 18 more
> Caused by: org.apache.hive.com.esotericsoftware.kryo.KryoException: Unable to find class: org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat
> Serialization trace:
> inputFileFormatClass (org.apache.hadoop.hive.ql.plan.TableDesc)
> tableInfo (org.apache.hadoop.hive.ql.plan.FileSinkDesc)
> conf (org.apache.hadoop.hive.ql.exec.FileSinkOperator)
> childOperators (org.apache.hadoop.hive.ql.exec.SelectOperator)
> childOperators (org.apache.hadoop.hive.ql.exec.CommonMergeJoinOperator)
> reducer (org.apache.hadoop.hive.ql.plan.ReduceWork)
>         at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:156)
>         at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:133)
>         at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:670)
>         at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readClass(SerializationUtilities.java:185)
>         at org.apache.hive.com.esotericsoftware.kryo.serializers.DefaultSerializers$ClassSerializer.read(DefaultSerializers.java:326)
>         at org.apache.hive.com.esotericsoftware.kryo.serializers.DefaultSerializers$ClassSerializer.read(DefaultSerializers.java:314)
>         at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObjectOrNull(Kryo.java:759)
>         at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObjectOrNull(SerializationUtilities.java:203)
>         at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:132)
>         at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
>         at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:708)
>         at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:218)
>         at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)
>         at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
>         at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:708)
>         at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:218)
>         at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)
>         at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
>         at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:790)
>         at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readClassAndObject(SerializationUtilities.java:180)
>         at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:134)
>         at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:40)
>         at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:708)
>         at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:218)
>         at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)
>         at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
>         at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:790)
>         at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readClassAndObject(SerializationUtilities.java:180)
>         at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:134)
>         at org.apache.hive.com.esotericsoftware.kryo.serializers.CollectionSerializer.read(CollectionSerializer.java:40)
>         at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:708)
>         at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:218)
>         at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)
>         at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
>         at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:708)
>         at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:218)
>         at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)
>         at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
>         at org.apache.hive.com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:686)
>         at org.apache.hadoop.hive.ql.exec.SerializationUtilities$KryoWithHooks.readObject(SerializationUtilities.java:210)
>         at org.apache.hadoop.hive.ql.exec.SerializationUtilities.deserializeObjectByKryo(SerializationUtilities.java:707)
>         at org.apache.hadoop.hive.ql.exec.SerializationUtilities.deserializePlan(SerializationUtilities.java:613)
>         at org.apache.hadoop.hive.ql.exec.SerializationUtilities.deserializePlan(SerializationUtilities.java:590)
>         at org.apache.hadoop.hive.ql.exec.Utilities.getBaseWork(Utilities.java:481)
>         ... 21 more
> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:355)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
>         at java.lang.Class.forName0(Native Method)
>         at java.lang.Class.forName(Class.java:348)
>         at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:154)
>         ... 64 more
> ]], Vertex did not succeed due to OWN_TASK_FAILURE, failedTasks:1 killedTasks:1, Vertex vertex_1577155991246_0014_1_09 [Reducer 2] killed/failed due to:OWN_TASK_FAILURE]DAG did not succeed due to VERTEX_FAILURE. failedVertices:1 killedVertices:0 (state=08S01,code=2)
> {code}
> Component Version:
> HDP: 3.1.4.0
> Hadoop: 3.1.1 
> Hive: 3.1.0
> HBase: 2.0.2
> Tez: 0.9.1
>  


