Posted to commits@hudi.apache.org by "Ethan Guo (Jira)" <ji...@apache.org> on 2022/06/08 05:57:00 UTC

[jira] [Updated] (HUDI-3983) ClassNotFoundException when using hudi-spark-bundle to write table with hbase index

     [ https://issues.apache.org/jira/browse/HUDI-3983?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ethan Guo updated HUDI-3983:
----------------------------
    Fix Version/s: 0.12.0
                       (was: 0.11.1)

> ClassNotFoundException when using hudi-spark-bundle to write table with hbase index
> -----------------------------------------------------------------------------------
>
>                 Key: HUDI-3983
>                 URL: https://issues.apache.org/jira/browse/HUDI-3983
>             Project: Apache Hudi
>          Issue Type: Bug
>            Reporter: xi chaomin
>            Priority: Critical
>             Fix For: 0.12.0
>
>
> I ran a Spark job and encountered several ClassNotFoundExceptions. The Spark version is 3.1 and the Scala version is 2.12.
> 1. 
> {code:java}
> java.lang.NoClassDefFoundError: org/apache/hudi/org/apache/hadoop/hbase/protobuf/generated/AuthenticationProtos$TokenIdentifier$Kind
>      at org.apache.hudi.org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.translateException(RpcRetryingCallerImpl.java:222)
>      at org.apache.hudi.org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithoutRetries(RpcRetryingCallerImpl.java:195)
>      at org.apache.hudi.org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:395)
>      at org.apache.hudi.org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:369)
>      at org.apache.hudi.org.apache.hadoop.hbase.client.RpcRetryingCallerImpl.callWithRetries(RpcRetryingCallerImpl.java:108) {code}
> Including org.apache.hbase:hbase-protocol in packaging/hudi-spark-bundle/pom.xml solves this error.
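> The fix can be sketched as a dependency entry in packaging/hudi-spark-bundle/pom.xml (a sketch only; the {{hbase.version}} property name is an assumption and should match however the Hudi pom pins its HBase version):
> {code:xml}
> <dependency>
>   <groupId>org.apache.hbase</groupId>
>   <artifactId>hbase-protocol</artifactId>
>   <version>${hbase.version}</version>
> </dependency>
> {code}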
> 2.
> {code:java}
>  java.lang.ClassNotFoundException: org.apache.hudi.org.apache.hbase.thirdparty.com.google.gson.GsonBuilder
>      at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>      at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>      at java.lang.ClassLoader.loadClass(ClassLoader.java:357) {code}
> Including org.apache.hbase.thirdparty:hbase-shaded-gson in packaging/hudi-spark-bundle/pom.xml solves this error.
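> Analogously, a sketch of the dependency entry for the shaded Gson artifact (the {{hbase-thirdparty.version}} property name is illustrative; use whatever version property the Hudi pom defines for hbase-thirdparty):
> {code:xml}
> <dependency>
>   <groupId>org.apache.hbase.thirdparty</groupId>
>   <artifactId>hbase-shaded-gson</artifactId>
>   <version>${hbase-thirdparty.version}</version>
> </dependency>
> {code}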
> 3.
> {code:java}
>  java.lang.ClassNotFoundException: Class org.apache.hadoop.hbase.client.ClusterStatusListener$MulticastListener not found {code}
> There is a configuration entry in hbase-site.xml:
> {code:java}
> <property>
>   <name>hbase.status.listener.class</name>
>   <value>org.apache.hadoop.hbase.client.ClusterStatusListener$MulticastListener</value>
>   <description>
>     Implementation of the status listener with a multicast message.
>   </description>
> </property> {code}
> I set _*hbase.status.listener.class*_ to _*org.apache.hudi.org.apache.hadoop.hbase.client.ClusterStatusListener$MulticastListener*_ in the HBase configuration. The ClassNotFoundException was resolved, but I got another exception:
> {code:java}
> org.apache.hudi.org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Call to address=xxx:16020 failed on local exception: org.apache.hudi.org.apache.hadoop.hbase.exceptions.ConnectionClosedException: Connection closed {code}
> After I removed the HBase-related relocations in packaging/hudi-spark-bundle/pom.xml, the job succeeded.
> I checked the debug logs, but I have no idea why the ConnectionClosedException occurs when the relocation is used.
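> For reference, the removed relocation has roughly this shape in the Maven Shade Plugin configuration (a sketch inferred from the shaded package names in the stack traces above; the exact pattern list and excludes in the Hudi bundle pom may differ):
> {code:xml}
> <relocation>
>   <pattern>org.apache.hadoop.hbase.</pattern>
>   <shadedPattern>org.apache.hudi.org.apache.hadoop.hbase.</shadedPattern>
> </relocation>
> {code}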



--
This message was sent by Atlassian Jira
(v8.20.7#820007)