Posted to commits@hudi.apache.org by GitBox <gi...@apache.org> on 2022/09/16 09:40:16 UTC

[GitHub] [hudi] eric9204 opened a new issue, #6696: [SUPPORT][Consistent bucket index]NoClassDefFoundError: org/apache/logging/log4j/LogManager

eric9204 opened a new issue, #6696:
URL: https://github.com/apache/hudi/issues/6696

   
   **Describe the problem you faced**
   
   When writing to a Hudi table through Spark Structured Streaming with the consistent hashing bucket index (`hoodie.index.bucket.engine=CONSISTENT_HASHING`) and async clustering enabled, scheduling the clustering plan fails on the driver with `java.lang.NoClassDefFoundError: org/apache/logging/log4j/LogManager`, thrown from `SparkConsistentBucketClusteringPlanStrategy`.
   
   **To Reproduce**
   
   Steps to reproduce the behavior: run a Spark Structured Streaming write into a Hudi table (Kafka source; see the stacktrace below) with the following write options:
   
   ```
   hoodie.datasource.write.operation=insert
   hoodie.merge.allow.duplicate.on.inserts=true
   hoodie.datasource.write.table.type=COPY_ON_WRITE
   hoodie.datasource.write.precombine.field=ts
   hoodie.datasource.write.recordkey.field=id
   hoodie.datasource.write.partitionpath.field=""
   hoodie.table.name=ss_bucket_dsj
   hoodie.datasource.write.keygenerator.class=org.apache.hudi.keygen.NonpartitionedKeyGenerator
   hoodie.upsert.shuffle.parallelism=8
   hoodie.insert.shuffle.parallelism=8
   hoodie.bulk_insert.shuffle.parallelism=8
   hoodie.parquet.small.file.limit=104857600
   hoodie.index.type=BUCKET
   hoodie.bucket.index.num.buckets=8
   hoodie.index.bucket.engine=CONSISTENT_HASHING
   hoodie.bucket.index.max.num.buckets=14
   hoodie.bucket.index.min.num.buckets=4
   hoodie.bucket.index.hash.field=id
   hoodie.storage.layout.partitioner.class=org.apache.hudi.table.action.commit.SparkBucketIndexPartitioner
   hoodie.storage.layout.type=BUCKET
   hoodie.metadata.enable=true
   hoodie.datasource.clustering.async.enable=true
   hoodie.clustering.async.enabled=true
   hoodie.clustering.async.max.commits=1
   hoodie.clustering.plan.strategy.target.file.max.bytes=104857600
   hoodie.clustering.plan.strategy.small.file.limit=104857600
   hoodie.clustering.plan.strategy.class=org.apache.hudi.client.clustering.plan.strategy.SparkConsistentBucketClusteringPlanStrategy
   hoodie.clustering.execution.strategy.class=org.apache.hudi.client.clustering.run.strategy.SparkConsistentBucketClusteringExecutionStrategy
   hoodie.clustering.updates.strategy=org.apache.hudi.client.clustering.update.strategy.SparkConsistentBucketDuplicateUpdateStrategy
   hoodie.clustering.plan.strategy.sort.columns=id
   hoodie.embed.timeline.server=false
   path=/tmp/hudi/ss_bucket_dsj
   ```
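   
   For context, here is a minimal sketch of how such a streaming write might be wired up in Java (an illustration only, not the actual job; the broker address, checkpoint path, and value parsing are placeholders):
   
   ```java
   // Hypothetical sketch of the streaming write described above (Spark 3.1 Java API).
   // Only a subset of the Hudi options is shown; the Kafka value would still need to be
   // parsed into columns such as `id` and `ts` before writing, which is omitted here.
   import java.util.HashMap;
   import java.util.Map;
   
   import org.apache.spark.sql.Dataset;
   import org.apache.spark.sql.Row;
   import org.apache.spark.sql.SparkSession;
   import org.apache.spark.sql.streaming.StreamingQuery;
   
   public class ConsistentBucketStreamingWrite {
     public static void main(String[] args) throws Exception {
       SparkSession spark = SparkSession.builder().appName("ss_bucket_dsj").getOrCreate();
   
       // Kafka source; the topic name comes from the consumer log in the stacktrace section,
       // the broker address is a placeholder.
       Dataset<Row> df = spark.readStream()
           .format("kafka")
           .option("kafka.bootstrap.servers", "broker-host:9092")
           .option("subscribe", "bucket_resize")
           .load();
   
       Map<String, String> hudiOpts = new HashMap<>();
       hudiOpts.put("hoodie.table.name", "ss_bucket_dsj");
       hudiOpts.put("hoodie.index.type", "BUCKET");
       hudiOpts.put("hoodie.index.bucket.engine", "CONSISTENT_HASHING");
       hudiOpts.put("hoodie.clustering.async.enabled", "true");
       // ... plus the remaining options listed above ...
   
       StreamingQuery query = df.writeStream()
           .format("hudi")
           .options(hudiOpts)
           .option("checkpointLocation", "/tmp/hudi/checkpoints/ss_bucket_dsj") // placeholder
           .outputMode("append")
           .start("/tmp/hudi/ss_bucket_dsj");
   
       query.awaitTermination();
     }
   }
   ```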
   
   **Expected behavior**
   
   The streaming write succeeds and the async clustering plan is scheduled without a `NoClassDefFoundError`.
   
   **Environment Description**
   
   * Hudi version : Hudi-master
   * Spark version : Spark-3.1.2
   * Hive version : NONE
   * Hadoop version : Hadoop-3.3.0
   * Storage (HDFS/S3/GCS..) : HDFS
   * Running on Docker? (yes/no) : NO
   
   
   **Stacktrace**
   
   ```
   22/09/16 10:41:15 ERROR MicroBatchExecution: Query [id = 51398de7-9998-4a9a-b96b-62f4441ee724, runId = a05e0523-862c-480e-815b-1e36a2516637] terminated with error
   java.lang.NoClassDefFoundError: org/apache/logging/log4j/LogManager
           at org.apache.hudi.client.clustering.plan.strategy.SparkConsistentBucketClusteringPlanStrategy.<clinit>(SparkConsistentBucketClusteringPlanStrategy.java:67)
           at java.lang.Class.forName0(Native Method)
           at java.lang.Class.forName(Class.java:264)
           at org.apache.hudi.common.util.ReflectionUtils.getClass(ReflectionUtils.java:54)
           at org.apache.hudi.common.util.ReflectionUtils.loadClass(ReflectionUtils.java:89)
           at org.apache.hudi.table.action.cluster.ClusteringPlanActionExecutor.createClusteringPlan(ClusteringPlanActionExecutor.java:83)
           at org.apache.hudi.table.action.cluster.ClusteringPlanActionExecutor.execute(ClusteringPlanActionExecutor.java:92)
           at org.apache.hudi.table.HoodieSparkCopyOnWriteTable.scheduleClustering(HoodieSparkCopyOnWriteTable.java:182)
           at org.apache.hudi.client.BaseHoodieWriteClient.scheduleTableServiceInternal(BaseHoodieWriteClient.java:1344)
           at org.apache.hudi.client.BaseHoodieWriteClient.scheduleTableService(BaseHoodieWriteClient.java:1326)
           at org.apache.hudi.client.BaseHoodieWriteClient.scheduleClusteringAtInstant(BaseHoodieWriteClient.java:1271)
           at org.apache.hudi.client.BaseHoodieWriteClient.scheduleClustering(BaseHoodieWriteClient.java:1262)
           at org.apache.hudi.HoodieSparkSqlWriter$.commitAndPerformPostOperations(HoodieSparkSqlWriter.scala:732)
           at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:340)
           at org.apache.hudi.HoodieStreamingSink.$anonfun$addBatch$2(HoodieStreamingSink.scala:129)
           at scala.util.Try$.apply(Try.scala:213)
           at org.apache.hudi.HoodieStreamingSink.$anonfun$addBatch$1(HoodieStreamingSink.scala:128)
           at org.apache.hudi.HoodieStreamingSink.retry(HoodieStreamingSink.scala:214)
           at org.apache.hudi.HoodieStreamingSink.addBatch(HoodieStreamingSink.scala:127)
           at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runBatch$16(MicroBatchExecution.scala:586)
           at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
           at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
           at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
           at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:772)
           at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
           at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runBatch$15(MicroBatchExecution.scala:584)
           at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken(ProgressReporter.scala:357)
           at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken$(ProgressReporter.scala:355)
           at org.apache.spark.sql.execution.streaming.StreamExecution.reportTimeTaken(StreamExecution.scala:68)
           at org.apache.spark.sql.execution.streaming.MicroBatchExecution.runBatch(MicroBatchExecution.scala:584)
           at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runActivatedStream$2(MicroBatchExecution.scala:226)
           at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
           at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken(ProgressReporter.scala:357)
           at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken$(ProgressReporter.scala:355)
           at org.apache.spark.sql.execution.streaming.StreamExecution.reportTimeTaken(StreamExecution.scala:68)
           at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runActivatedStream$1(MicroBatchExecution.scala:194)
           at org.apache.spark.sql.execution.streaming.ProcessingTimeExecutor.execute(TriggerExecutor.scala:57)
           at org.apache.spark.sql.execution.streaming.MicroBatchExecution.runActivatedStream(MicroBatchExecution.scala:188)
           at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:333)
           at org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1.run(StreamExecution.scala:244)
   Caused by: java.lang.ClassNotFoundException: org.apache.logging.log4j.LogManager
           at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
           at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
           at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:355)
           at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
           ... 40 more
   22/09/16 10:41:15 INFO ConsumerCoordinator: [Consumer clientId=consumer-spark-kafka-source-01e396b3-c0ae-4e18-a723-df762f255897--1151574722-driver-0-1, groupId=spark-kafka-source-01e396b3-c0ae-4e18-a723-df762f255897--1151574722-driver-0] Revoke previously assigned partitions bucket_resize-0, bucket_resize-2, bucket_resize-1
   22/09/16 10:41:15 INFO AbstractCoordinator: [Consumer clientId=consumer-spark-kafka-source-01e396b3-c0ae-4e18-a723-df762f255897--1151574722-driver-0-1, groupId=spark-kafka-source-01e396b3-c0ae-4e18-a723-df762f255897--1151574722-driver-0] Member consumer-spark-kafka-source-01e396b3-c0ae-4e18-a723-df762f255897--1151574722-driver-0-1-1cc1bb8f-a66c-4abb-8c5c-e0f696ca9cb5 sending LeaveGroup request to coordinator host-10-19-29-165:6667 (id: 2147482645 rack: null) due to the consumer is being closed
   ```
   
   




[GitHub] [hudi] xushiyan commented on issue #6696: [SUPPORT][Consistent bucket index]NoClassDefFoundError: org/apache/logging/log4j/LogManager

Posted by GitBox <gi...@apache.org>.
xushiyan commented on issue #6696:
URL: https://github.com/apache/hudi/issues/6696#issuecomment-1250191577

   @eric9204 does the version you used contain this patch: https://github.com/apache/hudi/pull/6631? It is likely to address this problem.
   cc @alexeykudinkin 




[GitHub] [hudi] eric9204 commented on issue #6696: [SUPPORT][Consistent bucket index]NoClassDefFoundError: org/apache/logging/log4j/LogManager

Posted by GitBox <gi...@apache.org>.
eric9204 commented on issue #6696:
URL: https://github.com/apache/hudi/issues/6696#issuecomment-1250200063

   > @eric9204 does the version you used contain this patch: #6631? It is likely to address this problem. cc @alexeykudinkin
   
   @xushiyan @YuweiXiao The version I used does contain this patch #6637.
   But I found a problem with the imports in `SparkConsistentBucketClusteringPlanStrategy.java`.
   
   When I modified these imports
   
   ```
   import org.apache.logging.log4j.LogManager;
   import org.apache.logging.log4j.Logger; 
   ```
   
   to
   
   ```
   import org.apache.log4j.LogManager;
   import org.apache.log4j.Logger;
   ```
   the `NoClassDefFoundError` did not reappear.
   
   Maybe the same change should be applied in these files as well:
   
   ```
   RDDConsistentBucketPartitioner.java
   SparkConsistentBucketClusteringExecutionStrategy.java
   SparkConsistentBucketDuplicateUpdateStrategy.java
   ```
   
   For reference, the corresponding imports in `SparkSizeBasedClusteringPlanStrategy.java` are already:
   
   ```
   import org.apache.log4j.LogManager;
   import org.apache.log4j.Logger;
   ```
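   
   For illustration, here is a minimal sketch of how the logger declaration looks when it follows that 1.x-style import (the `LOG` field name and the surrounding class body are assumptions, not copied from the Hudi source):
   
   ```java
   // Sketch only: the logger is declared through the log4j 1.x API, which resolves at
   // runtime against Spark's bundled log4j 1.2 or against the log4j-1.2-api bridge jar,
   // so no org.apache.logging.log4j classes are needed on the classpath.
   import org.apache.log4j.LogManager;
   import org.apache.log4j.Logger;
   
   public class SparkConsistentBucketClusteringPlanStrategySketch {
     private static final Logger LOG =
         LogManager.getLogger(SparkConsistentBucketClusteringPlanStrategySketch.class);
   
     public void scheduleSketch() {
       LOG.info("building consistent bucket clustering plan"); // illustrative log line only
     }
   }
   ```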




[GitHub] [hudi] xushiyan commented on issue #6696: [SUPPORT][Consistent bucket index]NoClassDefFoundError: org/apache/logging/log4j/LogManager

Posted by GitBox <gi...@apache.org>.
xushiyan commented on issue #6696:
URL: https://github.com/apache/hudi/issues/6696#issuecomment-1250350567

   `import org.apache.log4j.LogManager;` comes from the Log4j 1-to-2 bridge jar (https://logging.apache.org/log4j/2.x/manual/migration.html#Log4j1.2Bridge),
   
   while `import org.apache.logging.log4j.LogManager;` is the Log4j 2.x API. We should standardize by using the bridge jar.
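   
   To make the distinction concrete, here is a small hypothetical contrast of the two APIs (class and variable names are made up); with the bridge jar on the classpath both forms end up on Log4j 2, but the 2.x form also needs the log4j-api jar at runtime, which is what was missing in this report:
   
   ```java
   // Hypothetical illustration of the two logging APIs discussed above.
   public class LoggerApiContrast {
     public static void main(String[] args) {
       // Log4j 2.x API: requires log4j-api on the runtime classpath
       // (its absence caused the NoClassDefFoundError in the stacktrace).
       org.apache.logging.log4j.Logger v2 =
           org.apache.logging.log4j.LogManager.getLogger(LoggerApiContrast.class);
   
       // Log4j 1.x API: satisfied by Spark 3.1's bundled log4j 1.2, or routed to
       // Log4j 2 when the log4j-1.2-api bridge jar is present.
       org.apache.log4j.Logger v1 =
           org.apache.log4j.LogManager.getLogger(LoggerApiContrast.class);
   
       v2.info("via the log4j 2.x api");
       v1.info("via the log4j 1.x api / bridge");
     }
   }
   ```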




[GitHub] [hudi] eric9204 commented on issue #6696: [SUPPORT][Consistent bucket index]NoClassDefFoundError: org/apache/logging/log4j/LogManager

Posted by GitBox <gi...@apache.org>.
eric9204 commented on issue #6696:
URL: https://github.com/apache/hudi/issues/6696#issuecomment-1250518801

   @xushiyan Yes, I'm doing this.




[GitHub] [hudi] xushiyan closed issue #6696: [SUPPORT][Consistent bucket index]NoClassDefFoundError: org/apache/logging/log4j/LogManager

Posted by GitBox <gi...@apache.org>.
xushiyan closed issue #6696: [SUPPORT][Consistent bucket index]NoClassDefFoundError: org/apache/logging/log4j/LogManager
URL: https://github.com/apache/hudi/issues/6696




[GitHub] [hudi] xushiyan commented on issue #6696: [SUPPORT][Consistent bucket index]NoClassDefFoundError: org/apache/logging/log4j/LogManager

Posted by GitBox <gi...@apache.org>.
xushiyan commented on issue #6696:
URL: https://github.com/apache/hudi/issues/6696#issuecomment-1249505357

   @eric9204 can you share how you ran the Spark job and with which jars, please? Then we can try to reproduce it.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@hudi.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org