Posted to commits@hudi.apache.org by GitBox <gi...@apache.org> on 2020/07/12 10:11:03 UTC

[GitHub] [hudi] RajasekarSribalan commented on issue #1823: [SUPPORT] MOR trigger compaction from Hudi CLI

RajasekarSribalan commented on issue #1823:
URL: https://github.com/apache/hudi/issues/1823#issuecomment-657201567


   Another issue: I am getting the below error during inline compaction. Please help.
   
   com.esotericsoftware.kryo.KryoException: Unable to find class: org.apache.hudi.common.model.OverwriteWithLatestAvroPayload$$Lambda$46/188376151
   Serialization trace:
   orderingVal (org.apache.hudi.common.model.OverwriteWithLatestAvroPayload)
   data (org.apache.hudi.common.model.HoodieRecord)
   	at com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:156)
   	at com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:133)
   	at com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:670)
   	at com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:118)
   	at com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
   	at com.esotericsoftware.kryo.Kryo.readObject(Kryo.java:708)
   	at com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:125)
   	at com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:551)
   	at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:790)
   	at org.apache.hudi.common.util.SerializationUtils$KryoSerializerInstance.deserialize(SerializationUtils.java:107)
   	at org.apache.hudi.common.util.SerializationUtils.deserialize(SerializationUtils.java:81)
   	at org.apache.hudi.common.util.collection.DiskBasedMap.get(DiskBasedMap.java:217)
   	at org.apache.hudi.common.util.collection.DiskBasedMap.get(DiskBasedMap.java:211)
   	at org.apache.hudi.common.util.collection.DiskBasedMap.get(DiskBasedMap.java:207)
   	at org.apache.hudi.common.util.collection.ExternalSpillableMap.get(ExternalSpillableMap.java:168)
   	at org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner.processNextRecord(HoodieMergedLogRecordScanner.java:114)
   	at org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.processAvroDataBlock(AbstractHoodieLogRecordScanner.java:277)
   	at org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.processQueuedBlocksForInstant(AbstractHoodieLogRecordScanner.java:305)
   	at org.apache.hudi.common.table.log.AbstractHoodieLogRecordScanner.scan(AbstractHoodieLogRecordScanner.java:152)
   	at org.apache.hudi.common.table.log.HoodieMergedLogRecordScanner.<init>(HoodieMergedLogRecordScanner.java:81)
   	at org.apache.hudi.table.compact.HoodieMergeOnReadTableCompactor.compact(HoodieMergeOnReadTableCompactor.java:126)
   	at org.apache.hudi.table.compact.HoodieMergeOnReadTableCompactor.lambda$compact$644ebad7$1(HoodieMergeOnReadTableCompactor.java:98)
   	at org.apache.spark.api.java.JavaPairRDD$$anonfun$toScalaFunction$1.apply(JavaPairRDD.scala:1040)
   	at scala.collection.Iterator$$anon$11.next(Iterator.scala:409)
   	at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:434)
   	at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440)
   	at org.apache.spark.storage.memory.MemoryStore.putIteratorAsBytes(MemoryStore.scala:371)
   	at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1055)
   	at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1029)
   	at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:969)
   	at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1029)
   	at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:760)
   	at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:334)
   	at org.apache.spark.rdd.RDD.iterator(RDD.scala:285)
   	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
   	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
   	at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
   	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
   	at org.apache.spark.scheduler.Task.run(Task.scala:108)
   	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:335)
   	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
   	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
   	at java.lang.Thread.run(Thread.java:748)
   Caused by: java.lang.ClassNotFoundException: org.apache.hudi.common.model.OverwriteWithLatestAvroPayload$$Lambda$46/188376151
   	at java.lang.Class.forName0(Native Method)
   	at java.lang.Class.forName(Class.java:348)
   	at com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:154)
   	... 42 more
   20/07/12 09:59:44 WARN storage.BlockManager: Putting block rdd_2_694 failed due to an exception
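   
   For context, inline compaction on the MOR table is enabled roughly as below via the Spark DataSource writer (a minimal sketch, not the exact job; the table name, record key / precombine / partition fields, and base path are placeholders):
   
   ```scala
   import org.apache.spark.sql.{DataFrame, SaveMode, SparkSession}
   
   val spark = SparkSession.builder().appName("hudi-mor-inline-compaction").getOrCreate()
   // placeholder input batch
   val df: DataFrame = spark.read.parquet("/tmp/source_batch")
   
   df.write.format("org.apache.hudi")
     .option("hoodie.table.name", "my_mor_table")                     // placeholder
     .option("hoodie.datasource.write.table.type", "MERGE_ON_READ")
     .option("hoodie.datasource.write.operation", "upsert")
     .option("hoodie.datasource.write.recordkey.field", "id")         // placeholder
     .option("hoodie.datasource.write.precombine.field", "ts")        // placeholder
     .option("hoodie.datasource.write.partitionpath.field", "dt")     // placeholder
     .option("hoodie.compact.inline", "true")                         // run compaction inline with writes
     .option("hoodie.compact.inline.max.delta.commits", "5")          // compact after every N delta commits
     .mode(SaveMode.Append)
     .save("/user/hive/warehouse/my_mor_table")                       // placeholder base path
   ```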


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org