Posted to commits@hudi.apache.org by "sivabalan narayanan (Jira)" <ji...@apache.org> on 2023/03/30 02:17:00 UTC

[jira] [Updated] (HUDI-5681) Merge Into fails while deserializing expressions

     [ https://issues.apache.org/jira/browse/HUDI-5681?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

sivabalan narayanan updated HUDI-5681:
--------------------------------------
    Fix Version/s: 0.12.3

> Merge Into fails while deserializing expressions
> ------------------------------------------------
>
>                 Key: HUDI-5681
>                 URL: https://issues.apache.org/jira/browse/HUDI-5681
>             Project: Apache Hudi
>          Issue Type: Bug
>          Components: spark-sql
>            Reporter: Alexey Kudinkin
>            Assignee: Alexey Kudinkin
>            Priority: Blocker
>              Labels: pull-request-available
>             Fix For: 0.13.0, 0.12.3
>
>
> While running our benchmark suite against the 0.13 RC, we stumbled upon the following exceptions:
> {code:java}
> 23/02/01 08:29:01 ERROR TaskSetManager: Task 1 in stage 947.0 failed 4 times; aborting job
> 2023-02-01T08:29:01.219 ERROR: merge:1:inventory
> Job aborted due to stage failure: Task 1 in stage 947.0 failed 4 times, most recent failure: Lost task 1.3 in stage 947.0 (TID 101955) (ip-172-31-18-9.us-west-2.compute.internal executor 140): org.apache.hudi.exception.HoodieUpsertException: Error upserting bucketType UPDATE for partition :1
> 	at org.apache.hudi.table.action.commit.BaseSparkCommitActionExecutor.handleUpsertPartition(BaseSparkCommitActionExecutor.java:336)
> 	at org.apache.hudi.table.action.commit.BaseSparkCommitActionExecutor.handleInsertPartition(BaseSparkCommitActionExecutor.java:342)
> 	at org.apache.hudi.table.action.commit.BaseSparkCommitActionExecutor.lambda$mapPartitionsAsRDD$a3ab3c4$1(BaseSparkCommitActionExecutor.java:253)
> 	at org.apache.spark.api.java.JavaRDDLike.$anonfun$mapPartitionsWithIndex$1(JavaRDDLike.scala:102)
> 	at org.apache.spark.api.java.JavaRDDLike.$anonfun$mapPartitionsWithIndex$1$adapted(JavaRDDLike.scala:102)
> 	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsWithIndex$2(RDD.scala:907)
> 	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsWithIndex$2$adapted(RDD.scala:907)
> 	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
> 	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:365)
> 	at org.apache.spark.rdd.RDD.iterator(RDD.scala:329)
> 	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
> 	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:365)
> 	at org.apache.spark.rdd.RDD.$anonfun$getOrCompute$1(RDD.scala:378)
> 	at org.apache.spark.storage.BlockManager.$anonfun$doPutIterator$1(BlockManager.scala:1525)
> 	at org.apache.spark.storage.BlockManager.org$apache$spark$storage$BlockManager$$doPut(BlockManager.scala:1435)
> 	at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1499)
> 	at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:1322)
> 	at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:376)
> 	at org.apache.spark.rdd.RDD.iterator(RDD.scala:327)
> 	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
> 	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:365)
> 	at org.apache.spark.rdd.RDD.iterator(RDD.scala:329)
> 	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
> 	at org.apache.spark.scheduler.Task.run(Task.scala:138)
> 	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:548)
> 	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1516)
> 	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:551)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> 	at java.lang.Thread.run(Thread.java:750)
> Caused by: com.esotericsoftware.kryo.KryoException: Unable to find class: org.apache.spark.sql.catalyst.expressions.Literal
> 	at com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:160)
> 	at com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:133)
> 	at com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:693)
> 	at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:804)
> 	at com.twitter.chill.Tuple10Serializer.read(TupleSerializers.scala:221)
> 	at com.twitter.chill.Tuple10Serializer.read(TupleSerializers.scala:199)
> 	at com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:813)
> 	at org.apache.spark.serializer.KryoSerializerInstance.deserialize(KryoSerializer.scala:408)
> 	at org.apache.spark.sql.hudi.SerDeUtils$.toObject(SerDeUtils.scala:42)
> 	at org.apache.spark.sql.hudi.command.payload.ExpressionPayload$$anon$7.apply(ExpressionPayload.scala:423)
> 	at org.apache.spark.sql.hudi.command.payload.ExpressionPayload$$anon$7.apply(ExpressionPayload.scala:419)
> 	at com.github.benmanes.caffeine.cache.BoundedLocalCache.lambda$doComputeIfAbsent$14(BoundedLocalCache.java:2405)
> 	at java.util.concurrent.ConcurrentHashMap.compute(ConcurrentHashMap.java:1853)
> 	at com.github.benmanes.caffeine.cache.BoundedLocalCache.doComputeIfAbsent(BoundedLocalCache.java:2403)
> 	at com.github.benmanes.caffeine.cache.BoundedLocalCache.computeIfAbsent(BoundedLocalCache.java:2386)
> 	at com.github.benmanes.caffeine.cache.LocalCache.computeIfAbsent(LocalCache.java:108)
> 	at com.github.benmanes.caffeine.cache.LocalManualCache.get(LocalManualCache.java:62)
> 	at org.apache.spark.sql.hudi.command.payload.ExpressionPayload$.org$apache$spark$sql$hudi$command$payload$ExpressionPayload$$getEvaluator(ExpressionPayload.scala:419)
> 	at org.apache.spark.sql.hudi.command.payload.ExpressionPayload.processNotMatchedRecord(ExpressionPayload.scala:198)
> 	at org.apache.spark.sql.hudi.command.payload.ExpressionPayload.getInsertValue(ExpressionPayload.scala:255)
> 	at org.apache.hudi.common.model.HoodieAvroRecord.shouldIgnore(HoodieAvroRecord.java:173)
> 	at org.apache.hudi.io.HoodieMergeHandle.writeInsertRecord(HoodieMergeHandle.java:281)
> 	at org.apache.hudi.io.HoodieMergeHandle.writeIncomingRecords(HoodieMergeHandle.java:397)
> 	at org.apache.hudi.io.HoodieMergeHandle.close(HoodieMergeHandle.java:405)
> 	at org.apache.hudi.table.action.commit.HoodieMergeHelper.runMerge(HoodieMergeHelper.java:168)
> 	at org.apache.hudi.table.action.commit.BaseSparkCommitActionExecutor.handleUpdateInternal(BaseSparkCommitActionExecutor.java:372)
> 	at org.apache.hudi.table.action.commit.BaseSparkCommitActionExecutor.handleUpdate(BaseSparkCommitActionExecutor.java:363)
> 	at org.apache.hudi.table.action.commit.BaseSparkCommitActionExecutor.handleUpsertPartition(BaseSparkCommitActionExecutor.java:329)
> 	... 29 more
> Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.catalyst.expressions.Literal
> 	at org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:124)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
> 	at java.lang.Class.forName0(Native Method)
> 	at java.lang.Class.forName(Class.java:348)
> 	at com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:154)
> 	... 56 more
> Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.catalyst.expressions.Literal
> 	at java.lang.ClassLoader.findClass(ClassLoader.java:523)
> 	at org.apache.spark.util.ParentClassLoader.findClass(ParentClassLoader.java:35)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
> 	at org.apache.spark.util.ParentClassLoader.loadClass(ParentClassLoader.java:40)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
> 	at org.apache.spark.repl.ExecutorClassLoader.findClass(ExecutorClassLoader.scala:109)
> 	... 61 more
> {code}
>  
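> For context, the failing path in the trace (ExpressionPayload.processNotMatchedRecord -> getEvaluator -> SerDeUtils.toObject) is the one exercised by a MERGE INTO carrying a WHEN NOT MATCHED clause, and the org.apache.spark.repl.ExecutorClassLoader frames indicate the statement was issued from a Spark REPL session. A statement of the following shape exercises that code path; the table and column names are hypothetical, not taken from the benchmark suite:
> {code:scala}
> // Hypothetical repro shape only: table and column names are illustrative.
> spark.sql(
>   """
>     |MERGE INTO hudi_target t
>     |USING source_updates s
>     |ON t.id = s.id
>     |WHEN MATCHED THEN UPDATE SET *
>     |WHEN NOT MATCHED THEN INSERT *
>     |""".stripMargin)
> {code}
>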
> These exceptions appear to stem from the executor-side class loader being unable to locate the corresponding Spark classes.
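>
> One plausible direction for a fix is to make the deserialization class-loader-aware: Spark's SerializerInstance exposes a deserialize overload that takes an explicit ClassLoader, so resolving classes through the loader that loaded the calling class (rather than whatever context loader Kryo falls back to, which on a REPL executor is the failing ExecutorClassLoader) would let classes like org.apache.spark.sql.catalyst.expressions.Literal be found. The sketch below only illustrates that technique; it stands in for the SerDeUtils seen in the trace and is not necessarily the change that landed for this ticket:
> {code:scala}
> import java.nio.ByteBuffer
>
> import org.apache.spark.SparkConf
> import org.apache.spark.serializer.{KryoSerializer, SerializerInstance}
>
> // Sketch only: mirrors the role of org.apache.spark.sql.hudi.SerDeUtils
> // from the stack trace above; not the actual Hudi implementation.
> object SerDeUtilsSketch {
>
>   private val serializer: ThreadLocal[SerializerInstance] =
>     ThreadLocal.withInitial(() => new KryoSerializer(new SparkConf(false)).newInstance())
>
>   def toBytes(value: Any): Array[Byte] = {
>     val buffer = serializer.get().serialize(value)
>     val bytes = new Array[Byte](buffer.remaining())
>     buffer.get(bytes)
>     bytes
>   }
>
>   // Resolve classes through the loader that loaded this class, instead of
>   // letting Kryo fall back to the thread context loader, which on a REPL
>   // executor is the ExecutorClassLoader seen failing in the trace above.
>   def toObject(bytes: Array[Byte]): Any = {
>     serializer.get().deserialize[Any](ByteBuffer.wrap(bytes), getClass.getClassLoader)
>   }
> }
> {code}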



--
This message was sent by Atlassian Jira
(v8.20.10#820010)