Posted to commits@hudi.apache.org by "Udit Mehrotra (Jira)" <ji...@apache.org> on 2021/08/13 00:16:00 UTC

[jira] [Updated] (HUDI-1542) Fix Flaky test : TestHoodieMetadata#testSync

     [ https://issues.apache.org/jira/browse/HUDI-1542?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Udit Mehrotra updated HUDI-1542:
--------------------------------
    Fix Version/s:     (was: 0.9.0)
                   0.10.0

> Fix Flaky test : TestHoodieMetadata#testSync
> --------------------------------------------
>
>                 Key: HUDI-1542
>                 URL: https://issues.apache.org/jira/browse/HUDI-1542
>             Project: Apache Hudi
>          Issue Type: Sub-task
>          Components: Writer Core
>            Reporter: Vinoth Chandar
>            Assignee: Prashant Wason
>            Priority: Blocker
>             Fix For: 0.10.0
>
>
> The test only fails intermittently on CI.
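>
> A minimal sketch of how one might try to reproduce this locally by repeating the suspect scenario under JUnit 5 (the class and method below are illustrative placeholders, not part of the Hudi test suite):
> {code:java}
> // Hypothetical repro harness (not from this ticket): hammer the flaky scenario
> // many times in one JVM so an intermittent failure has a chance to surface.
> import org.junit.jupiter.api.RepeatedTest;
>
> public class TestSyncFlakyRepro {
>
>   // With a genuinely flaky test, a failure typically shows up within a few dozen runs.
>   @RepeatedTest(50)
>   public void repeatTestSyncScenario() throws Exception {
>     // Drive the same write -> metadata sync -> restore sequence that
>     // TestHoodieBackedMetadata#testSync exercises for the failing table type.
>   }
> }
> {code}
>
> CI log from a failing run: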
> {code}
> [INFO] Running org.apache.hudi.metadata.TestHoodieBackedMetadata
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/home/travis/.m2/repository/org/slf4j/slf4j-log4j12/1.7.16/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/home/travis/.m2/repository/org/apache/logging/log4j/log4j-slf4j-impl/2.6.2/log4j-slf4j-impl-2.6.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> [WARN ] 2021-01-20 09:25:31,716 org.apache.spark.util.Utils  - Your hostname, localhost resolves to a loopback address: 127.0.0.1; using 10.30.0.81 instead (on interface eth0)
> [WARN ] 2021-01-20 09:25:31,725 org.apache.spark.util.Utils  - Set SPARK_LOCAL_IP if you need to bind to another address
> [WARN ] 2021-01-20 09:25:32,412 org.apache.hadoop.util.NativeCodeLoader  - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> [WARN ] 2021-01-20 09:25:36,645 org.apache.hudi.metadata.HoodieBackedTableMetadata  - Metadata table was not found at path /tmp/junit6813339032540265368/dataset/.hoodie/metadata
> [WARN ] 2021-01-20 09:25:36,700 org.apache.hudi.metadata.HoodieBackedTableMetadata  - Metadata table was not found at path /tmp/junit6813339032540265368/dataset/.hoodie/metadata
> [WARN ] 2021-01-20 09:26:30,250 org.apache.hudi.client.AbstractHoodieWriteClient  - Cannot find instant 20210120092628 in the timeline, for rollback
> [WARN ] 2021-01-20 09:26:45,980 org.apache.hudi.testutils.HoodieClientTestHarness  - Closing file-system instance used in previous test-run
> [WARN ] 2021-01-20 09:26:46,568 org.apache.hudi.metadata.HoodieBackedTableMetadata  - Metadata table was not found at path /tmp/junit5544286531112563801/dataset/.hoodie/metadata
> [WARN ] 2021-01-20 09:26:46,580 org.apache.hudi.metadata.HoodieBackedTableMetadata  - Metadata table was not found at path /tmp/junit5544286531112563801/dataset/.hoodie/metadata
> [WARN ] 2021-01-20 09:27:27,853 org.apache.hudi.client.AbstractHoodieWriteClient  - Cannot find instant 20210120092726 in the timeline, for rollback
> [WARN ] 2021-01-20 09:27:43,037 org.apache.hudi.testutils.HoodieClientTestHarness  - Closing file-system instance used in previous test-run
> [WARN ] 2021-01-20 09:27:46,017 org.apache.hudi.metadata.HoodieBackedTableMetadata  - Metadata table was not found at path /tmp/junit3284615140376500245/dataset/.hoodie/metadata
> [WARN ] 2021-01-20 09:28:05,357 org.apache.hudi.common.util.ClusteringUtils  - No content found in requested file for instant [==>20210120092805__replacecommit__REQUESTED]
> [WARN ] 2021-01-20 09:28:05,887 org.apache.hudi.common.util.ClusteringUtils  - No content found in requested file for instant [==>20210120092805__replacecommit__INFLIGHT]
> [WARN ] 2021-01-20 09:28:06,312 org.apache.hudi.common.util.ClusteringUtils  - No content found in requested file for instant [==>20210120092805__replacecommit__INFLIGHT]
> [WARN ] 2021-01-20 09:28:18,402 org.apache.hudi.testutils.HoodieClientTestHarness  - Closing file-system instance used in previous test-run
> [WARN ] 2021-01-20 09:28:22,013 org.apache.hudi.metadata.HoodieBackedTableMetadata  - Metadata table was not found at path /tmp/junit4284626513859445824/dataset/.hoodie/metadata
> [WARN ] 2021-01-20 09:28:40,354 org.apache.hudi.common.util.ClusteringUtils  - No content found in requested file for instant [==>20210120092840__replacecommit__REQUESTED]
> [WARN ] 2021-01-20 09:28:40,780 org.apache.hudi.common.util.ClusteringUtils  - No content found in requested file for instant [==>20210120092840__replacecommit__INFLIGHT]
> [WARN ] 2021-01-20 09:28:41,162 org.apache.hudi.common.util.ClusteringUtils  - No content found in requested file for instant [==>20210120092840__replacecommit__INFLIGHT]
> =====[ 605 seconds still running ]=====
> [ERROR] 2021-01-20 09:28:50,683 org.apache.hudi.timeline.service.FileSystemViewHandler  - Got runtime exception servicing request partition=2015%2F03%2F16&basepath=%2Ftmp%2Fjunit4284626513859445824%2Fdataset&lastinstantts=20210120092836&timelinehash=f16fb2749c64768742dc0546e22bfbe7d1d96619959730dc70f1d9aacf68594c
> org.apache.hudi.exception.HoodieMetadataException: Failed to retrieve files in partition /tmp/junit4284626513859445824/dataset/2015/03/16 from metadata
> 	at org.apache.hudi.metadata.BaseTableMetadata.getAllFilesInPartition(BaseTableMetadata.java:135)
> 	at org.apache.hudi.metadata.HoodieMetadataFileSystemView.listPartition(HoodieMetadataFileSystemView.java:65)
> 	at org.apache.hudi.common.table.view.AbstractTableFileSystemView.lambda$ensurePartitionLoadedCorrectly$9(AbstractTableFileSystemView.java:280)
> 	at java.util.concurrent.ConcurrentHashMap.computeIfAbsent(ConcurrentHashMap.java:1660)
> 	at org.apache.hudi.common.table.view.AbstractTableFileSystemView.ensurePartitionLoadedCorrectly(AbstractTableFileSystemView.java:269)
> 	at org.apache.hudi.common.table.view.AbstractTableFileSystemView.getLatestFileSlices(AbstractTableFileSystemView.java:544)
> 	at org.apache.hudi.timeline.service.handlers.FileSliceHandler.getLatestFileSlices(FileSliceHandler.java:72)
> 	at org.apache.hudi.timeline.service.FileSystemViewHandler.lambda$registerFileSlicesAPI$9(FileSystemViewHandler.java:221)
> 	at org.apache.hudi.timeline.service.FileSystemViewHandler$ViewHandler.handle(FileSystemViewHandler.java:359)
> 	at io.javalin.security.SecurityUtil.noopAccessManager(SecurityUtil.kt:22)
> 	at io.javalin.Javalin.lambda$addHandler$0(Javalin.java:606)
> 	at io.javalin.core.JavalinServlet$service$2$1.invoke(JavalinServlet.kt:46)
> 	at io.javalin.core.JavalinServlet$service$2$1.invoke(JavalinServlet.kt:17)
> 	at io.javalin.core.JavalinServlet$service$1.invoke(JavalinServlet.kt:143)
> 	at io.javalin.core.JavalinServlet$service$2.invoke(JavalinServlet.kt:41)
> 	at io.javalin.core.JavalinServlet.service(JavalinServlet.kt:107)
> 	at io.javalin.core.util.JettyServerUtil$initialize$httpHandler$1.doHandle(JettyServerUtil.kt:72)
> 	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:203)
> 	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:480)
> 	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1668)
> 	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:201)
> 	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1247)
> 	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:144)
> 	at org.eclipse.jetty.server.handler.HandlerList.handle(HandlerList.java:61)
> 	at org.eclipse.jetty.server.handler.StatisticsHandler.handle(StatisticsHandler.java:174)
> 	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
> 	at org.eclipse.jetty.server.Server.handle(Server.java:502)
> 	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:370)
> 	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:267)
> 	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:305)
> 	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
> 	at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)
> 	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:333)
> 	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:310)
> 	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:168)
> 	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.produce(EatWhatYouKill.java:132)
> 	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:765)
> 	at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:683)
> 	at java.lang.Thread.run(Thread.java:748)
> Caused by: org.apache.hudi.exception.HoodieMetadataException: Metadata record for partition 2015/03/16 is inconsistent: HoodieMetadataPayload {key=2015/03/16, type=2, creations=[.199dcb05-17ad-401b-b30d-4fbba798d182-0_20210120092832.log.1_4-174-312, .1f463414-7145-44a9-b5e2-984d0ae65faf-0_20210120092820.log.1_2-174-310, .5c00188a-6254-4fc6-9692-cefaef87a937-0_20210120092832.log.1_5-174-313, .f36417a7-4cb0-4c5d-af16-6ba6a50d0979-0_20210120092832.log.1_3-174-311, 0bd48d0e-98b6-410a-a43d-d034e1435f10-0_1-187-333_20210120092840.parquet, 199dcb05-17ad-401b-b30d-4fbba798d182-0_0-120-200_20210120092832.parquet, 5c00188a-6254-4fc6-9692-cefaef87a937-0_3-120-203_20210120092832.parquet, b647ab42-2622-4584-ad9d-3078286d0e92-0_2-120-202_20210120092832.parquet, bec05629-05ef-4730-9605-0b924381e856-0_4-120-204_20210120092832.parquet, d99cce73-b251-475a-8d5e-67537c8cb9c2-0_7-120-207_20210120092832.parquet, f36417a7-4cb0-4c5d-af16-6ba6a50d0979-0_5-120-205_20210120092832.parquet], deletions=[199dcb05-17ad-401b-b30d-4fbba798d182-0_0-2-2_20210120092818.parquet, b647ab42-2622-4584-ad9d-3078286d0e92-0_2-9-20_20210120092820.parquet, bec05629-05ef-4730-9605-0b924381e856-0_0-9-18_20210120092820.parquet, d99cce73-b251-475a-8d5e-67537c8cb9c2-0_3-9-21_20210120092820.parquet, f36417a7-4cb0-4c5d-af16-6ba6a50d0979-0_1-2-3_20210120092818.parquet], }
> 	at org.apache.hudi.metadata.BaseTableMetadata.fetchAllFilesInPartition(BaseTableMetadata.java:212)
> 	at org.apache.hudi.metadata.BaseTableMetadata.getAllFilesInPartition(BaseTableMetadata.java:130)
> 	... 38 more
> [ERROR] 2021-01-20 09:28:50,734 org.apache.spark.executor.Executor  - Exception in task 0.0 in stage 279.0 (TID 428)
> org.apache.hudi.exception.HoodieRemoteException: Server Error
> 	at org.apache.hudi.common.table.view.RemoteHoodieTableFileSystemView.getLatestFileSlices(RemoteHoodieTableFileSystemView.java:285)
> 	at org.apache.hudi.table.action.rollback.RollbackUtils.generateAppendRollbackBlocksAction(RollbackUtils.java:213)
> 	at org.apache.hudi.table.action.rollback.RollbackUtils.lambda$generateRollbackRequestsUsingFileListingMOR$e97f040e$1(RollbackUtils.java:191)
> 	at org.apache.hudi.client.common.HoodieSparkEngineContext.lambda$flatMap$7d470b86$1(HoodieSparkEngineContext.java:78)
> 	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$1$1.apply(JavaRDDLike.scala:125)
> 	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$1$1.apply(JavaRDDLike.scala:125)
> 	at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:435)
> 	at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:441)
> 	at scala.collection.Iterator$class.foreach(Iterator.scala:891)
> 	at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
> 	at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
> 	at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
> 	at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
> 	at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:310)
> 	at scala.collection.AbstractIterator.to(Iterator.scala:1334)
> 	at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:302)
> 	at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1334)
> 	at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:289)
> 	at scala.collection.AbstractIterator.toArray(Iterator.scala:1334)
> 	at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$13.apply(RDD.scala:945)
> 	at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$13.apply(RDD.scala:945)
> 	at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
> 	at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
> 	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
> 	at org.apache.spark.scheduler.Task.run(Task.scala:123)
> 	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
> 	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
> 	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> 	at java.lang.Thread.run(Thread.java:748)
> Caused by: org.apache.http.client.HttpResponseException: Server Error
> 	at org.apache.http.impl.client.AbstractResponseHandler.handleResponse(AbstractResponseHandler.java:69)
> 	at org.apache.http.client.fluent.Response.handleResponse(Response.java:90)
> 	at org.apache.http.client.fluent.Response.returnContent(Response.java:97)
> 	at org.apache.hudi.common.table.view.RemoteHoodieTableFileSystemView.executeRequest(RemoteHoodieTableFileSystemView.java:179)
> 	at org.apache.hudi.common.table.view.RemoteHoodieTableFileSystemView.getLatestFileSlices(RemoteHoodieTableFileSystemView.java:281)
> 	... 30 more
> [ERROR] 2021-01-20 09:28:50,896 org.apache.spark.scheduler.TaskSetManager  - Task 0 in stage 279.0 failed 1 times; aborting job
> [WARN ] 2021-01-20 09:28:51,275 org.apache.hudi.testutils.HoodieClientTestHarness  - Closing file-system instance used in previous test-run
> [WARN ] 2021-01-20 09:28:51,784 org.apache.hudi.metadata.HoodieBackedTableMetadata  - Metadata table was not found at path /tmp/junit5755239143054851850/dataset/.hoodie/metadata
> [WARN ] 2021-01-20 09:28:51,788 org.apache.hudi.metadata.HoodieBackedTableMetadata  - Metadata table was not found at path /tmp/junit5755239143054851850/dataset/.hoodie/metadata
> [WARN ] 2021-01-20 09:28:53,031 org.apache.hudi.testutils.HoodieClientTestHarness  - Closing file-system instance used in previous test-run
> [WARN ] 2021-01-20 09:28:53,474 org.apache.hudi.metadata.HoodieBackedTableMetadata  - Metadata table was not found at path /tmp/junit6090307657771595490/dataset/.hoodie/metadata
> [WARN ] 2021-01-20 09:28:53,491 org.apache.hudi.metadata.HoodieBackedTableMetadata  - Metadata table was not found at path /tmp/junit6090307657771595490/dataset/.hoodie/metadata
> [WARN ] 2021-01-20 09:29:16,194 org.apache.hudi.testutils.HoodieClientTestHarness  - Closing file-system instance used in previous test-run
> [WARN ] 2021-01-20 09:29:16,653 org.apache.hudi.metadata.HoodieBackedTableMetadata  - Metadata table was not found at path /tmp/junit634503492776558438/dataset/.hoodie/metadata
> [WARN ] 2021-01-20 09:29:16,659 org.apache.hudi.metadata.HoodieBackedTableMetadata  - Metadata table was not found at path /tmp/junit634503492776558438/dataset/.hoodie/metadata
> [WARN ] 2021-01-20 09:29:52,708 org.apache.hudi.testutils.HoodieClientTestHarness  - Closing file-system instance used in previous test-run
> [WARN ] 2021-01-20 09:29:53,184 org.apache.hudi.metadata.HoodieBackedTableMetadata  - Metadata table was not found at path /tmp/junit231662502999062428/dataset/.hoodie/metadata
> [WARN ] 2021-01-20 09:29:53,192 org.apache.hudi.metadata.HoodieBackedTableMetadata  - Metadata table was not found at path /tmp/junit231662502999062428/dataset/.hoodie/metadata
> [WARN ] 2021-01-20 09:30:34,633 org.apache.hudi.testutils.HoodieClientTestHarness  - Closing file-system instance used in previous test-run
> [WARN ] 2021-01-20 09:30:35,252 org.apache.hudi.metadata.HoodieBackedTableMetadata  - Metadata table was not found at path /tmp/junit8359983742697170561/dataset/.hoodie/metadata
> [WARN ] 2021-01-20 09:30:35,264 org.apache.hudi.metadata.HoodieBackedTableMetadata  - Metadata table was not found at path /tmp/junit8359983742697170561/dataset/.hoodie/metadata
> [WARN ] 2021-01-20 09:30:39,301 org.apache.hudi.testutils.HoodieClientTestHarness  - Closing file-system instance used in previous test-run
> [WARN ] 2021-01-20 09:30:41,114 org.apache.hudi.testutils.HoodieClientTestHarness  - Closing file-system instance used in previous test-run
> [WARN ] 2021-01-20 09:30:41,614 org.apache.hudi.metadata.HoodieBackedTableMetadata  - Metadata table was not found at path /tmp/junit557579703100865720/dataset/.hoodie/metadata
> [WARN ] 2021-01-20 09:30:41,624 org.apache.hudi.metadata.HoodieBackedTableMetadata  - Metadata table was not found at path /tmp/junit557579703100865720/dataset/.hoodie/metadata
> [WARN ] 2021-01-20 09:30:50,657 org.apache.hudi.testutils.HoodieClientTestHarness  - Closing file-system instance used in previous test-run
> [WARN ] 2021-01-20 09:30:51,138 org.apache.hudi.metadata.HoodieBackedTableMetadata  - Metadata table was not found at path /tmp/junit3605142015784394328/dataset/.hoodie/metadata
> [WARN ] 2021-01-20 09:30:51,146 org.apache.hudi.metadata.HoodieBackedTableMetadata  - Metadata table was not found at path /tmp/junit3605142015784394328/dataset/.hoodie/metadata
> [WARN ] 2021-01-20 09:30:59,353 org.apache.hudi.testutils.HoodieClientTestHarness  - Closing file-system instance used in previous test-run
> [WARN ] 2021-01-20 09:30:59,831 org.apache.hudi.metadata.HoodieBackedTableMetadata  - Metadata table was not found at path /tmp/junit7479308570019763056/dataset/.hoodie/metadata
> [WARN ] 2021-01-20 09:30:59,842 org.apache.hudi.metadata.HoodieBackedTableMetadata  - Metadata table was not found at path /tmp/junit7479308570019763056/dataset/.hoodie/metadata
> [WARN ] 2021-01-20 09:31:02,984 org.apache.hudi.testutils.HoodieClientTestHarness  - Closing file-system instance used in previous test-run
> [WARN ] 2021-01-20 09:31:03,453 org.apache.hudi.metadata.HoodieBackedTableMetadata  - Metadata table was not found at path /tmp/junit3159262396622529530/dataset/.hoodie/metadata
> [WARN ] 2021-01-20 09:31:03,466 org.apache.hudi.metadata.HoodieBackedTableMetadata  - Metadata table was not found at path /tmp/junit3159262396622529530/dataset/.hoodie/metadata
> [WARN ] 2021-01-20 09:31:03,477 org.apache.hudi.metadata.HoodieBackedTableMetadata  - Metadata table was not found at path /tmp/junit3159262396622529530/dataset/.hoodie/metadata
> [WARN ] 2021-01-20 09:31:17,087 org.apache.hudi.testutils.HoodieClientTestHarness  - Closing file-system instance used in previous test-run
> [WARN ] 2021-01-20 09:31:17,671 org.apache.hudi.metadata.HoodieBackedTableMetadata  - Metadata table was not found at path /tmp/junit5695889147703781800/dataset/.hoodie/metadata
> [WARN ] 2021-01-20 09:31:17,683 org.apache.hudi.metadata.HoodieBackedTableMetadata  - Metadata table was not found at path /tmp/junit5695889147703781800/dataset/.hoodie/metadata
> [WARN ] 2021-01-20 09:31:59,424 org.apache.hudi.testutils.HoodieClientTestHarness  - Closing file-system instance used in previous test-run
> [ERROR] Tests run: 15, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 389.035 s <<< FAILURE! - in org.apache.hudi.metadata.TestHoodieBackedMetadata
> [ERROR] org.apache.hudi.metadata.TestHoodieBackedMetadata.testSync(HoodieTableType)[2]  Time elapsed: 32.857 s  <<< ERROR!
> org.apache.hudi.exception.HoodieRestoreException: Failed to restore to 20210120092832
> 	at org.apache.hudi.metadata.TestHoodieBackedMetadata.testSync(TestHoodieBackedMetadata.java:520)
> Caused by: org.apache.spark.SparkException: 
> Job aborted due to stage failure: Task 0 in stage 279.0 failed 1 times, most recent failure: Lost task 0.0 in stage 279.0 (TID 428, localhost, executor driver): org.apache.hudi.exception.HoodieRemoteException: Server Error
> 	at org.apache.hudi.common.table.view.RemoteHoodieTableFileSystemView.getLatestFileSlices(RemoteHoodieTableFileSystemView.java:285)
> 	at org.apache.hudi.table.action.rollback.RollbackUtils.generateAppendRollbackBlocksAction(RollbackUtils.java:213)
> 	at org.apache.hudi.table.action.rollback.RollbackUtils.lambda$generateRollbackRequestsUsingFileListingMOR$e97f040e$1(RollbackUtils.java:191)
> 	at org.apache.hudi.client.common.HoodieSparkEngineContext.lambda$flatMap$7d470b86$1(HoodieSparkEngineContext.java:78)
> 	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$1$1.apply(JavaRDDLike.scala:125)
> 	at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$1$1.apply(JavaRDDLike.scala:125)
> 	at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:435)
> 	at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:441)
> 	at scala.collection.Iterator$class.foreach(Iterator.scala:891)
> 	at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
> 	at scala.collection.generic.Growable$class.$plus$plus$eq(Growable.scala:59)
> 	at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:104)
> 	at scala.collection.mutable.ArrayBuffer.$plus$plus$eq(ArrayBuffer.scala:48)
> 	at scala.collection.TraversableOnce$class.to(TraversableOnce.scala:310)
> 	at scala.collection.AbstractIterator.to(Iterator.scala:1334)
> 	at scala.collection.TraversableOnce$class.toBuffer(TraversableOnce.scala:302)
> 	at scala.collection.AbstractIterator.toBuffer(Iterator.scala:1334)
> 	at scala.collection.TraversableOnce$class.toArray(TraversableOnce.scala:289)
> 	at scala.collection.AbstractIterator.toArray(Iterator.scala:1334)
> 	at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$13.apply(RDD.scala:945)
> 	at org.apache.spark.rdd.RDD$$anonfun$collect$1$$anonfun$13.apply(RDD.scala:945)
> 	at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
> 	at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2101)
> 	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
> 	at org.apache.spark.scheduler.Task.run(Task.scala:123)
> 	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
> 	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
> 	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> 	at java.lang.Thread.run(Thread.java:748)
> Caused by: org.apache.http.client.HttpResponseException: Server Error
> 	at org.apache.http.impl.client.AbstractResponseHandler.handleResponse(AbstractResponseHandler.java:69)
> 	at org.apache.http.client.fluent.Response.handleResponse(Response.java:90)
> 	at org.apache.http.client.fluent.Response.returnContent(Response.java:97)
> 	at org.apache.hudi.common.table.view.RemoteHoodieTableFileSystemView.executeRequest(RemoteHoodieTableFileSystemView.java:179)
> 	at org.apache.hudi.common.table.view.RemoteHoodieTableFileSystemView.getLatestFileSlices(RemoteHoodieTableFileSystemView.java:281)
> 	... 30 more
> Driver stacktrace:
> 	at org.apache.hudi.metadata.TestHoodieBackedMetadata.testSync(TestHoodieBackedMetadata.java:520)
> Caused by: org.apache.hudi.exception.HoodieRemoteException: Server Error
> Caused by: org.apache.http.client.HttpResponseException: Server Error
> {code}


