Posted to commits@hudi.apache.org by GitBox <gi...@apache.org> on 2022/01/24 12:37:35 UTC

[GitHub] [hudi] VIKASPATID edited a comment on issue #4635: [SUPPORT] Bulk write failing due to hudi timeline archive exception

VIKASPATID edited a comment on issue #4635:
URL: https://github.com/apache/hudi/issues/4635#issuecomment-1020055298


   We tried Hudi 0.10.0, but a bulk write with multiple concurrent writers fails with the following error:
   ```
   py4j.protocol.Py4JJavaError: An error occurred while calling o240.save.
   : org.apache.hudi.exception.HoodieRemoteException: Failed to delete marker directory s3://xxxxx/tmp/tmp/tmp/tables/deeptick/.hoodie/.temp/20220124051311124
   Read timed out
           at org.apache.hudi.table.marker.TimelineServerBasedWriteMarkers.deleteMarkerDir(TimelineServerBasedWriteMarkers.java:91)
           at org.apache.hudi.table.marker.WriteMarkers.quietDeleteMarkerDir(WriteMarkers.java:88)
           at org.apache.hudi.client.AbstractHoodieWriteClient.postCommit(AbstractHoodieWriteClient.java:450)
           at org.apache.hudi.client.AbstractHoodieWriteClient.commitStats(AbstractHoodieWriteClient.java:197)
           at org.apache.hudi.client.SparkRDDWriteClient.commit(SparkRDDWriteClient.java:124)
           at org.apache.hudi.HoodieSparkSqlWriter$.commitAndPerformPostOperations(HoodieSparkSqlWriter.scala:633)
           at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:284)
           at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:164)
           at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:46)
           at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
           at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
           at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:90)
           at org.apache.spark.sql.execution.SparkPlan.$anonfun$execute$1(SparkPlan.scala:194)
           at org.apache.spark.sql.execution.SparkPlan.$anonfun$executeQuery$1(SparkPlan.scala:232)
           at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
           at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:229)
           at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:190)
           at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:134)
           at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:133)
           at org.apache.spark.sql.DataFrameWriter.$anonfun$runCommand$1(DataFrameWriter.scala:989)
           at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:107)
           at org.apache.spark.sql.execution.SQLExecution$.withTracker(SQLExecution.scala:232)
           at org.apache.spark.sql.execution.SQLExecution$.executeQuery$1(SQLExecution.scala:110)
           at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$6(SQLExecution.scala:135)
           at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:107)
           at org.apache.spark.sql.execution.SQLExecution$.withTracker(SQLExecution.scala:232)
           at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:135)
           at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:253)
           at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:134)
           at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
           at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:68)
           at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:989)
           at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:438)
           at org.apache.spark.sql.DataFrameWriter.saveInternal(DataFrameWriter.scala:415)
           at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:301)
           at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
           at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
           at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
           at java.lang.reflect.Method.invoke(Method.java:498)
           at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
           at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
           at py4j.Gateway.invoke(Gateway.java:282)
           at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
           at py4j.commands.CallCommand.execute(CallCommand.java:79)
           at py4j.GatewayConnection.run(GatewayConnection.java:238)
           at java.lang.Thread.run(Thread.java:748)
   Caused by: java.net.SocketTimeoutException: Read timed out
           at java.net.SocketInputStream.socketRead0(Native Method)
           at java.net.SocketInputStream.socketRead(SocketInputStream.java:116)
           at java.net.SocketInputStream.read(SocketInputStream.java:171)
           at java.net.SocketInputStream.read(SocketInputStream.java:141)
           at org.apache.http.impl.io.SessionInputBufferImpl.streamRead(SessionInputBufferImpl.java:137)
           at org.apache.http.impl.io.SessionInputBufferImpl.fillBuffer(SessionInputBufferImpl.java:153)
           at org.apache.http.impl.io.SessionInputBufferImpl.readLine(SessionInputBufferImpl.java:280)
           at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:138)
           at org.apache.http.impl.conn.DefaultHttpResponseParser.parseHead(DefaultHttpResponseParser.java:56)
           at org.apache.http.impl.io.AbstractMessageParser.parse(AbstractMessageParser.java:259)
           at org.apache.http.impl.DefaultBHttpClientConnection.receiveResponseHeader(DefaultBHttpClientConnection.java:163)
           at org.apache.http.impl.conn.CPoolProxy.receiveResponseHeader(CPoolProxy.java:157)
           at org.apache.http.protocol.HttpRequestExecutor.doReceiveResponse(HttpRequestExecutor.java:273)
           at org.apache.http.protocol.HttpRequestExecutor.execute(HttpRequestExecutor.java:125)
           at org.apache.http.impl.execchain.MainClientExec.execute(MainClientExec.java:272)
           at org.apache.http.impl.execchain.ProtocolExec.execute(ProtocolExec.java:186)
           at org.apache.http.impl.execchain.RetryExec.execute(RetryExec.java:89)
           at org.apache.http.impl.execchain.RedirectExec.execute(RedirectExec.java:110)
           at org.apache.http.impl.client.InternalHttpClient.doExecute(InternalHttpClient.java:185)
           at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:83)
           at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:108)
           at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:56)
           at org.apache.http.client.fluent.Request.execute(Request.java:151)
           at org.apache.hudi.table.marker.TimelineServerBasedWriteMarkers.executeRequestToTimelineServer(TimelineServerBasedWriteMarkers.java:177)
           at org.apache.hudi.table.marker.TimelineServerBasedWriteMarkers.deleteMarkerDir(TimelineServerBasedWriteMarkers.java:88)
           ... 45 more
   ```
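
   A minimal PySpark sketch of one possible mitigation (an assumption on my part, not a confirmed fix for this issue): setting `hoodie.write.markers.type` to `DIRECT` makes the writer create and delete marker files on storage directly instead of going through the embedded timeline server, which is the round trip that timed out in the trace above. The record key, precombine field, and sample data below are hypothetical placeholders; the table path is the masked one from the trace.

   ```python
   from pyspark.sql import SparkSession

   # Assumes the Hudi Spark bundle is already on the classpath.
   spark = SparkSession.builder.appName("hudi-direct-markers").getOrCreate()

   # Placeholder data; the real job bulk-inserts the actual dataset.
   df = spark.createDataFrame([("id-1", "2022-01-24 05:13:11", 42)],
                              ["uuid", "ts", "value"])

   hudi_options = {
       "hoodie.table.name": "deeptick",
       "hoodie.datasource.write.operation": "bulk_insert",
       "hoodie.datasource.write.recordkey.field": "uuid",   # placeholder
       "hoodie.datasource.write.precombine.field": "ts",    # placeholder
       # Bypass the timeline-server-based marker handling that raised the
       # SocketTimeoutException; DIRECT markers touch storage directly.
       "hoodie.write.markers.type": "DIRECT",
   }

   (df.write.format("hudi")
      .options(**hudi_options)
      .mode("append")
      .save("s3://xxxxx/tmp/tmp/tmp/tables/deeptick"))
   ```

   The trade-off: DIRECT markers issue one storage call per marker file, which can be slow on S3 for very large writes; the timeline-server-based mode exists precisely to batch those calls, so this sidesteps the timeout rather than addressing its root cause.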

