Posted to commits@hudi.apache.org by GitBox <gi...@apache.org> on 2020/09/04 02:07:14 UTC

[GitHub] [hudi] prashanthvg89 opened a new issue #2065: [SUPPORT] Intermittent IllegalArgumentException while saving to Hudi dataset from Spark streaming job

prashanthvg89 opened a new issue #2065:
URL: https://github.com/apache/hudi/issues/2065


   
   **Describe the problem you faced**
   
   An `IllegalArgumentException` is thrown intermittently while UPSERTing to an existing Hudi dataset on S3 from a Spark streaming job; the job runs fine for about 2 days and then fails.
   
   **To Reproduce**
   
   Steps to reproduce the behavior:
   
   1. Spark streaming job with 15 minute batch interval
   2. UPSERT to an existing Hudi dataset
   3. Run for about 2 days
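
   For illustration, a minimal sketch of what such a job could look like (Spark 2.4 Structured Streaming with a `foreachBatch` sink writing through the Hudi datasource; the Kafka source, field names, table name, and S3 path are placeholders, not taken from the actual job):

```scala
import org.apache.spark.sql.{DataFrame, SaveMode, SparkSession}
import org.apache.spark.sql.streaming.Trigger

object HudiStreamingUpsert {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("hudi-streaming-upsert").getOrCreate()

    // Placeholder streaming source; the real job's source is not described in the issue.
    val input = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "events")
      .load()
    // (Parsing the Kafka payload into the columns used below is omitted.)

    // Every 15-minute micro-batch is UPSERTed into an existing Hudi dataset on S3.
    val query = input.writeStream
      .trigger(Trigger.ProcessingTime("15 minutes"))
      .foreachBatch { (batch: DataFrame, batchId: Long) =>
        batch.write
          .format("org.apache.hudi")
          .option("hoodie.table.name", "events_table")                   // placeholder
          .option("hoodie.datasource.write.operation", "upsert")
          .option("hoodie.datasource.write.recordkey.field", "id")       // placeholder
          .option("hoodie.datasource.write.precombine.field", "ts")      // placeholder
          .option("hoodie.datasource.write.partitionpath.field", "date") // placeholder
          .mode(SaveMode.Append)
          .save("s3://my-bucket/hudi/events_table")                      // placeholder
      }
      .start()

    query.awaitTermination()
  }
}
```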
   
   **Expected behavior**
   
   The error should not be seen, even intermittently. Instead, the job runs fine for about 2 days and then fails.
   
   **Environment Description**
   
   * Hudi version : 0.5.2-incubating
   
   * Spark version : 2.4.4
   
   * Hive version : 2.3.6
   
   * Hadoop version : EMR release label 5.29.0
   
   * Storage (HDFS/S3/GCS..) : S3
   
   * Running on Docker? (yes/no) : no
   
   
   **Additional context**
   
   There is not much logging here; a little more detail would have helped. However, I observed a pattern that has now occurred three times in a row: the job fails after about 2 days of continuous running.
   
   **Stacktrace**
   
   java.lang.IllegalArgumentException
           at com.google.common.base.Preconditions.checkArgument(Preconditions.java:76)
           at org.apache.hudi.common.table.timeline.HoodieActiveTimeline.transitionState(HoodieActiveTimeline.java:324)
           at org.apache.hudi.common.table.timeline.HoodieActiveTimeline.transitionCleanInflightToComplete(HoodieActiveTimeline.java:290)
           at org.apache.hudi.client.HoodieCleanClient.runClean(HoodieCleanClient.java:183)
           at org.apache.hudi.client.HoodieCleanClient.clean(HoodieCleanClient.java:98)
           at org.apache.hudi.client.HoodieWriteClient.clean(HoodieWriteClient.java:835)
           at org.apache.hudi.client.HoodieWriteClient.postCommit(HoodieWriteClient.java:512)
           at org.apache.hudi.client.AbstractHoodieWriteClient.commit(AbstractHoodieWriteClient.java:157)
           at org.apache.hudi.client.AbstractHoodieWriteClient.commit(AbstractHoodieWriteClient.java:101)
           at org.apache.hudi.client.AbstractHoodieWriteClient.commit(AbstractHoodieWriteClient.java:92)
           at org.apache.hudi.HoodieSparkSqlWriter$.checkWriteStatus(HoodieSparkSqlWriter.scala:262)
           at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:184)
           at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:91)
           at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:45)
           at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
           at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
           at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:86)
           at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
           at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
           at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:156)
           at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
           at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
           at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
           at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:83)
           at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:83)
           at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
           at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:676)
           at org.apache.spark.sql.execution.SQLExecution$$anonfun$withNewExecutionId$1.apply(SQLExecution.scala:84)
           at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:165)
           at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:74)
           at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:676)
           at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:285)
           at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:271)
           at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:229)
   
   




[GitHub] [hudi] bvaradar commented on issue #2065: [SUPPORT] Intermittent IllegalArgumentException while saving to Hudi dataset from Spark streaming job

Posted by GitBox <gi...@apache.org>.
bvaradar commented on issue #2065:
URL: https://github.com/apache/hudi/issues/2065#issuecomment-691778939


   @prashanthvg89 : Please reopen if you still run into problems.




[GitHub] [hudi] bvaradar commented on issue #2065: [SUPPORT] Intermittent IllegalArgumentException while saving to Hudi dataset from Spark streaming job

Posted by GitBox <gi...@apache.org>.
bvaradar commented on issue #2065:
URL: https://github.com/apache/hudi/issues/2065#issuecomment-687210223


   Are you running with the eventual consistency guard enabled?




[GitHub] [hudi] prashanthvg89 commented on issue #2065: [SUPPORT] Intermittent IllegalArgumentException while saving to Hudi dataset from Spark streaming job

Posted by GitBox <gi...@apache.org>.
prashanthvg89 commented on issue #2065:
URL: https://github.com/apache/hudi/issues/2065#issuecomment-690675022


   So far it's running well. Previously it would usually fail after 2 days; it has now been running for close to that long with no errors. I'll update by the end of this week.




[GitHub] [hudi] prashanthvg89 commented on issue #2065: [SUPPORT] Intermittent IllegalArgumentException while saving to Hudi dataset from Spark streaming job

Posted by GitBox <gi...@apache.org>.
prashanthvg89 commented on issue #2065:
URL: https://github.com/apache/hudi/issues/2065#issuecomment-687239989


   No




[GitHub] [hudi] bvaradar commented on issue #2065: [SUPPORT] Intermittent IllegalArgumentException while saving to Hudi dataset from Spark streaming job

Posted by GitBox <gi...@apache.org>.
bvaradar commented on issue #2065:
URL: https://github.com/apache/hudi/issues/2065#issuecomment-687250418


   Yes. Can you set this for S3 and retry from the beginning?




[GitHub] [hudi] prashanthvg89 edited a comment on issue #2065: [SUPPORT] Intermittent IllegalArgumentException while saving to Hudi dataset from Spark streaming job

Posted by GitBox <gi...@apache.org>.
prashanthvg89 edited a comment on issue #2065:
URL: https://github.com/apache/hudi/issues/2065#issuecomment-687239989


   No. Should I set this property to true?
   
   `hoodie.consistency.check.enabled`
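
   For reference, a sketch of passing that property through the Spark datasource writer (same hypothetical placeholders as in the issue description above; `batch` would be the micro-batch DataFrame):

```scala
import org.apache.spark.sql.{DataFrame, SaveMode}

// Hypothetical micro-batch writer with the S3 consistency guard enabled, so that
// Hudi waits for newly written files to become visible on S3 before proceeding.
def upsertBatch(batch: DataFrame): Unit = {
  batch.write
    .format("org.apache.hudi")
    .option("hoodie.table.name", "events_table")               // placeholder
    .option("hoodie.datasource.write.operation", "upsert")
    .option("hoodie.datasource.write.recordkey.field", "id")   // placeholder
    .option("hoodie.datasource.write.precombine.field", "ts")  // placeholder
    .option("hoodie.consistency.check.enabled", "true")        // the property in question
    .mode(SaveMode.Append)
    .save("s3://my-bucket/hudi/events_table")                  // placeholder
}
```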




[GitHub] [hudi] bvaradar closed issue #2065: [SUPPORT] Intermittent IllegalArgumentException while saving to Hudi dataset from Spark streaming job

Posted by GitBox <gi...@apache.org>.
bvaradar closed issue #2065:
URL: https://github.com/apache/hudi/issues/2065


   

