Posted to commits@hudi.apache.org by "Ashish (Jira)" <ji...@apache.org> on 2020/01/17 18:11:00 UTC

[jira] [Commented] (HUDI-545) Throw NoSuchElementException when init IncrementalRelation

    [ https://issues.apache.org/jira/browse/HUDI-545?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17018225#comment-17018225 ] 

Ashish commented on HUDI-545:
-----------------------------

Steps to reproduce the issue:

1. There are no changes to the datasets, so a few commits are empty, as shown below.

!image-2020-01-17-10-07-31-105.png!
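For reference, a quick way to confirm that some commits are empty without the screenshot above is to list the completed commit files under .hoodie. The snippet below is only a minimal sketch, assuming the table base path is in hudiDirectory and an active SparkSession named spark is available; the variable names are illustrative.

{code:scala}
// Minimal sketch (assumptions: table base path in hudiDirectory, SparkSession available as spark).
// Lists the completed commit files under .hoodie and prints their sizes; a commit whose
// metadata contains no write stats is the kind of "empty" commit described in step 1.
import org.apache.hadoop.fs.Path

val basePath = new Path(hudiDirectory)
val fs = basePath.getFileSystem(spark.sparkContext.hadoopConfiguration)
fs.listStatus(new Path(basePath, ".hoodie"))
  .filter(_.getPath.getName.endsWith(".commit"))
  .sortBy(_.getPath.getName)
  .foreach(s => println(s"${s.getPath.getName} -> ${s.getLen} bytes"))
{code}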

 

2. Now, if we try to perform an Incremental Pull, it throws "java.util.NoSuchElementException":

import org.apache.hudi.DataSourceReadOptions._

val hoodieIncrementalView = spark.read
  .format("org.apache.hudi")
  .option(VIEW_TYPE_OPT_KEY, VIEW_TYPE_INCREMENTAL_OPT_VAL)
  .option(BEGIN_INSTANTTIME_OPT_KEY, "20200108054310")
  .load(hudiDirectory)
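Judging from the stack trace in the issue description below (HashMap$ValueIterator.next called from IncrementalRelation.<init> at IncrementalRelation.scala:80), the relation appears to call next() on the value iterator of a map built from commit metadata without checking hasNext(); with an empty commit, that map has no entries. The following is a minimal, Hudi-independent sketch of that failure mode; the variable name is illustrative and is not Hudi's actual code.

{code:scala}
// Illustrative only, not Hudi source: calling next() on the value iterator of an empty
// java.util.HashMap throws java.util.NoSuchElementException, which matches the first
// two frames of the reported stack trace.
val partitionToWriteStats = new java.util.HashMap[String, String]() // empty, like an empty commit's metadata
val firstValue = partitionToWriteStats.values().iterator().next()   // throws java.util.NoSuchElementException
{code}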

> Throw NoSuchElementException when init IncrementalRelation
> ----------------------------------------------------------
>
>                 Key: HUDI-545
>                 URL: https://issues.apache.org/jira/browse/HUDI-545
>             Project: Apache Hudi (incubating)
>          Issue Type: Bug
>          Components: Incremental Pull
>            Reporter: lamber-ken
>            Priority: Major
>         Attachments: image-2020-01-17-10-03-23-268.png, image-2020-01-17-10-07-31-105.png
>
>
> If there is an empty commit in Hudi storage, then an Incremental Pull throws "java.util.NoSuchElementException".
> {code:java}
> 20/01/16 19:22:49 ERROR Client: Application diagnostics message: User class threw exception: java.util.NoSuchElementException
>     at java.util.HashMap$HashIterator.nextNode(HashMap.java:1447)
>     at java.util.HashMap$ValueIterator.next(HashMap.java:1474)
>     at org.apache.hudi.IncrementalRelation.<init>(IncrementalRelation.scala:80)
>     at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:65)
>     at org.apache.hudi.DefaultSource.createRelation(DefaultSource.scala:46)
>     at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:318)
>     at org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:223)
>     at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:211)
>     at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:178)
>     at com.amazon.finautopolarisdataplanesparkemr.emr.java.SparkHiveDataLoadCoreHudiReader.readData(SparkHiveDataLoadCoreHudiReader.java:147)
>     at com.amazon.finautopolarisdataplanesparkemr.emr.java.SparkHiveDataLoadCoreHudiReader.start(SparkHiveDataLoadCoreHudiReader.java:73)
>     at com.amazon.finautopolarisdataplanesparkemr.emr.java.SparkHiveDataLoadCoreHudiReader.main(SparkHiveDataLoadCoreHudiReader.java:36)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:684)
> Exception in thread "main" org.apache.spark.SparkException: Application application_1579116139216_0082 finished with failed status
>     at org.apache.spark.deploy.yarn.Client.run(Client.scala:1149)
>     at org.apache.spark.deploy.yarn.YarnClusterApplication.start(Client.scala:1526)
>     at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:853)
>     at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
>     at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
>     at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
>     at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:928)
>     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:937)
>     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> {code}



--
This message was sent by Atlassian Jira
(v8.3.4#803005)