Posted to commits@hudi.apache.org by GitBox <gi...@apache.org> on 2022/06/10 14:29:20 UTC

[GitHub] [hudi] Reimus commented on issue #5358: [SUPPORT] read hudi cow table with spark, throw exception: File does not exist

Reimus commented on issue #5358:
URL: https://github.com/apache/hudi/issues/5358#issuecomment-1152423045

   I am having the same issue after an ingestion job crashed.
   
   This does not seem to be a case of a file simply being deleted by the cleaner, since the error persists even when no other tasks are writing to the table.
   
   Unfortunately, I don't know how to reproduce the error.
   
   Timeline of events:
   
   1. An upsert operation crashed (all operations were inline) with the error: Caused by: java.io.FileNotFoundException: File does not exist: 
   2. Some reads succeed; others, if they touch the affected partitions, fail with the same error as above.
   
   
       Hudi version : 0.11.0
   
       Spark version : 3.1.2
   
       Hadoop version : 3.0.0
   
       Storage (HDFS/S3/GCS..) : HDFS
   
       Running on Docker? (yes/no) : no
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@hudi.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org