Posted to commits@hudi.apache.org by GitBox <gi...@apache.org> on 2019/10/25 14:55:28 UTC

[GitHub] [incubator-hudi] vinothchandar edited a comment on issue #971: Setting "hoodie.parquet.max.file.size" to a value >= 2 GiB leads to no data being generated

URL: https://github.com/apache/incubator-hudi/issues/971#issuecomment-546385234
 
 
   Okay, I was able to repro even using 2.4. It's an integer overflow somewhere in the config-passing path. What happens is that the workload profile computes a negative number of records assigned and thus skips assigning them. Still tracking down how it's happening; a bit puzzling, since at the `HoodieWriteConfig` level it's all `long`.
   
   Aha. 
   
   ```
   scala> String.valueOf(3 * 1024 * 1024 * 1024)
   res1: String = -1073741824

   scala>
   ```
   
   Can you try just doing `((Long) (3 * 1024 * 1024 * 1024L)).toString();` in Java or `(3 * 1024 * 1024 * 1024L).toString` in Scala?
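   To illustrate the overflow: `3 * 1024 * 1024 * 1024` is evaluated entirely in 32-bit `int` arithmetic, and the mathematical result (3221225472) exceeds `Integer.MAX_VALUE` (2147483647), so it wraps around to a negative value before it is ever widened to `long`. A minimal standalone sketch (not Hudi code) showing the wrap-around and the `L`-suffix fix:

   ```java
   public class OverflowDemo {
       public static void main(String[] args) {
           // All operands are int, so the multiplication overflows and wraps:
           // 3221225472 mod 2^32, reinterpreted as signed, is -1073741824.
           int overflowed = 3 * 1024 * 1024 * 1024;
           System.out.println(overflowed);  // -1073741824

           // Making one operand long promotes the whole expression to
           // 64-bit arithmetic, so no overflow occurs.
           long safe = 3L * 1024 * 1024 * 1024;
           System.out.println(safe);        // 3221225472
       }
   }
   ```

   The same wrap-around would hit any config value computed as a product of `int` literals at or above 2 GiB, which matches the threshold reported in this issue.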

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services