Posted to github@beam.apache.org by GitBox <gi...@apache.org> on 2022/06/03 18:33:43 UTC

[GitHub] [beam] kennknowles opened a new issue, #18421: BigQueryIO write is slow / fails with a bounded source

kennknowles opened a new issue, #18421:
URL: https://github.com/apache/beam/issues/18421

   BigQueryIO's writer is slow, or fails outright, if the input source is bounded.
   
   EDIT: Input (a BigQuery table): 294 GB, 741,896,827 events.
   
   If the input source is bounded (GCS, a BQ select, ...), the BigQueryIO writer uses "[Method.FILE_LOADS](https://github.com/apache/beam/blob/master/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java#L1168)" instead of streaming inserts.
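   
   For context, a minimal sketch of the failing setup. The project, bucket, table names, schema, and the trivial line-to-TableRow conversion below are illustrative assumptions, not details from this report; the point is only that with a bounded input and no explicit method override, BigQueryIO takes the FILE_LOADS path described above.
   
   ```java
   import com.google.api.services.bigquery.model.TableFieldSchema;
   import com.google.api.services.bigquery.model.TableRow;
   import com.google.api.services.bigquery.model.TableSchema;
   import java.util.Collections;
   import org.apache.beam.sdk.Pipeline;
   import org.apache.beam.sdk.io.TextIO;
   import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
   import org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder;
   import org.apache.beam.sdk.transforms.MapElements;
   import org.apache.beam.sdk.values.TypeDescriptor;
   
   public class BoundedToBigQuery {
     public static void main(String[] args) {
       Pipeline p = Pipeline.create();
   
       p.apply("ReadFromGCS", TextIO.read().from("gs://my-bucket/events/*")) // bounded source
           .apply("ToTableRow",
               MapElements.into(TypeDescriptor.of(TableRow.class))
                   .via(line -> new TableRow().set("raw", line)))
           .setCoder(TableRowJsonCoder.of())
           .apply("WriteToBQ",
               BigQueryIO.writeTableRows()
                   .to("my-project:my_dataset.events")
                   .withSchema(new TableSchema().setFields(Collections.singletonList(
                       new TableFieldSchema().setName("raw").setType("STRING")))));
                   // No withMethod(...) override: with a bounded input,
                   // BigQueryIO defaults to Method.FILE_LOADS.
   
       p.run().waitUntilFinish();
     }
   }
   ```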
   
   Large amounts of input data result in a java.lang.OutOfMemoryError (Java heap space); here, roughly 500 million rows.
   
   
   (Attached image from the original Jira: PrepareWrite.BatchLoads.png)
   
   We cannot force "Method.STREAMING_INSERTS" or control the batch sizes, since
   [withMaxFilesPerBundle](https://github.com/apache/beam/blob/master/sdks/java/io/google-cloud-platform/src/main/java/org/apache/beam/sdk/io/gcp/bigquery/BigQueryIO.java#L1131) is private :(
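   
   For completeness: more recent Beam releases expose a public withMethod(...) hook on BigQueryIO.Write, which would let callers force streaming inserts even for a bounded input. A hedged sketch, assuming such a release; the table name and the schema variable are placeholders:
   
   ```java
   // Assumes a Beam release where BigQueryIO.Write.withMethod(...) is public.
   // Forces streaming inserts despite the bounded input, sidestepping the
   // FILE_LOADS path that runs out of heap above.
   BigQueryIO.writeTableRows()
       .to("my-project:my_dataset.events")                     // placeholder table
       .withMethod(BigQueryIO.Write.Method.STREAMING_INSERTS)
       .withSchema(schema);                                    // schema defined elsewhere
   ```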
   
   Someone reported a similar problem with GCS -> BQ on Stack Overflow:
   [Why is writing to BigQuery from a Dataflow/Beam pipeline slow?](https://stackoverflow.com/questions/45889992/why-is-writing-to-bigquery-from-a-dataflow-beam-pipeline-slow#comment78954153_45889992)
   
   
   
   Imported from Jira [BEAM-2840](https://issues.apache.org/jira/browse/BEAM-2840). Original Jira may contain additional context.
   Reported by: vspiewak.

