Posted to github@beam.apache.org by GitBox <gi...@apache.org> on 2022/09/02 13:04:35 UTC

[GitHub] [beam] cozos commented on issue #22986: [Bug]: WriteToBigquery Deadletter pattern does not work with FILE_LOADS method

cozos commented on issue #22986:
URL: https://github.com/apache/beam/issues/22986#issuecomment-1235481552

   Hi @brucearctor, thanks for the suggestion. On further investigation, I've found that in my case (using `FileFormat.AVRO`), the `WriteToBigQuery/BigQueryBatchFileLoads` transform fails while writing the temporary files, typically due to a schema mismatch.
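   
   A minimal sketch of the setup I'm describing (the project, dataset, table, and bucket names are placeholders):
   
   ```python
   import apache_beam as beam
   from apache_beam.io.gcp.bigquery import WriteToBigQuery
   from apache_beam.io.gcp.bigquery_tools import FileFormat

   with beam.Pipeline() as p:
       (
           p
           # The second record's `id` does not match the INTEGER field in the
           # schema below -- the kind of mismatch that trips the file writer.
           | beam.Create([{'id': 1, 'name': 'a'}, {'id': 'oops', 'name': 'b'}])
           | WriteToBigQuery(
               table='my-project:my_dataset.my_table',  # placeholder
               schema='id:INTEGER,name:STRING',
               method=WriteToBigQuery.Method.FILE_LOADS,
               temp_file_format=FileFormat.AVRO,
               custom_gcs_temp_location='gs://my-bucket/tmp',  # placeholder
           )
       )
   ```
   
   With `FileFormat.AVRO`, the bad record raises inside the step that writes the temporary Avro files, so the pipeline fails before any load job is created and the deadletter outputs are never populated.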
   
   So, as you mentioned, `max_bad_records` is a workaround for `FileFormat.JSON`, but with the Avro format the pipeline unfortunately fails before it ever reaches the `LoadJob` step, so that setting never comes into play.
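   
   For completeness, this is roughly the shape of the JSON workaround as I understand it (same placeholders as above; I'm assuming the `maxBadRecords` load-job setting can be forwarded through `additional_bq_parameters`, so treat that key as unverified):
   
   ```python
   from apache_beam.io.gcp.bigquery import WriteToBigQuery
   from apache_beam.io.gcp.bigquery_tools import FileFormat

   write = WriteToBigQuery(
       table='my-project:my_dataset.my_table',  # placeholder
       schema='id:INTEGER,name:STRING',
       method=WriteToBigQuery.Method.FILE_LOADS,
       # With JSON temp files, schema-mismatched rows survive the
       # file-writing step and only fail inside the BigQuery load job.
       temp_file_format=FileFormat.JSON,
       # Assumed pass-through to the load job configuration: tolerate up to
       # 10 bad records instead of failing the whole job.
       additional_bq_parameters={'maxBadRecords': 10},
   )
   ```
   
   With Avro, the equivalent knob can't help because the failure happens entirely upstream of the load job.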

