Posted to commits@beam.apache.org by GitBox <gi...@apache.org> on 2020/02/27 23:24:24 UTC

[GitHub] [beam] pabloem commented on a change in pull request #10979: [BEAM-8841] Support writing data to BigQuery via Avro in Python SDK
URL: https://github.com/apache/beam/pull/10979#discussion_r384769744
 
 

 ##########
 File path: sdks/python/apache_beam/io/gcp/bigquery.py
 ##########
 @@ -1361,87 +1369,18 @@ def __init__(
     self.triggering_frequency = triggering_frequency
     self.insert_retry_strategy = insert_retry_strategy
     self._validate = validate
+    self._temp_file_format = temp_file_format or bigquery_tools.FileFormat.JSON
 
 Review comment:
   I'm happy to make AVRO the default format if possible. I guess the issue is that users need to provide the schema, right? Otherwise we cannot write the Avro files.
   
   We could make AVRO the default and add a check that a schema was provided (i.e. it is neither None nor autodetect), erroring out if it wasn't. What do you think?
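
   A minimal sketch of the check discussed above, assuming AVRO becomes the default temp file format. The function name `resolve_temp_file_format`, the stand-in `FileFormat` class, and the `SCHEMA_AUTODETECT` sentinel are illustrative placeholders, not the actual Beam API:

   ```python
   # Hypothetical sketch: default the temp file format to AVRO, but
   # reject AVRO when no concrete schema is available, since Avro files
   # cannot be written without one.

   class FileFormat:  # stand-in for bigquery_tools.FileFormat
       AVRO = 'AVRO'
       JSON = 'JSON'

   SCHEMA_AUTODETECT = 'SCHEMA_AUTODETECT'  # placeholder sentinel

   def resolve_temp_file_format(temp_file_format, schema):
       """Return the temp file format, defaulting to AVRO.

       Raises ValueError if AVRO is selected (explicitly or by default)
       but the schema is missing or set to autodetect.
       """
       fmt = temp_file_format or FileFormat.AVRO
       if fmt == FileFormat.AVRO and (schema is None
                                      or schema == SCHEMA_AUTODETECT):
           raise ValueError(
               'A schema must be provided when writing temp files as AVRO; '
               'schema autodetection is only supported with JSON.')
       return fmt
   ```

   With this shape, users who pass a schema silently get AVRO, while users relying on autodetection get an actionable error instead of a failure deep inside the file writer.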
