Posted to github@beam.apache.org by GitBox <gi...@apache.org> on 2021/06/07 18:11:36 UTC

[GitHub] [beam] pabloem commented on a change in pull request #14690: [BEAM-12445] Moving streaming inserts with new client

pabloem commented on a change in pull request #14690:
URL: https://github.com/apache/beam/pull/14690#discussion_r646830329



##########
File path: sdks/python/apache_beam/io/gcp/bigquery_test.py
##########
@@ -809,15 +808,14 @@ def test_dofn_client_process_performs_batching(self):
     fn.process(('project_id:dataset_id.table_id', {'month': 1}))
 
     # InsertRows not called as batch size is not hit yet
-    self.assertFalse(client.tabledata.InsertAll.called)
+    self.assertFalse(client.insert_rows_json.called)

Review comment:
       no. The API formats everything into JSON. Other API calls use automatic JSON conversions.
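The assertion change above reflects the move from the old `client.tabledata.InsertAll` mock to the new google-cloud-bigquery `Client.insert_rows_json` method. A minimal sketch of the same mocking pattern, using a hypothetical `Batcher` class as a stand-in for the DoFn under test (the class name, table string, and batch size are illustrative, not Beam's actual implementation):

```python
from unittest import mock


# Hypothetical stand-in for the DoFn under test: buffers rows and
# only calls insert_rows_json once the batch size is reached.
class Batcher:
  def __init__(self, client, batch_size):
    self.client = client
    self.batch_size = batch_size
    self.rows = []

  def process(self, row):
    self.rows.append(row)
    if len(self.rows) >= self.batch_size:
      self.client.insert_rows_json('project_id:dataset_id.table_id',
                                   self.rows)
      self.rows = []


client = mock.Mock()
batcher = Batcher(client, batch_size=2)

batcher.process({'month': 1})
# Insert RPC not called yet, as the batch size is not hit.
assert not client.insert_rows_json.called

batcher.process({'month': 2})
# Batch size reached: the insert fires exactly once.
assert client.insert_rows_json.call_count == 1
```

As the comment notes, `insert_rows_json` accepts plain JSON-compatible dicts directly, so the test no longer needs the row-to-protobuf conversion the old client required.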

##########
File path: sdks/python/apache_beam/io/gcp/bigquery.py
##########
@@ -1312,10 +1312,11 @@ def _flush_batch(self, destination):
           skip_invalid_rows=True)
       self.batch_latency_metric.update((time.time() - start) * 1000)
 
-      failed_rows = [rows[entry.index] for entry in errors]
+      failed_rows = [rows[entry['index']] for entry in errors]

Review comment:
       that's right.
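The change from `entry.index` to `entry['index']` follows from the new client's return type: `Client.insert_rows_json` returns a list of plain dicts (one per failed row, keyed by `'index'` and `'errors'`), where the old `tabledata.InsertAll` response exposed attributes. A small sketch, with an illustrative `errors` payload:

```python
# Rows submitted in one batch.
rows = [{'month': 1}, {'month': 2}, {'month': 3}]

# Illustrative shape of what insert_rows_json returns when some
# rows fail: a list of dicts, not objects with attributes.
errors = [
    {'index': 1,
     'errors': [{'reason': 'invalid', 'message': 'no such field'}]},
]

# Dict lookup replaces the old attribute access (entry.index).
failed_rows = [rows[entry['index']] for entry in errors]
assert failed_rows == [{'month': 2}]
```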




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org