Posted to github@beam.apache.org by GitBox <gi...@apache.org> on 2020/08/24 23:22:37 UTC

[GitHub] [beam] rezarokni commented on a change in pull request #12663: [BEAM-10597] Propagate BigQuery streaming insert throttled time to Dataflow worker in Python SDK

rezarokni commented on a change in pull request #12663:
URL: https://github.com/apache/beam/pull/12663#discussion_r475953736



##########
File path: sdks/python/apache_beam/io/gcp/bigquery.py
##########
@@ -1257,6 +1260,7 @@ def _flush_batch(self, destination):
         _LOGGER.info(
             'Sleeping %s seconds before retrying insertion.', retry_backoff)
         time.sleep(retry_backoff)
+        self._throttled_secs.inc(retry_backoff)

Review comment:
       @pabloem Sorry, missed this one...
   
   Yes, agree with using Metrics for this type of value. An ever-increasing number would just chart as a linear value, without the ability to usefully capture the changes.
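The pattern under review, recording each retry backoff as throttled time so the runner can account for it, can be sketched as below. Note that `ThrottleCounter` and `flush_with_retries` are hypothetical stand-ins for illustration only; the real code uses Beam's Metrics counter and `_flush_batch` in `bigquery.py`.

```python
import time

class ThrottleCounter:
    """Hypothetical stand-in for a Beam Metrics counter (illustration only)."""
    def __init__(self):
        self.total = 0

    def inc(self, n=1):
        self.total += n

def flush_with_retries(insert, payload, throttled_secs, backoffs, sleep=time.sleep):
    """Attempt an insert; on failure, sleep for the next backoff and
    record the slept seconds as throttled time (the change in this PR)."""
    for retry_backoff in backoffs:
        if insert(payload):
            return True
        sleep(retry_backoff)              # wait before retrying the insertion
        throttled_secs.inc(retry_backoff) # propagate throttled seconds to the runner
    return insert(payload)                # final attempt after backoffs exhausted
```

With backoffs of 1 s and 2 s before a successful third attempt, the counter accumulates 3 throttled seconds, which is exactly the kind of monotonically increasing value that a Metrics counter (rather than a raw charted number) captures usefully.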




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org