Posted to github@beam.apache.org by GitBox <gi...@apache.org> on 2020/08/21 19:54:53 UTC

[GitHub] [beam] pabloem commented on a change in pull request #12663: [BEAM-10597] Propagate BigQuery streaming insert throttled time to Dataflow worker in Python SDK

pabloem commented on a change in pull request #12663:
URL: https://github.com/apache/beam/pull/12663#discussion_r474913739



##########
File path: sdks/python/apache_beam/io/gcp/bigquery.py
##########
@@ -1257,6 +1260,7 @@ def _flush_batch(self, destination):
         _LOGGER.info(
             'Sleeping %s seconds before retrying insertion.', retry_backoff)
         time.sleep(retry_backoff)
+        self._throttled_secs.inc(retry_backoff)

Review comment:
       I wonder if we should make this a `Metrics.distribution`-type metric. @rezarokni, do you have an opinion on this? Since the metric may be exported to monitoring systems, a counter would only show a 'sometimes increasing' chart, whereas a distribution would also convey the rate of increase.
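
       To illustrate the trade-off being discussed, here is a minimal sketch (plain Python, not Beam's actual metric implementation) contrasting the two metric types: a counter only accumulates a running sum, while a distribution also retains count/min/max, which lets a monitoring backend derive a rate of increase and a spread from it.

```python
# Hypothetical sketch of the semantics of the two Beam metric types
# under discussion; class names and fields here are illustrative, not
# Beam's actual implementation.

class Counter:
    """Accumulates only a running total."""
    def __init__(self):
        self.value = 0

    def inc(self, n=1):
        self.value += n


class Distribution:
    """Tracks sum, count, min, and max of reported values."""
    def __init__(self):
        self.sum = 0
        self.count = 0
        self.min = None
        self.max = None

    def update(self, v):
        self.sum += v
        self.count += 1
        self.min = v if self.min is None else min(self.min, v)
        self.max = v if self.max is None else max(self.max, v)


# Recording three retry backoffs of 1s, 2s, and 4s:
c = Counter()
d = Distribution()
for backoff in (1, 2, 4):
    c.inc(backoff)     # counter: only the total survives
    d.update(backoff)  # distribution: total, count, min, max survive

print(c.value)                       # 7
print(d.sum, d.count, d.min, d.max)  # 7 3 1 4
```

       With a counter, downstream systems see only the cumulative 7 seconds of throttling; with a distribution, they also see that it came from 3 events between 1 and 4 seconds each.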




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org