Posted to github@beam.apache.org by "AnandInguva (via GitHub)" <gi...@apache.org> on 2023/02/24 16:33:49 UTC

[GitHub] [beam] AnandInguva commented on a diff in pull request #24599: Update Protobuf and GCP dependencies in Beam Python SDK

AnandInguva commented on code in PR #24599:
URL: https://github.com/apache/beam/pull/24599#discussion_r1117281654


##########
sdks/python/apache_beam/io/gcp/datastore/v1new/util.py:
##########
@@ -137,3 +137,22 @@ def report_latency(self, now, latency_ms, num_mutations):
       num_mutations: int, number of mutations contained in the RPC.
     """
     self._commit_time_per_entity_ms.add(now, latency_ms / num_mutations)
+
+
+def extract_byte_size(proto_message):
+  """
+    Gets the byte size from a google.protobuf or proto-plus message

Review Comment:
   There are some helper functions that convert a key or an entity to protobuf. Certain pieces of test code replicate this behavior using a `FakeMutation` object, which exposes a `ByteSize()` attribute that we use to calculate the byte size of messages.
   
   Replacing the `FakeMutation` class would require refactoring some of that code, so for now I am keeping this as it is, and I filed https://github.com/apache/beam/issues/25625 to tackle this not only for this library but for the other GCP dependencies as well.
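   
   For reference, here is a minimal sketch of how a helper like `extract_byte_size` can handle both message flavors, alongside the kind of `FakeMutation` test double described above. This is an illustration, not necessarily the exact Beam code; `proto.Message.pb()` is the proto-plus API for unwrapping to the underlying `google.protobuf` message.
   
   ```python
   import proto  # proto-plus
   
   
   def extract_byte_size(proto_message):
     """Gets the byte size from a google.protobuf or proto-plus message."""
     if isinstance(proto_message, proto.Message):
       # proto-plus wrapper: unwrap to the raw google.protobuf message first.
       return proto.Message.pb(proto_message).ByteSize()
     # A plain google.protobuf message exposes ByteSize() directly.
     return proto_message.ByteSize()
   
   
   class FakeMutation(object):
     """Hypothetical test double that only needs to report a byte size."""
     def __init__(self, byte_size):
       self._byte_size = byte_size
   
     def ByteSize(self):
       # Mimics the google.protobuf Message.ByteSize() interface.
       return self._byte_size
   ```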



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: github-unsubscribe@beam.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org