Posted to github@beam.apache.org by "BjornPrime (via GitHub)" <gi...@apache.org> on 2023/05/15 20:09:39 UTC

[GitHub] [beam] BjornPrime commented on a diff in pull request #25965: [DO NOT APPROVE; testing purposes only] Replace storage v1 client with GCS client

BjornPrime commented on code in PR #25965:
URL: https://github.com/apache/beam/pull/25965#discussion_r1194313340


##########
sdks/python/apache_beam/io/gcp/gcsio.py:
##########
@@ -296,160 +234,87 @@ def delete_batch(self, paths):
     """
     if not paths:
       return []
-
-    paths = iter(paths)
+    if len(paths) > MAX_BATCH_OPERATION_SIZE:
+      raise TooManyRequests("Batch larger than %s", MAX_BATCH_OPERATION_SIZE)
     result_statuses = []
-    while True:
-      paths_chunk = list(islice(paths, MAX_BATCH_OPERATION_SIZE))

Review Comment:
   The lines I added above should ensure that an error is raised if more than 100 paths are requested. I'm pretty sure the previous logic was just truncating the request. If truncating is actually the behavior we want, I can change it back, but I think raising an error is more accurate.
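
   For callers that still need to delete more than MAX_BATCH_OPERATION_SIZE paths, a minimal sketch of how the splitting could be done on the caller's side under the new guard (not part of this PR; `gcs`, `delete_in_batches`, and the local `MAX_BATCH_OPERATION_SIZE` constant are hypothetical names used only for illustration):

   ```python
   from itertools import islice

   # Assumed value for this sketch; the real constant is defined in gcsio.py.
   MAX_BATCH_OPERATION_SIZE = 100

   def delete_in_batches(gcs, all_paths):
     # Split the incoming paths into chunks no larger than the batch limit so
     # that each delete_batch call stays within MAX_BATCH_OPERATION_SIZE and
     # never triggers the TooManyRequests guard added above.
     path_iter = iter(all_paths)
     results = []
     while True:
       chunk = list(islice(path_iter, MAX_BATCH_OPERATION_SIZE))
       if not chunk:
         break
       results.extend(gcs.delete_batch(chunk))
     return results
   ```

   This mirrors the chunking the old islice loop did inside delete_batch, just moved to the caller, so the method itself can reject oversized batches explicitly.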



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: github-unsubscribe@beam.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org