Posted to github@beam.apache.org by GitBox <gi...@apache.org> on 2022/09/08 13:32:26 UTC

[GitHub] [beam] damccorm opened a new pull request, #23087: Support models returning a dictionary of outputs

damccorm opened a new pull request, #23087:
URL: https://github.com/apache/beam/pull/23087

   For many models, the return value of the `forward` call is a dictionary containing the predictions along with additional metadata. This change adds support for that return type.
   
   Fixes #22240
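
   For illustration, a model of the kind this targets might look like the following toy PyTorch module (a sketch for context only, not code from this PR; the layer and output key names are made up):

   ```python
   import torch


   class TwoHeadModel(torch.nn.Module):
     """Toy module whose forward() returns a dict of named outputs."""
     def __init__(self):
       super().__init__()
       self.linear = torch.nn.Linear(2, 1)

     def forward(self, batch):
       logits = self.linear(batch)
       # Many real models return predictions plus extra metadata, keyed by name.
       return {"logits": logits, "probabilities": torch.sigmoid(logits)}
   ```

   With this change, the model handlers unpack such a dictionary into one `{key: value}` dict per example and wrap it in a `PredictionResult`.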
   
   ------------------------
   
   Thank you for your contribution! Follow this checklist to help us incorporate your contribution quickly and easily:
   
    - [ ] [**Choose reviewer(s)**](https://beam.apache.org/contribute/#make-your-change) and mention them in a comment (`R: @username`).
    - [ ] Mention the appropriate issue in your description (for example: `addresses #123`), if applicable. This will automatically add a link to the pull request in the issue. If you would like the issue to automatically close on merging the pull request, comment `fixes #<ISSUE NUMBER>` instead.
    - [ ] Update `CHANGES.md` with noteworthy changes.
    - [ ] If this contribution is large, please file an Apache [Individual Contributor License Agreement](https://www.apache.org/licenses/icla.pdf).
   
   See the [Contributor Guide](https://beam.apache.org/contribute) for more tips on [how to make the review process smoother](https://beam.apache.org/contribute/get-started-contributing/#make-the-reviewers-job-easier).
   
   To check the build health, please visit [https://github.com/apache/beam/blob/master/.test-infra/BUILD_STATUS.md](https://github.com/apache/beam/blob/master/.test-infra/BUILD_STATUS.md)
   
   GitHub Actions Tests Status (on master branch)
   ------------------------------------------------------------------------------------------------
   [![Build python source distribution and wheels](https://github.com/apache/beam/workflows/Build%20python%20source%20distribution%20and%20wheels/badge.svg?branch=master&event=schedule)](https://github.com/apache/beam/actions?query=workflow%3A%22Build+python+source+distribution+and+wheels%22+branch%3Amaster+event%3Aschedule)
   [![Python tests](https://github.com/apache/beam/workflows/Python%20tests/badge.svg?branch=master&event=schedule)](https://github.com/apache/beam/actions?query=workflow%3A%22Python+Tests%22+branch%3Amaster+event%3Aschedule)
   [![Java tests](https://github.com/apache/beam/workflows/Java%20Tests/badge.svg?branch=master&event=schedule)](https://github.com/apache/beam/actions?query=workflow%3A%22Java+Tests%22+branch%3Amaster+event%3Aschedule)
   [![Go tests](https://github.com/apache/beam/workflows/Go%20tests/badge.svg?branch=master&event=schedule)](https://github.com/apache/beam/actions?query=workflow%3A%22Go+tests%22+branch%3Amaster+event%3Aschedule)
   
   See [CI.md](https://github.com/apache/beam/blob/master/CI.md) for more information about GitHub Actions CI.
   




[GitHub] [beam] damccorm merged pull request #23087: Support models returning a dictionary of outputs

damccorm merged PR #23087:
URL: https://github.com/apache/beam/pull/23087




[GitHub] [beam] damccorm commented on pull request #23087: Support models returning a dictionary of outputs

damccorm commented on PR #23087:
URL: https://github.com/apache/beam/pull/23087#issuecomment-1241117805

   > Overall, looks good! A few comments; and can you please add a test for SklearnModelHandlerPandas?
   
   Good catch - that was an oversight on my part.
   
   Thanks for the feedback - I think I've responded to everything at this point!
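
   For context, a test along those lines might look roughly like this (a sketch only, not the test actually added in this PR; `FakeDictOutputModel` and the `feature` column are hypothetical):

   ```python
   import unittest

   import pandas as pd

   from apache_beam.ml.inference.sklearn_inference import SklearnModelHandlerPandas


   class FakeDictOutputModel:
     """Stand-in estimator whose predict() returns a dict of per-row outputs."""
     def predict(self, df):
       values = df["feature"].to_numpy()
       return {"out1": values * 2, "out2": values * 3}


   class SklearnPandasDictOutputTest(unittest.TestCase):
     def test_dict_output(self):
       handler = SklearnModelHandlerPandas(model_uri="unused")
       batch = [pd.DataFrame({"feature": [1.0]}), pd.DataFrame({"feature": [2.0]})]
       results = list(handler.run_inference(batch, FakeDictOutputModel()))
       self.assertEqual(len(results), 2)
       self.assertIn("out1", results[0].inference)
   ```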




[GitHub] [beam] damccorm commented on a diff in pull request #23087: Support models returning a dictionary of outputs

damccorm commented on code in PR #23087:
URL: https://github.com/apache/beam/pull/23087#discussion_r966325894


##########
sdks/python/apache_beam/ml/inference/pytorch_inference_test.py:
##########
@@ -63,6 +63,17 @@
              for f1, f2 in TWO_FEATURES_EXAMPLES]).reshape(-1, 1))
 ]
 
+TWO_FEATURES_DICT_OUT_PREDICTIONS = [
+    PredictionResult(ex, pred) for ex,
+    pred in zip(
+        TWO_FEATURES_EXAMPLES,
+        [{
+            "output1": torch.Tensor([f1 * 2.0 + f2 * 3 + 0.5]),
+            "output2": torch.Tensor([f1 * 2.0 + f2 * 3 + 0.5])
+        } for f1,
+         f2 in TWO_FEATURES_EXAMPLES])
+]

Review Comment:
   That's probably fine - I was aiming for greater completeness/simplicity over modularity/DRY (see the subheader [Make Your Tests Complete and Concise](https://abseil.io/resources/swe-book/html/ch12.html) for a more fleshed-out philosophy), but I agree this ends up being better. Done





[GitHub] [beam] damccorm commented on pull request #23087: Support models returning a dictionary of outputs

damccorm commented on PR #23087:
URL: https://github.com/apache/beam/pull/23087#issuecomment-1241985653

   Oh, good to know - fortunately, it looks like that test is passing with this change anyway: https://ci-beam.apache.org/job/beam_PostCommit_Python37/5697/




[GitHub] [beam] AnandInguva commented on pull request #23087: Support models returning a dictionary of outputs

AnandInguva commented on PR #23087:
URL: https://github.com/apache/beam/pull/23087#issuecomment-1241972775

   The inference IT tests are enabled only on the Python 3.7 and Python 3.9 PostCommit suites (the lowest and highest supported versions).




[GitHub] [beam] codecov[bot] commented on pull request #23087: Support models returning a dictionary of outputs

codecov[bot] commented on PR #23087:
URL: https://github.com/apache/beam/pull/23087#issuecomment-1240825051

   # [Codecov](https://codecov.io/gh/apache/beam/pull/23087?src=pr&el=h1&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=The+Apache+Software+Foundation) Report
   > Merging [#23087](https://codecov.io/gh/apache/beam/pull/23087?src=pr&el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=The+Apache+Software+Foundation) (48bbf3b) into [master](https://codecov.io/gh/apache/beam/commit/595c7e90c9b956809cb7a0254086e860ebfa7aee?el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=The+Apache+Software+Foundation) (595c7e9) will **increase** coverage by `0.00%`.
   > The diff coverage is `30.76%`.
   
   ```diff
   @@           Coverage Diff           @@
   ##           master   #23087   +/-   ##
   =======================================
     Coverage   73.58%   73.58%           
   =======================================
     Files         716      716           
     Lines       95301    95309    +8     
   =======================================
   + Hits        70127    70133    +6     
   - Misses      23878    23880    +2     
     Partials     1296     1296           
   ```
   
   | Flag | Coverage Δ | |
   |---|---|---|
   | python | `83.40% <30.76%> (-0.01%)` | :arrow_down: |
   
   Flags with carried forward coverage won't be shown. [Click here](https://docs.codecov.io/docs/carryforward-flags?utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=The+Apache+Software+Foundation#carryforward-flags-in-the-pull-request-comment) to find out more.
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/23087?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=The+Apache+Software+Foundation) | Coverage Δ | |
   |---|---|---|
   | [...am/examples/inference/pytorch\_language\_modeling.py](https://codecov.io/gh/apache/beam/pull/23087/diff?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=The+Apache+Software+Foundation#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvaW5mZXJlbmNlL3B5dG9yY2hfbGFuZ3VhZ2VfbW9kZWxpbmcucHk=) | `0.00% <0.00%> (ø)` | |
   | [...thon/apache\_beam/ml/inference/pytorch\_inference.py](https://codecov.io/gh/apache/beam/pull/23087/diff?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=The+Apache+Software+Foundation#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vbWwvaW5mZXJlbmNlL3B5dG9yY2hfaW5mZXJlbmNlLnB5) | `0.00% <0.00%> (ø)` | |
   | [...thon/apache\_beam/ml/inference/sklearn\_inference.py](https://codecov.io/gh/apache/beam/pull/23087/diff?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=The+Apache+Software+Foundation#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vbWwvaW5mZXJlbmNlL3NrbGVhcm5faW5mZXJlbmNlLnB5) | `92.53% <66.66%> (-2.55%)` | :arrow_down: |
   | [.../python/apache\_beam/testing/test\_stream\_service.py](https://codecov.io/gh/apache/beam/pull/23087/diff?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=The+Apache+Software+Foundation#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy90ZXN0X3N0cmVhbV9zZXJ2aWNlLnB5) | `88.09% <0.00%> (-4.77%)` | :arrow_down: |
   | [...che\_beam/runners/interactive/interactive\_runner.py](https://codecov.io/gh/apache/beam/pull/23087/diff?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=The+Apache+Software+Foundation#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9pbnRlcmFjdGl2ZS9pbnRlcmFjdGl2ZV9ydW5uZXIucHk=) | `90.06% <0.00%> (-1.33%)` | :arrow_down: |
   | [...hon/apache\_beam/runners/direct/test\_stream\_impl.py](https://codecov.io/gh/apache/beam/pull/23087/diff?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=The+Apache+Software+Foundation#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9kaXJlY3QvdGVzdF9zdHJlYW1faW1wbC5weQ==) | `93.28% <0.00%> (-0.75%)` | :arrow_down: |
   | [...beam/runners/portability/local\_job\_service\_main.py](https://codecov.io/gh/apache/beam/pull/23087/diff?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=The+Apache+Software+Foundation#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9sb2NhbF9qb2Jfc2VydmljZV9tYWluLnB5) | `14.43% <0.00%> (-0.42%)` | :arrow_down: |
   | [...beam/runners/portability/expansion\_service\_main.py](https://codecov.io/gh/apache/beam/pull/23087/diff?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=The+Apache+Software+Foundation#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9leHBhbnNpb25fc2VydmljZV9tYWluLnB5) | `0.00% <0.00%> (ø)` | |
   | [...ks/python/apache\_beam/runners/worker/sdk\_worker.py](https://codecov.io/gh/apache/beam/pull/23087/diff?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=The+Apache+Software+Foundation#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy93b3JrZXIvc2RrX3dvcmtlci5weQ==) | `89.09% <0.00%> (+0.15%)` | :arrow_up: |
   | [...hon/apache\_beam/runners/worker/bundle\_processor.py](https://codecov.io/gh/apache/beam/pull/23087/diff?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=The+Apache+Software+Foundation#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy93b3JrZXIvYnVuZGxlX3Byb2Nlc3Nvci5weQ==) | `93.67% <0.00%> (+0.24%)` | :arrow_up: |
   | ... and [2 more](https://codecov.io/gh/apache/beam/pull/23087/diff?src=pr&el=tree-more&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=The+Apache+Software+Foundation) | |
   
   




[GitHub] [beam] damccorm commented on pull request #23087: Support models returning a dictionary of outputs

damccorm commented on PR #23087:
URL: https://github.com/apache/beam/pull/23087#issuecomment-1240881174

   R: @yeandy 
   
   I ended up picking this one up as a good ramp-up item - let me know if you think there's a better approach here, though.




[GitHub] [beam] yeandy commented on pull request #23087: Support models returning a dictionary of outputs

yeandy commented on PR #23087:
URL: https://github.com/apache/beam/pull/23087#issuecomment-1241997074

   Sorry about misleading you with the 3.8 run. I always get confused about which tests are enabled for which suites.




[GitHub] [beam] github-actions[bot] commented on pull request #23087: Support models returning a dictionary of outputs

github-actions[bot] commented on PR #23087:
URL: https://github.com/apache/beam/pull/23087#issuecomment-1240928989

   Stopping reviewer notifications for this pull request: review requested by someone other than the bot, ceding control




[GitHub] [beam] yeandy commented on pull request #23087: Support models returning a dictionary of outputs

yeandy commented on PR #23087:
URL: https://github.com/apache/beam/pull/23087#issuecomment-1241026741

   Let's also run postcommit to ensure the `test_torch_run_inference_bert_for_masked_lm` test passes.




[GitHub] [beam] yeandy commented on a diff in pull request #23087: Support models returning a dictionary of outputs

yeandy commented on code in PR #23087:
URL: https://github.com/apache/beam/pull/23087#discussion_r966233619


##########
sdks/python/apache_beam/ml/inference/pytorch_inference.py:
##########
@@ -281,6 +293,18 @@ def run_inference(
         batched_tensors = _convert_to_device(batched_tensors, self._device)
         key_to_batched_tensors[key] = batched_tensors
       predictions = model(**key_to_batched_tensors, **inference_args)
+      if isinstance(predictions, dict):
+        # Go from one dictionary of type: {key_type1: Iterable<val_type1>,
+        # key_type2: Iterable<val_type2>, ...} where each Iterable is of
+        # length batch_size, to a list of dictionaries:
+        # [{key_type1: value_type1, key_type2: value_type2}]
+        predictions_per_tensor = [
+            dict(zip(predictions, v)) for v in zip(*predictions.values())

Review Comment:
   nit: To be more explicit that we want the keys here
   ```suggestion
               dict(zip(predictions.keys(), v)) for v in zip(*predictions.values())
   ```
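
   For anyone skimming, the expression transposes a dict of batch-length iterables into one dict per batch element; a standalone illustration:

   ```python
   predictions = {
       "output1": [0.1, 0.2, 0.3],  # one value per element in the batch
       "output2": [1.0, 2.0, 3.0],
   }
   per_element = [
       dict(zip(predictions.keys(), v)) for v in zip(*predictions.values())
   ]
   # per_element == [{'output1': 0.1, 'output2': 1.0},
   #                 {'output1': 0.2, 'output2': 2.0},
   #                 {'output1': 0.3, 'output2': 3.0}]
   ```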



##########
sdks/python/apache_beam/ml/inference/pytorch_inference.py:
##########
@@ -164,6 +164,18 @@ def run_inference(
       batched_tensors = torch.stack(batch)
       batched_tensors = _convert_to_device(batched_tensors, self._device)
       predictions = model(batched_tensors, **inference_args)
+      if isinstance(predictions, dict):
+        # Go from one dictionary of type: {key_type1: Iterable<val_type1>,
+        # key_type2: Iterable<val_type2>, ...} where each Iterable is of
+        # length batch_size, to a list of dictionaries:
+        # [{key_type1: value_type1, key_type2: value_type2}]
+        predictions_per_tensor = [
+            dict(zip(predictions, v)) for v in zip(*predictions.values())
+        ]
+        return [
+            PredictionResult(x, y) for x,
+            y in zip(batch, predictions_per_tensor)
+        ]

Review Comment:
   Would it make sense to refactor the two occurrences of this into a helper?
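
   For reference, one possible shape for such a helper (a sketch, assuming only the predictions need reshaping; the name `_convert_to_result` is illustrative, not necessarily what the PR ends up using):

   ```python
   from typing import Any, Dict, Iterable, Union

   from apache_beam.ml.inference.base import PredictionResult


   def _convert_to_result(
       batch: Iterable, predictions: Union[Iterable, Dict[Any, Iterable]]
   ) -> Iterable[PredictionResult]:
     if isinstance(predictions, dict):
       # Transpose {key: batch-length iterable} into one {key: value} dict per
       # batch element so each example is paired with its own outputs.
       predictions = [
           dict(zip(predictions.keys(), v)) for v in zip(*predictions.values())
       ]
     return [PredictionResult(x, y) for x, y in zip(batch, predictions)]
   ```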



##########
sdks/python/apache_beam/ml/inference/sklearn_inference.py:
##########
@@ -183,6 +195,18 @@ def run_inference(
     splits = [
         vectorized_batch.iloc[[i]] for i in range(vectorized_batch.shape[0])
     ]
+    if isinstance(predictions, dict):
+      # Go from one dictionary of type: {key_type1: Iterable<value_type1>,
+      # key_type2: Iterable<value_type2>, ...} where each Iterable is of
+      # length batch_size, to a list of dictionaries:
+      # [{key_type1: value_type1, key_type2: value_type2}]
+      predictions_per_split = [
+          dict(zip(predictions, v)) for v in zip(*predictions.values())
+      ]
+      return [
+          PredictionResult(example, inference) for example,
+          inference in zip(splits, predictions_per_split)
+      ]

Review Comment:
   As with the PyTorch file, could we refactor this into a helper? Though it looks a bit more difficult since we're now working with different types (numpy vs. pandas), so I don't think we have to. It may make sense to keep the logic explicit here for clarity. I'll leave this up to you.



##########
sdks/python/apache_beam/ml/inference/pytorch_inference_test.py:
##########
@@ -63,6 +63,17 @@
              for f1, f2 in TWO_FEATURES_EXAMPLES]).reshape(-1, 1))
 ]
 
+TWO_FEATURES_DICT_OUT_PREDICTIONS = [
+    PredictionResult(ex, pred) for ex,
+    pred in zip(
+        TWO_FEATURES_EXAMPLES,
+        [{
+            "output1": torch.Tensor([f1 * 2.0 + f2 * 3 + 0.5]),
+            "output2": torch.Tensor([f1 * 2.0 + f2 * 3 + 0.5])
+        } for f1,
+         f2 in TWO_FEATURES_EXAMPLES])
+]

Review Comment:
   Could we somehow use the existing `TWO_FEATURES_PREDICTIONS` to construct `TWO_FEATURES_DICT_OUT_PREDICTIONS`, so that if the former's computation changes, the latter will follow?
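
   One way to do that (a sketch; this assumes the dict output is just the existing single-tensor prediction repeated under two keys, as in the diff above):

   ```python
   TWO_FEATURES_DICT_OUT_PREDICTIONS = [
       PredictionResult(
           p.example, {
               "output1": p.inference, "output2": p.inference
           }) for p in TWO_FEATURES_PREDICTIONS
   ]
   ```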





[GitHub] [beam] damccorm commented on a diff in pull request #23087: Support models returning a dictionary of outputs

damccorm commented on code in PR #23087:
URL: https://github.com/apache/beam/pull/23087#discussion_r966326215


##########
sdks/python/apache_beam/ml/inference/sklearn_inference.py:
##########
@@ -183,6 +195,18 @@ def run_inference(
     splits = [
         vectorized_batch.iloc[[i]] for i in range(vectorized_batch.shape[0])
     ]
+    if isinstance(predictions, dict):
+      # Go from one dictionary of type: {key_type1: Iterable<value_type1>,
+      # key_type2: Iterable<value_type2>, ...} where each Iterable is of
+      # length batch_size, to a list of dictionaries:
+      # [{key_type1: value_type1, key_type2: value_type2}]
+      predictions_per_split = [
+          dict(zip(predictions, v)) for v in zip(*predictions.values())
+      ]
+      return [
+          PredictionResult(example, inference) for example,
+          inference in zip(splits, predictions_per_split)
+      ]

Review Comment:
   I think it can be refactored cleanly - done





[GitHub] [beam] damccorm commented on pull request #23087: Support models returning a dictionary of outputs

damccorm commented on PR #23087:
URL: https://github.com/apache/beam/pull/23087#issuecomment-1241118013

   Run Python 3.8 PostCommit




[GitHub] [beam] yeandy commented on pull request #23087: Support models returning a dictionary of outputs

yeandy commented on PR #23087:
URL: https://github.com/apache/beam/pull/23087#issuecomment-1241028113

   Run Python 3.8 PostCommit


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: github-unsubscribe@beam.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org