Posted to github@beam.apache.org by GitBox <gi...@apache.org> on 2021/02/12 20:00:14 UTC

[GitHub] [beam] pabloem opened a new pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

pabloem opened a new pull request #13985:
URL: https://github.com/apache/beam/pull/13985


   **Please** add a meaningful description for your change here
   
   ------------------------
   
   Thank you for your contribution! Follow this checklist to help us incorporate your contribution quickly and easily:
   
    - [ ] [**Choose reviewer(s)**](https://beam.apache.org/contribute/#make-your-change) and mention them in a comment (`R: @username`).
    - [ ] Format the pull request title like `[BEAM-XXX] Fixes bug in ApproximateQuantiles`, where you replace `BEAM-XXX` with the appropriate JIRA issue, if applicable. This will automatically link the pull request to the issue.
    - [ ] Update `CHANGES.md` with noteworthy changes.
    - [ ] If this contribution is large, please file an Apache [Individual Contributor License Agreement](https://www.apache.org/licenses/icla.pdf).
   
   See the [Contributor Guide](https://beam.apache.org/contribute) for more tips on [how to make review process smoother](https://beam.apache.org/contribute/#make-reviewers-job-easier).
   
   Post-Commit Tests Status (on master branch)
   ------------------------------------------------------------------------------------------------
   
   Lang | SDK | Dataflow | Flink | Samza | Spark | Twister2
   --- | --- | --- | --- | --- | --- | ---
   Go | [![Build Status](https://ci-beam.apache.org/job/beam_PostCommit_Go/lastCompletedBuild/badge/icon)](https://ci-beam.apache.org/job/beam_PostCommit_Go/lastCompletedBuild/) | --- | [![Build Status](https://ci-beam.apache.org/job/beam_PostCommit_Go_VR_Flink/lastCompletedBuild/badge/icon)](https://ci-beam.apache.org/job/beam_PostCommit_Go_VR_Flink/lastCompletedBuild/) | --- | [![Build Status](https://ci-beam.apache.org/job/beam_PostCommit_Go_VR_Spark/lastCompletedBuild/badge/icon)](https://ci-beam.apache.org/job/beam_PostCommit_Go_VR_Spark/lastCompletedBuild/) | ---
   Java | [![Build Status](https://ci-beam.apache.org/job/beam_PostCommit_Java/lastCompletedBuild/badge/icon)](https://ci-beam.apache.org/job/beam_PostCommit_Java/lastCompletedBuild/) | [![Build Status](https://ci-beam.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Dataflow/lastCompletedBuild/badge/icon)](https://ci-beam.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Dataflow/lastCompletedBuild/)<br>[![Build Status](https://ci-beam.apache.org/job/beam_PostCommit_Java_VR_Dataflow_V2/lastCompletedBuild/badge/icon)](https://ci-beam.apache.org/job/beam_PostCommit_Java_VR_Dataflow_V2/lastCompletedBuild/)<br>[![Build Status](https://ci-beam.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Dataflow_Java11/lastCompletedBuild/badge/icon)](https://ci-beam.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Dataflow_Java11/lastCompletedBuild/) | [![Build Status](https://ci-beam.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Flink/lastCompletedBuild/badge/icon)](https://ci-beam.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Flink/lastCompletedBuild/)<br>[![Build Status](https://ci-beam.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Flink_Java11/lastCompletedBuild/badge/icon)](https://ci-beam.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Flink_Java11/lastCompletedBuild/)<br>[![Build Status](https://ci-beam.apache.org/job/beam_PostCommit_Java_PVR_Flink_Batch/lastCompletedBuild/badge/icon)](https://ci-beam.apache.org/job/beam_PostCommit_Java_PVR_Flink_Batch/lastCompletedBuild/)<br>[![Build Status](https://ci-beam.apache.org/job/beam_PostCommit_Java_PVR_Flink_Streaming/lastCompletedBuild/badge/icon)](https://ci-beam.apache.org/job/beam_PostCommit_Java_PVR_Flink_Streaming/lastCompletedBuild/) | [![Build Status](https://ci-beam.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Samza/lastCompletedBuild/badge/icon)](https://ci-beam.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Samza/lastCompletedBuild/) | [![Build Status](https://ci-beam.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark/lastCompletedBuild/badge/icon)](https://ci-beam.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Spark/lastCompletedBuild/)<br>[![Build Status](https://ci-beam.apache.org/job/beam_PostCommit_Java_PVR_Spark_Batch/lastCompletedBuild/badge/icon)](https://ci-beam.apache.org/job/beam_PostCommit_Java_PVR_Spark_Batch/lastCompletedBuild/)<br>[![Build Status](https://ci-beam.apache.org/job/beam_PostCommit_Java_ValidatesRunner_SparkStructuredStreaming/lastCompletedBuild/badge/icon)](https://ci-beam.apache.org/job/beam_PostCommit_Java_ValidatesRunner_SparkStructuredStreaming/lastCompletedBuild/) | [![Build Status](https://ci-beam.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Twister2/lastCompletedBuild/badge/icon)](https://ci-beam.apache.org/job/beam_PostCommit_Java_ValidatesRunner_Twister2/lastCompletedBuild/)
   Python | [![Build Status](https://ci-beam.apache.org/job/beam_PostCommit_Python36/lastCompletedBuild/badge/icon)](https://ci-beam.apache.org/job/beam_PostCommit_Python36/lastCompletedBuild/)<br>[![Build Status](https://ci-beam.apache.org/job/beam_PostCommit_Python37/lastCompletedBuild/badge/icon)](https://ci-beam.apache.org/job/beam_PostCommit_Python37/lastCompletedBuild/)<br>[![Build Status](https://ci-beam.apache.org/job/beam_PostCommit_Python38/lastCompletedBuild/badge/icon)](https://ci-beam.apache.org/job/beam_PostCommit_Python38/lastCompletedBuild/) | [![Build Status](https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/lastCompletedBuild/badge/icon)](https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow/lastCompletedBuild/)<br>[![Build Status](https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/lastCompletedBuild/badge/icon)](https://ci-beam.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/lastCompletedBuild/)<br>[![Build Status](https://ci-beam.apache.org/job/beam_PostCommit_Py_ValCont/lastCompletedBuild/badge/icon)](https://ci-beam.apache.org/job/beam_PostCommit_Py_ValCont/lastCompletedBuild/) | [![Build Status](https://ci-beam.apache.org/job/beam_PreCommit_Python_PVR_Flink_Cron/lastCompletedBuild/badge/icon)](https://ci-beam.apache.org/job/beam_PreCommit_Python_PVR_Flink_Cron/lastCompletedBuild/)<br>[![Build Status](https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Flink/lastCompletedBuild/badge/icon)](https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Flink/lastCompletedBuild/) | --- | [![Build Status](https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/lastCompletedBuild/badge/icon)](https://ci-beam.apache.org/job/beam_PostCommit_Python_VR_Spark/lastCompletedBuild/) | ---
   XLang | [![Build Status](https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/lastCompletedBuild/badge/icon)](https://ci-beam.apache.org/job/beam_PostCommit_XVR_Direct/lastCompletedBuild/) | [![Build Status](https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/lastCompletedBuild/badge/icon)](https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/lastCompletedBuild/) | [![Build Status](https://ci-beam.apache.org/job/beam_PostCommit_XVR_Flink/lastCompletedBuild/badge/icon)](https://ci-beam.apache.org/job/beam_PostCommit_XVR_Flink/lastCompletedBuild/) | --- | [![Build Status](https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/lastCompletedBuild/badge/icon)](https://ci-beam.apache.org/job/beam_PostCommit_XVR_Spark/lastCompletedBuild/) | ---
   
   Pre-Commit Tests Status (on master branch)
   ------------------------------------------------------------------------------------------------
   
   --- | Java | Python | Go | Website | Whitespace | Typescript
   --- | --- | --- | --- | --- | --- | ---
   Non-portable | [![Build Status](https://ci-beam.apache.org/job/beam_PreCommit_Java_Cron/lastCompletedBuild/badge/icon)](https://ci-beam.apache.org/job/beam_PreCommit_Java_Cron/lastCompletedBuild/) | [![Build Status](https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/lastCompletedBuild/badge/icon)](https://ci-beam.apache.org/job/beam_PreCommit_Python_Cron/lastCompletedBuild/)<br>[![Build Status](https://ci-beam.apache.org/job/beam_PreCommit_PythonLint_Cron/lastCompletedBuild/badge/icon)](https://ci-beam.apache.org/job/beam_PreCommit_PythonLint_Cron/lastCompletedBuild/)<br>[![Build Status](https://ci-beam.apache.org/job/beam_PreCommit_PythonDocker_Cron/badge/icon)](https://ci-beam.apache.org/job/beam_PreCommit_PythonDocker_Cron/lastCompletedBuild/) <br>[![Build Status](https://ci-beam.apache.org/job/beam_PreCommit_PythonDocs_Cron/badge/icon)](https://ci-beam.apache.org/job/beam_PreCommit_PythonDocs_Cron/lastCompletedBuild/) | [![Build Status](https://ci-beam.apache.org/job/beam_PreCommit_Go_Cron/lastCompletedBuild/badge/icon)](https://ci-beam.apache.org/job/beam_PreCommit_Go_Cron/lastCompletedBuild/) | [![Build Status](https://ci-beam.apache.org/job/beam_PreCommit_Website_Cron/lastCompletedBuild/badge/icon)](https://ci-beam.apache.org/job/beam_PreCommit_Website_Cron/lastCompletedBuild/) | [![Build Status](https://ci-beam.apache.org/job/beam_PreCommit_Whitespace_Cron/lastCompletedBuild/badge/icon)](https://ci-beam.apache.org/job/beam_PreCommit_Whitespace_Cron/lastCompletedBuild/) | [![Build Status](https://ci-beam.apache.org/job/beam_PreCommit_Typescript_Cron/lastCompletedBuild/badge/icon)](https://ci-beam.apache.org/job/beam_PreCommit_Typescript_Cron/lastCompletedBuild/)
   Portable | --- | [![Build Status](https://ci-beam.apache.org/job/beam_PreCommit_Portable_Python_Cron/lastCompletedBuild/badge/icon)](https://ci-beam.apache.org/job/beam_PreCommit_Portable_Python_Cron/lastCompletedBuild/) | --- | --- | --- | ---
   
   See [.test-infra/jenkins/README](https://github.com/apache/beam/blob/master/.test-infra/jenkins/README.md) for trigger phrase, status and link of all Jenkins jobs.
   
   
   GitHub Actions Tests Status (on master branch)
   ------------------------------------------------------------------------------------------------
   [![Build python source distribution and wheels](https://github.com/apache/beam/workflows/Build%20python%20source%20distribution%20and%20wheels/badge.svg?branch=master&event=schedule)](https://github.com/apache/beam/actions?query=workflow%3A%22Build+python+source+distribution+and+wheels%22+branch%3Amaster+event%3Aschedule)
   [![Python tests](https://github.com/apache/beam/workflows/Python%20tests/badge.svg?branch=master&event=schedule)](https://github.com/apache/beam/actions?query=workflow%3A%22Python+Tests%22+branch%3Amaster+event%3Aschedule)
   [![Java tests](https://github.com/apache/beam/workflows/Java%20Tests/badge.svg?branch=master&event=schedule)](https://github.com/apache/beam/actions?query=workflow%3A%22Java+Tests%22+branch%3Amaster+event%3Aschedule)
   
   See [CI.md](https://github.com/apache/beam/blob/master/CI.md) for more information about GitHub Actions CI.
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (cc6607c) into [master](https://codecov.io/gh/apache/beam/commit/4c81134dd9206fd6cb9060a9b973fc111b8fbd64?el=desc) (4c81134) will **increase** coverage by `0.26%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   82.77%   83.03%   +0.26%     
   ==========================================
     Files         466      470       +4     
     Lines       57562    58565    +1003     
   ==========================================
   + Hits        47647    48631     +984     
   - Misses       9915     9934      +19     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [...dks/python/apache\_beam/runners/pipeline\_context.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9waXBlbGluZV9jb250ZXh0LnB5) | | |
   | [...dks/python/apache\_beam/io/gcp/gce\_metadata\_util.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZ2NwL2djZV9tZXRhZGF0YV91dGlsLnB5) | | |
   | [...ks/python/apache\_beam/internal/metrics/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW50ZXJuYWwvbWV0cmljcy9fX2luaXRfXy5weQ==) | | |
   | [...s/python/apache\_beam/testing/pipeline\_verifiers.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9waXBlbGluZV92ZXJpZmllcnMucHk=) | | |
   | [sdks/python/apache\_beam/io/parquetio.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vcGFycXVldGlvLnB5) | | |
   | [sdks/python/apache\_beam/examples/wordcount.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvd29yZGNvdW50LnB5) | | |
   | [...python/apache\_beam/examples/complete/distribopt.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvY29tcGxldGUvZGlzdHJpYm9wdC5weQ==) | | |
   | [...ache\_beam/examples/cookbook/datastore\_wordcount.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvY29va2Jvb2svZGF0YXN0b3JlX3dvcmRjb3VudC5weQ==) | | |
   | [sdks/python/apache\_beam/io/external/kafka.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZXh0ZXJuYWwva2Fma2EucHk=) | | |
   | [...\_beam/testing/benchmarks/nexmark/queries/query5.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9iZW5jaG1hcmtzL25leG1hcmsvcXVlcmllcy9xdWVyeTUucHk=) | | |
   | ... and [923 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [7c4a21a...2452188](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (b874eed) into [master](https://codecov.io/gh/apache/beam/commit/4c81134dd9206fd6cb9060a9b973fc111b8fbd64?el=desc) (4c81134) will **increase** coverage by `0.22%`.
   > The diff coverage is `90.28%`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   82.77%   83.00%   +0.22%     
   ==========================================
     Files         466      470       +4     
     Lines       57562    58463     +901     
   ==========================================
   + Hits        47647    48526     +879     
   - Misses       9915     9937      +22     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [sdks/python/apache\_beam/transforms/trigger.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdHJhbnNmb3Jtcy90cmlnZ2VyLnB5) | `88.34% <75.00%> (-0.02%)` | :arrow_down: |
   | [...nners/portability/fn\_api\_runner/trigger\_manager.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9mbl9hcGlfcnVubmVyL3RyaWdnZXJfbWFuYWdlci5weQ==) | `90.58% <90.58%> (ø)` | |
   | [...dks/python/apache\_beam/options/pipeline\_options.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vb3B0aW9ucy9waXBlbGluZV9vcHRpb25zLnB5) | `94.60% <100.00%> (ø)` | |
   | [sdks/python/apache\_beam/utils/interactive\_utils.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdXRpbHMvaW50ZXJhY3RpdmVfdXRpbHMucHk=) | `88.09% <0.00%> (-7.15%)` | :arrow_down: |
   | [...\_beam/runners/portability/sdk\_container\_builder.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9zZGtfY29udGFpbmVyX2J1aWxkZXIucHk=) | `40.00% <0.00%> (-3.28%)` | :arrow_down: |
   | [sdks/python/apache\_beam/dataframe/frames.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZGF0YWZyYW1lL2ZyYW1lcy5weQ==) | `90.80% <0.00%> (-1.18%)` | :arrow_down: |
   | [sdks/python/apache\_beam/internal/metrics/metric.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW50ZXJuYWwvbWV0cmljcy9tZXRyaWMucHk=) | `86.45% <0.00%> (-1.05%)` | :arrow_down: |
   | [...ache\_beam/runners/portability/local\_job\_service.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9sb2NhbF9qb2Jfc2VydmljZS5weQ==) | `80.53% <0.00%> (-0.64%)` | :arrow_down: |
   | [...nners/portability/fn\_api\_runner/worker\_handlers.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9mbl9hcGlfcnVubmVyL3dvcmtlcl9oYW5kbGVycy5weQ==) | `79.61% <0.00%> (-0.39%)` | :arrow_down: |
   | [...ks/python/apache\_beam/runners/worker/sdk\_worker.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy93b3JrZXIvc2RrX3dvcmtlci5weQ==) | `89.69% <0.00%> (-0.16%)` | :arrow_down: |
   | ... and [37 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [7c4a21a...b874eed](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (cc6607c) into [master](https://codecov.io/gh/apache/beam/commit/4c81134dd9206fd6cb9060a9b973fc111b8fbd64?el=desc) (4c81134) will **increase** coverage by `0.26%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   82.77%   83.03%   +0.26%     
   ==========================================
     Files         466      470       +4     
     Lines       57562    58565    +1003     
   ==========================================
   + Hits        47647    48631     +984     
   - Misses       9915     9934      +19     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [...he\_beam/io/flink/flink\_streaming\_impulse\_source.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZmxpbmsvZmxpbmtfc3RyZWFtaW5nX2ltcHVsc2Vfc291cmNlLnB5) | | |
   | [sdks/python/apache\_beam/pipeline.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcGlwZWxpbmUucHk=) | | |
   | [sdks/python/apache\_beam/coders/row\_coder.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vY29kZXJzL3Jvd19jb2Rlci5weQ==) | | |
   | [sdks/python/apache\_beam/io/gcp/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZ2NwL19faW5pdF9fLnB5) | | |
   | [...ache\_beam/portability/api/beam\_artifact\_api\_pb2.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcG9ydGFiaWxpdHkvYXBpL2JlYW1fYXJ0aWZhY3RfYXBpX3BiMi5weQ==) | | |
   | [sdks/python/apache\_beam/runners/sdf\_utils.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9zZGZfdXRpbHMucHk=) | | |
   | [...ransforms/transforms\_keyword\_only\_args\_test\_py3.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdHJhbnNmb3Jtcy90cmFuc2Zvcm1zX2tleXdvcmRfb25seV9hcmdzX3Rlc3RfcHkzLnB5) | | |
   | [...es/snippets/transforms/aggregation/cogroupbykey.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvc25pcHBldHMvdHJhbnNmb3Jtcy9hZ2dyZWdhdGlvbi9jb2dyb3VwYnlrZXkucHk=) | | |
   | [...n/apache\_beam/examples/complete/game/user\_score.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvY29tcGxldGUvZ2FtZS91c2VyX3Njb3JlLnB5) | | |
   | [sdks/python/apache\_beam/coders/typecoders.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vY29kZXJzL3R5cGVjb2RlcnMucHk=) | | |
   | ... and [923 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [7c4a21a...cc6607c](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (b874eed) into [master](https://codecov.io/gh/apache/beam/commit/4c81134dd9206fd6cb9060a9b973fc111b8fbd64?el=desc) (4c81134) will **increase** coverage by `0.22%`.
   > The diff coverage is `90.28%`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   82.77%   83.00%   +0.22%     
   ==========================================
     Files         466      470       +4     
     Lines       57562    58463     +901     
   ==========================================
   + Hits        47647    48526     +879     
   - Misses       9915     9937      +22     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [sdks/python/apache\_beam/transforms/trigger.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdHJhbnNmb3Jtcy90cmlnZ2VyLnB5) | `88.34% <75.00%> (-0.02%)` | :arrow_down: |
   | [...nners/portability/fn\_api\_runner/trigger\_manager.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9mbl9hcGlfcnVubmVyL3RyaWdnZXJfbWFuYWdlci5weQ==) | `90.58% <90.58%> (ø)` | |
   | [...dks/python/apache\_beam/options/pipeline\_options.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vb3B0aW9ucy9waXBlbGluZV9vcHRpb25zLnB5) | `94.60% <100.00%> (ø)` | |
   | [sdks/python/apache\_beam/utils/interactive\_utils.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdXRpbHMvaW50ZXJhY3RpdmVfdXRpbHMucHk=) | `88.09% <0.00%> (-7.15%)` | :arrow_down: |
   | [...\_beam/runners/portability/sdk\_container\_builder.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9zZGtfY29udGFpbmVyX2J1aWxkZXIucHk=) | `40.00% <0.00%> (-3.28%)` | :arrow_down: |
   | [sdks/python/apache\_beam/dataframe/frames.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZGF0YWZyYW1lL2ZyYW1lcy5weQ==) | `90.80% <0.00%> (-1.18%)` | :arrow_down: |
   | [sdks/python/apache\_beam/internal/metrics/metric.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW50ZXJuYWwvbWV0cmljcy9tZXRyaWMucHk=) | `86.45% <0.00%> (-1.05%)` | :arrow_down: |
   | [...ache\_beam/runners/portability/local\_job\_service.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9sb2NhbF9qb2Jfc2VydmljZS5weQ==) | `80.53% <0.00%> (-0.64%)` | :arrow_down: |
   | [...nners/portability/fn\_api\_runner/worker\_handlers.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9mbl9hcGlfcnVubmVyL3dvcmtlcl9oYW5kbGVycy5weQ==) | `79.61% <0.00%> (-0.39%)` | :arrow_down: |
   | [...ks/python/apache\_beam/runners/worker/sdk\_worker.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy93b3JrZXIvc2RrX3dvcmtlci5weQ==) | `89.69% <0.00%> (-0.16%)` | :arrow_down: |
   | ... and [37 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [7c4a21a...568db95](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] pabloem commented on a change in pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
pabloem commented on a change in pull request #13985:
URL: https://github.com/apache/beam/pull/13985#discussion_r599146609



##########
File path: sdks/python/apache_beam/runners/portability/fn_api_runner/trigger_manager.py
##########
@@ -0,0 +1,459 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import collections
+import logging
+import typing
+from collections import defaultdict
+
+from apache_beam import typehints
+from apache_beam.coders import PickleCoder
+from apache_beam.coders import StrUtf8Coder
+from apache_beam.coders import TupleCoder
+from apache_beam.coders import VarIntCoder
+from apache_beam.coders.coders import IntervalWindowCoder
+from apache_beam.transforms import DoFn
+from apache_beam.transforms.core import Windowing
+from apache_beam.transforms.timeutil import TimeDomain
+from apache_beam.transforms.trigger import AccumulationMode
+from apache_beam.transforms.trigger import TriggerContext
+from apache_beam.transforms.trigger import _CombiningValueStateTag
+from apache_beam.transforms.trigger import _StateTag
+from apache_beam.transforms.userstate import AccumulatingRuntimeState
+from apache_beam.transforms.userstate import BagRuntimeState
+from apache_beam.transforms.userstate import BagStateSpec
+from apache_beam.transforms.userstate import CombiningValueStateSpec
+from apache_beam.transforms.userstate import RuntimeTimer
+from apache_beam.transforms.userstate import SetRuntimeState
+from apache_beam.transforms.userstate import SetStateSpec
+from apache_beam.transforms.userstate import TimerSpec
+from apache_beam.transforms.userstate import on_timer
+from apache_beam.transforms.window import BoundedWindow
+from apache_beam.transforms.window import GlobalWindow
+from apache_beam.transforms.window import Sessions
+from apache_beam.transforms.window import TimestampedValue
+from apache_beam.transforms.window import WindowFn
+from apache_beam.typehints import TypeCheckError
+from apache_beam.utils import windowed_value
+from apache_beam.utils.timestamp import MIN_TIMESTAMP
+from apache_beam.utils.timestamp import Timestamp
+from apache_beam.utils.windowed_value import WindowedValue
+
+_LOGGER = logging.getLogger(__name__)
+_LOGGER.setLevel(logging.DEBUG)
+
+K = typing.TypeVar('K')
+V = typing.TypeVar('V')
+
+
+class ReifyWindows(DoFn):
+  """Receives KV pairs, and wraps the values into WindowedValues."""
+  def process(
+      self, element, window=DoFn.WindowParam, timestamp=DoFn.TimestampParam):
+    try:
+      k, v = element
+    except TypeError:
+      raise TypeCheckError(
+          'Input to GroupByKey must be a PCollection with '
+          'elements compatible with KV[A, B]')
+
+    yield (k, windowed_value.WindowedValue(v, timestamp, [window]))
+
+
+class _GroupBundlesByKey(DoFn):
+  def start_bundle(self):
+    self.keys = defaultdict(list)
+
+  def process(self, element):
+    key, windowed_value = element
+    self.keys[key].append(windowed_value)
+
+  def finish_bundle(self):
+    for k, vals in self.keys.items():
+      yield windowed_value.WindowedValue((k, vals),
+                                         MIN_TIMESTAMP, [GlobalWindow()])
+
+
+def read_watermark(watermark_state):
+  try:
+    return watermark_state.read()
+  except ValueError:
+    watermark_state.add(MIN_TIMESTAMP)
+    return watermark_state.read()
+
+
+class TriggerMergeContext(WindowFn.MergeContext):
+  def __init__(
+      self, all_windows, context: 'FnRunnerStatefulTriggerContext', windowing):
+    super(TriggerMergeContext, self).__init__(all_windows)
+    self.trigger_context = context
+    self.windowing = windowing
+    self.merged_away: typing.Dict[BoundedWindow, BoundedWindow] = {}
+
+  def merge(self, to_be_merged, merge_result):
+    _LOGGER.debug("Merging %s into %s", to_be_merged, merge_result)
+    self.trigger_context.merge_state(to_be_merged, merge_result)
+    for window in to_be_merged:
+      if window != merge_result:
+        self.merged_away[window] = merge_result
+        # Clear state associated with PaneInfo since it is
+        # not preserved across merges.
+        self.trigger_context.for_window(window).clear_state(None)
+    self.windowing.triggerfn.on_merge(
+        to_be_merged,
+        merge_result,
+        self.trigger_context.for_window(merge_result))
+
+
+@typehints.with_input_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+@typehints.with_output_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+class GeneralTriggerManagerDoFn(DoFn):
+  """A trigger manager that supports all windowing / triggering cases.
+
+  This implements a DoFn that manages triggering on a per-key basis. All
+  elements for a single key are processed together. Per-key state holds data
+  related to all windows.
+  """
+
+  KNOWN_WINDOWS = SetStateSpec('known_windows', IntervalWindowCoder())
+  FINISHED_WINDOWS = SetStateSpec('finished_windows', IntervalWindowCoder())
+  LAST_KNOWN_TIME = CombiningValueStateSpec('last_known_time', combine_fn=max)
+  LAST_KNOWN_WATERMARK = CombiningValueStateSpec(
+      'last_known_watermark', combine_fn=max)
+
+  # TODO(pabloem) What's the coder for the elements/keys here?
+  WINDOW_ELEMENT_PAIRS = BagStateSpec(
+      'element_bag', TupleCoder([IntervalWindowCoder(), PickleCoder()]))
+  WINDOW_TAG_VALUES = BagStateSpec(
+      'per_window_per_tag_value_state',
+      TupleCoder([IntervalWindowCoder(), StrUtf8Coder(), VarIntCoder()]))
+
+  PROCESSING_TIME_TIMER = TimerSpec(
+      'processing_time_timer', TimeDomain.REAL_TIME)
+  WATERMARK_TIMER = TimerSpec('watermark_timer', TimeDomain.WATERMARK)
+
+  def __init__(self, windowing: Windowing):
+    self.windowing = windowing
+    # Only session windows are merging. Other windows are non-merging.
+    self.merging_windows = isinstance(windowing.windowfn, Sessions)
+
+  def process(
+      self,
+      element: typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]],
+      element_bag: BagRuntimeState = DoFn.StateParam(WINDOW_ELEMENT_PAIRS),  # type: ignore
+      time_state: AccumulatingRuntimeState = DoFn.StateParam(LAST_KNOWN_TIME),  # type: ignore
+      watermark_state: AccumulatingRuntimeState = DoFn.StateParam(  # type: ignore
+          LAST_KNOWN_WATERMARK),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      windows_state: SetRuntimeState = DoFn.StateParam(KNOWN_WINDOWS),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER),
+      *args, **kwargs):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        current_time_state=time_state,
+        watermark_state=watermark_state,
+        elements_bag_state=element_bag,
+        window_tag_values_bag_state=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    key, windowed_values = element
+    watermark = read_watermark(watermark_state)
+
+    windows_to_elements = collections.defaultdict(list)
+    for wv in windowed_values:
+      for window in wv.windows:
+        # ignore expired windows
+        if watermark > window.end + self.windowing.allowed_lateness:
+          continue
+        if window in finished_windows_state.read():
+          continue
+        windows_to_elements[window].append(
+            TimestampedValue(wv.value, wv.timestamp))
+
+    # Processing merging of windows
+    if self.merging_windows:
+      old_windows = set(windows_state.read())
+      all_windows = old_windows.union(list(windows_to_elements))
+      if all_windows != old_windows:
+        merge_context = TriggerMergeContext(
+            all_windows, context, self.windowing)
+        self.windowing.windowfn.merge(merge_context)
+
+        merged_windows_to_elements = collections.defaultdict(list)
+        for window, values in windows_to_elements.items():
+          while window in merge_context.merged_away:
+            window = merge_context.merged_away[window]
+          merged_windows_to_elements[window].extend(values)
+        windows_to_elements = merged_windows_to_elements
+
+      if old_windows != all_windows:
+        for w in windows_to_elements:
+          windows_state.add(w)
+    # Done processing merging of windows
+
+    seen_windows = set()
+    for w in windows_to_elements:
+      window_context = context.for_window(w)
+      seen_windows.add(w)
+      for value_w_timestamp in windows_to_elements[w]:
+        _LOGGER.debug(value_w_timestamp)
+        element_bag.add((w, value_w_timestamp))
+        self.windowing.triggerfn.on_element(windowed_values, w, window_context)
+
+    return self._trigger_fire(
+        key, TimeDomain.WATERMARK, watermark, None, context, seen_windows)
+
+  def _trigger_fire(
+      self,
+      key: K,
+      time_domain,
+      timestamp: Timestamp,
+      timer_tag: typing.Optional[str],
+      context: 'FnRunnerStatefulTriggerContext',
+      windows_of_interest: typing.Optional[typing.Set[BoundedWindow]] = None):
+    windows_to_elements = context.windows_to_elements_map()
+    context.elements_bag_state.clear()
+
+    fired_windows = set()
+    finished_windows = set()
+    _LOGGER.debug(
+        '%s - tag %s - timestamp %s', time_domain, timer_tag, timestamp)
+    for w, elems in windows_to_elements.items():
+      if windows_of_interest is not None and w not in windows_of_interest:
+        # windows_of_interest=None means that we care about all windows.
+        # If we care only about some windows, and this window is not one of
+        # them, then we do not intend to fire this window.
+        continue
+      window_context = context.for_window(w)
+      if self.windowing.triggerfn.should_fire(time_domain,
+                                              timestamp,
+                                              w,
+                                              window_context):
+        finished = self.windowing.triggerfn.on_fire(
+            timestamp, w, window_context)
+        _LOGGER.debug('Firing on window %s. Finished: %s', w, finished)
+        fired_windows.add(w)
+        if finished:
+          context.finished_windows_state.add(w)
+          finished_windows.add(w)
+        # TODO(pabloem): Format the output: e.g. pane info
+        elems = [WindowedValue(e.value, e.timestamp, (w, )) for e in elems]
+        yield (key, elems)
+
+    # Add elements that were not fired back into state.
+    for w, elems in windows_to_elements.items():
+      for e in elems:
+        if (w in finished_windows or
+            (w in fired_windows and
+             self.windowing.accumulation_mode == AccumulationMode.DISCARDING)):
+          continue
+        context.elements_bag_state.add((w, e))
+
+  @on_timer(PROCESSING_TIME_TIMER)
+  def processing_time_trigger(
+      self,
+      key=DoFn.KeyParam,
+      timer_tag=DoFn.DynamicTimerTagParam,
+      timestamp=DoFn.TimestampParam,
+      processing_time_state=DoFn.StateParam(LAST_KNOWN_TIME),
+      element_bag=DoFn.StateParam(WINDOW_ELEMENT_PAIRS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER)):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        current_time_state=processing_time_state,
+        watermark_state=None,
+        elements_bag_state=element_bag,
+        window_tag_values_bag_state=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    result = self._trigger_fire(
+        key, TimeDomain.REAL_TIME, timestamp, timer_tag, context)
+    processing_time_state.add(timestamp)
+    return result
+
+  @on_timer(WATERMARK_TIMER)
+  def watermark_trigger(
+      self,
+      key=DoFn.KeyParam,
+      timer_tag=DoFn.DynamicTimerTagParam,
+      timestamp=DoFn.TimestampParam,
+      watermark_state=DoFn.StateParam(LAST_KNOWN_WATERMARK),
+      element_bag=DoFn.StateParam(WINDOW_ELEMENT_PAIRS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER)):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        current_time_state=None,
+        watermark_state=watermark_state,
+        elements_bag_state=element_bag,
+        window_tag_values_bag_state=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    result = self._trigger_fire(
+        key, TimeDomain.WATERMARK, timestamp, timer_tag, context)
+    watermark_state.add(timestamp)
+    return result
+
+
+class FnRunnerStatefulTriggerContext(TriggerContext):
+  def __init__(
+      self,
+      processing_time_timer: RuntimeTimer,
+      watermark_timer: RuntimeTimer,
+      current_time_state: typing.Optional[AccumulatingRuntimeState],
+      watermark_state: typing.Optional[AccumulatingRuntimeState],
+      elements_bag_state: BagRuntimeState,
+      window_tag_values_bag_state: BagRuntimeState,
+      finished_windows_state: SetRuntimeState):
+    self.timers = {
+        TimeDomain.REAL_TIME: processing_time_timer,
+        TimeDomain.WATERMARK: watermark_timer
+    }
+    self.current_times = {
+        TimeDomain.REAL_TIME: current_time_state,
+        TimeDomain.WATERMARK: watermark_state
+    }
+    self.elements_bag_state = elements_bag_state
+    self.window_tag_values_bag_state = window_tag_values_bag_state
+    self.finished_windows_state = finished_windows_state
+
+  def windows_to_elements_map(
+      self
+  ) -> typing.Dict[BoundedWindow, typing.List[windowed_value.WindowedValue]]:
+    window_element_pairs: typing.Iterable[typing.Tuple[
+        BoundedWindow,
+        windowed_value.WindowedValue]] = self.elements_bag_state.read()
+    result: typing.Dict[BoundedWindow,
+                        typing.List[windowed_value.WindowedValue]] = {}
+    for w, e in window_element_pairs:
+      if w not in result:
+        result[w] = []
+      result[w].append(e)
+    return result
+
+  def for_window(self, window):
+    return PerWindowTriggerContext(window, self)
+
+  def get_current_time(self):
+    return self.current_times[TimeDomain.REAL_TIME].read()
+
+  def set_timer(self, name, time_domain, timestamp):
+    _LOGGER.debug('Setting timer (%s, %s) at %s', time_domain, name, timestamp)
+    self.timers[time_domain].set(timestamp, dynamic_timer_tag=name)
+
+  def clear_timer(self, name, time_domain):
+    _LOGGER.debug('Clearing timer (%s, %s)', time_domain, name)
+    self.timers[time_domain].clear(dynamic_timer_tag=name)
+
+  def merge_state(self, to_be_merged, merge_result):
+    all_triplets = self.window_tag_values_bag_state.read()
+    # Collect all the triplets for the window we are merging away, and tag them
+    # with the new window (merge_result).
+    merging_away_triplets = [(merge_result, t[1], t[2]) for t in all_triplets
+                             if t[0] in to_be_merged]
+
+    # Collect all of the other triplets, and join them with the newly-tagged
+    # set of triplets.
+    resulting_triplets = [
+        t for t in all_triplets if t[0] not in to_be_merged

Review comment:
       it is not reiterable. I've fixed that. Thanks!
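
   For readers less familiar with Beam's user-state API, the sketch below shows the bare state/timer machinery that `GeneralTriggerManagerDoFn` in this diff builds on: a `BagStateSpec` buffers values per key, and a watermark `TimerSpec` flushes them later. This is an illustration written against public Beam APIs only; it is not code from this PR, and the `BufferUntilTimer` name, the `VarIntCoder` value coder, and the 10-second flush delay are made up for the example.

   ```python
   import apache_beam as beam
   from apache_beam.coders import VarIntCoder
   from apache_beam.transforms.timeutil import TimeDomain
   from apache_beam.transforms.userstate import BagStateSpec, TimerSpec, on_timer


   class BufferUntilTimer(beam.DoFn):
     """Buffers integer values per key and emits them when a watermark timer fires."""
     BUFFER = BagStateSpec('buffer', VarIntCoder())
     FLUSH = TimerSpec('flush', TimeDomain.WATERMARK)

     def process(self,
                 element,                               # must be a (key, value) pair
                 timestamp=beam.DoFn.TimestampParam,
                 buffer=beam.DoFn.StateParam(BUFFER),
                 flush=beam.DoFn.TimerParam(FLUSH)):
       _, value = element
       buffer.add(value)            # append to the per-key bag state
       flush.set(timestamp + 10)    # fire once the watermark passes timestamp + 10s

     @on_timer(FLUSH)
     def on_flush(self,
                  key=beam.DoFn.KeyParam,
                  buffer=beam.DoFn.StateParam(BUFFER)):
       # As the review comment above notes for the triplet bag, a bag read is not
       # reiterable, so materialize it before using it more than once.
       values = list(buffer.read())
       buffer.clear()
       yield (key, values)
   ```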

##########
File path: sdks/python/apache_beam/runners/portability/fn_api_runner/trigger_manager.py
##########
@@ -0,0 +1,459 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import collections
+import logging
+import typing
+from collections import defaultdict
+
+from apache_beam import typehints
+from apache_beam.coders import PickleCoder
+from apache_beam.coders import StrUtf8Coder
+from apache_beam.coders import TupleCoder
+from apache_beam.coders import VarIntCoder
+from apache_beam.coders.coders import IntervalWindowCoder
+from apache_beam.transforms import DoFn
+from apache_beam.transforms.core import Windowing
+from apache_beam.transforms.timeutil import TimeDomain
+from apache_beam.transforms.trigger import AccumulationMode
+from apache_beam.transforms.trigger import TriggerContext
+from apache_beam.transforms.trigger import _CombiningValueStateTag
+from apache_beam.transforms.trigger import _StateTag
+from apache_beam.transforms.userstate import AccumulatingRuntimeState
+from apache_beam.transforms.userstate import BagRuntimeState
+from apache_beam.transforms.userstate import BagStateSpec
+from apache_beam.transforms.userstate import CombiningValueStateSpec
+from apache_beam.transforms.userstate import RuntimeTimer
+from apache_beam.transforms.userstate import SetRuntimeState
+from apache_beam.transforms.userstate import SetStateSpec
+from apache_beam.transforms.userstate import TimerSpec
+from apache_beam.transforms.userstate import on_timer
+from apache_beam.transforms.window import BoundedWindow
+from apache_beam.transforms.window import GlobalWindow
+from apache_beam.transforms.window import Sessions
+from apache_beam.transforms.window import TimestampedValue
+from apache_beam.transforms.window import WindowFn
+from apache_beam.typehints import TypeCheckError
+from apache_beam.utils import windowed_value
+from apache_beam.utils.timestamp import MIN_TIMESTAMP
+from apache_beam.utils.timestamp import Timestamp
+from apache_beam.utils.windowed_value import WindowedValue
+
+_LOGGER = logging.getLogger(__name__)
+_LOGGER.setLevel(logging.DEBUG)
+
+K = typing.TypeVar('K')
+V = typing.TypeVar('V')
+
+
+class ReifyWindows(DoFn):
+  """Receives KV pairs, and wraps the values into WindowedValues."""
+  def process(
+      self, element, window=DoFn.WindowParam, timestamp=DoFn.TimestampParam):
+    try:
+      k, v = element
+    except TypeError:
+      raise TypeCheckError(
+          'Input to GroupByKey must be a PCollection with '
+          'elements compatible with KV[A, B]')
+
+    yield (k, windowed_value.WindowedValue(v, timestamp, [window]))
+
+
+class _GroupBundlesByKey(DoFn):
+  def start_bundle(self):
+    self.keys = defaultdict(list)
+
+  def process(self, element):
+    key, windowed_value = element
+    self.keys[key].append(windowed_value)
+
+  def finish_bundle(self):
+    for k, vals in self.keys.items():
+      yield windowed_value.WindowedValue((k, vals),
+                                         MIN_TIMESTAMP, [GlobalWindow()])
+
+
+def read_watermark(watermark_state):
+  try:
+    return watermark_state.read()
+  except ValueError:
+    watermark_state.add(MIN_TIMESTAMP)
+    return watermark_state.read()
+
+
+class TriggerMergeContext(WindowFn.MergeContext):
+  def __init__(
+      self, all_windows, context: 'FnRunnerStatefulTriggerContext', windowing):
+    super(TriggerMergeContext, self).__init__(all_windows)
+    self.trigger_context = context
+    self.windowing = windowing
+    self.merged_away: typing.Dict[BoundedWindow, BoundedWindow] = {}
+
+  def merge(self, to_be_merged, merge_result):
+    _LOGGER.debug("Merging %s into %s", to_be_merged, merge_result)
+    self.trigger_context.merge_state(to_be_merged, merge_result)
+    for window in to_be_merged:
+      if window != merge_result:
+        self.merged_away[window] = merge_result
+        # Clear state associated with PaneInfo since it is
+        # not preserved across merges.
+        self.trigger_context.for_window(window).clear_state(None)
+    self.windowing.triggerfn.on_merge(
+        to_be_merged,
+        merge_result,
+        self.trigger_context.for_window(merge_result))
+
+
+@typehints.with_input_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+@typehints.with_output_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+class GeneralTriggerManagerDoFn(DoFn):
+  """A trigger manager that supports all windowing / triggering cases.
+
+  This implements a DoFn that manages triggering on a per-key basis. All
+  elements for a single key are processed together. Per-key state holds data
+  related to all windows.
+  """
+
+  KNOWN_WINDOWS = SetStateSpec('known_windows', IntervalWindowCoder())
+  FINISHED_WINDOWS = SetStateSpec('finished_windows', IntervalWindowCoder())
+  LAST_KNOWN_TIME = CombiningValueStateSpec('last_known_time', combine_fn=max)
+  LAST_KNOWN_WATERMARK = CombiningValueStateSpec(
+      'last_known_watermark', combine_fn=max)
+
+  # TODO(pabloem) What's the coder for the elements/keys here?
+  WINDOW_ELEMENT_PAIRS = BagStateSpec(
+      'element_bag', TupleCoder([IntervalWindowCoder(), PickleCoder()]))
+  WINDOW_TAG_VALUES = BagStateSpec(
+      'per_window_per_tag_value_state',
+      TupleCoder([IntervalWindowCoder(), StrUtf8Coder(), VarIntCoder()]))
+
+  PROCESSING_TIME_TIMER = TimerSpec(
+      'processing_time_timer', TimeDomain.REAL_TIME)
+  WATERMARK_TIMER = TimerSpec('watermark_timer', TimeDomain.WATERMARK)
+
+  def __init__(self, windowing: Windowing):
+    self.windowing = windowing
+    # Only session windows are merging. Other windows are non-merging.
+    self.merging_windows = isinstance(windowing.windowfn, Sessions)
+
+  def process(
+      self,
+      element: typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]],
+      element_bag: BagRuntimeState = DoFn.StateParam(WINDOW_ELEMENT_PAIRS),  # type: ignore

Review comment:
       done. WDYT?
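
   The docstring quoted above says the DoFn "supports all windowing / triggering cases". As a concrete reference point, here is a small self-contained pipeline of the kind such a trigger manager is meant to execute: a windowed, triggered GroupByKey run on the local runner (which typically delegates to the FnApiRunner for batch pipelines). This is an illustrative sketch using public Beam APIs only, not code from this PR; the data, window size, and trigger choice are arbitrary.

   ```python
   import apache_beam as beam
   from apache_beam.transforms import trigger, window
   from apache_beam.transforms.window import TimestampedValue

   with beam.Pipeline() as p:
     (
         p
         | beam.Create([('user', i) for i in range(20)])
         # Attach fake event timestamps so windowing has something to work with.
         | beam.Map(lambda kv: TimestampedValue(kv, kv[1]))
         | beam.WindowInto(
             window.FixedWindows(5),            # 5-second fixed windows
             trigger=trigger.AfterWatermark(),  # fire when the watermark passes the window end
             accumulation_mode=trigger.AccumulationMode.DISCARDING)
         # The trigger manager in this PR is intended to drive firing for windowed,
         # triggered GroupByKeys like this one when they run on the FnApiRunner.
         | beam.GroupByKey()
         | beam.Map(print))
   ```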




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] pabloem commented on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
pabloem commented on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-788244952


   r: @tvalentyn do you think you would be able to take a look at this PR?


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (177f098) into [master](https://codecov.io/gh/apache/beam/commit/67dc5fe49bd0dbc33ccccf7cde5749dc17adec08?el=desc) (67dc5fe) will **increase** coverage by `0.06%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   83.39%   83.46%   +0.06%     
   ==========================================
     Files         469      470       +1     
     Lines       58721    59001     +280     
   ==========================================
   + Hits        48972    49246     +274     
   - Misses       9749     9755       +6     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [...ites/tox/py38/build/srcs/sdks/python/gen\_protos.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vZ2VuX3Byb3Rvcy5weQ==) | | |
   | [.../sdks/python/apache\_beam/examples/snippets/util.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvc25pcHBldHMvdXRpbC5weQ==) | | |
   | [...n/apache\_beam/typehints/typed\_pipeline\_test\_py3.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdHlwZWhpbnRzL3R5cGVkX3BpcGVsaW5lX3Rlc3RfcHkzLnB5) | | |
   | [.../srcs/sdks/python/apache\_beam/transforms/window.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdHJhbnNmb3Jtcy93aW5kb3cucHk=) | | |
   | [...uild/srcs/sdks/python/apache\_beam/io/gcp/pubsub.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZ2NwL3B1YnN1Yi5weQ==) | | |
   | [...s/sdks/python/apache\_beam/internal/gcp/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW50ZXJuYWwvZ2NwL19faW5pdF9fLnB5) | | |
   | [...sdks/python/apache\_beam/internal/gcp/json\_value.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW50ZXJuYWwvZ2NwL2pzb25fdmFsdWUucHk=) | | |
   | [...runners/interactive/display/pcoll\_visualization.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9pbnRlcmFjdGl2ZS9kaXNwbGF5L3Bjb2xsX3Zpc3VhbGl6YXRpb24ucHk=) | | |
   | [...nners/portability/fn\_api\_runner/worker\_handlers.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9mbl9hcGlfcnVubmVyL3dvcmtlcl9oYW5kbGVycy5weQ==) | | |
   | [...examples/snippets/transforms/aggregation/sample.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvc25pcHBldHMvdHJhbnNmb3Jtcy9hZ2dyZWdhdGlvbi9zYW1wbGUucHk=) | | |
   | ... and [929 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [67dc5fe...177f098](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   





[GitHub] [beam] pabloem commented on a change in pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
pabloem commented on a change in pull request #13985:
URL: https://github.com/apache/beam/pull/13985#discussion_r599143429



##########
File path: sdks/python/apache_beam/runners/portability/fn_api_runner/trigger_manager.py
##########
@@ -0,0 +1,459 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import collections
+import logging
+import typing
+from collections import defaultdict
+
+from apache_beam import typehints
+from apache_beam.coders import PickleCoder
+from apache_beam.coders import StrUtf8Coder
+from apache_beam.coders import TupleCoder
+from apache_beam.coders import VarIntCoder
+from apache_beam.coders.coders import IntervalWindowCoder
+from apache_beam.transforms import DoFn
+from apache_beam.transforms.core import Windowing
+from apache_beam.transforms.timeutil import TimeDomain
+from apache_beam.transforms.trigger import AccumulationMode
+from apache_beam.transforms.trigger import TriggerContext
+from apache_beam.transforms.trigger import _CombiningValueStateTag
+from apache_beam.transforms.trigger import _StateTag
+from apache_beam.transforms.userstate import AccumulatingRuntimeState
+from apache_beam.transforms.userstate import BagRuntimeState
+from apache_beam.transforms.userstate import BagStateSpec
+from apache_beam.transforms.userstate import CombiningValueStateSpec
+from apache_beam.transforms.userstate import RuntimeTimer
+from apache_beam.transforms.userstate import SetRuntimeState
+from apache_beam.transforms.userstate import SetStateSpec
+from apache_beam.transforms.userstate import TimerSpec
+from apache_beam.transforms.userstate import on_timer
+from apache_beam.transforms.window import BoundedWindow
+from apache_beam.transforms.window import GlobalWindow
+from apache_beam.transforms.window import Sessions
+from apache_beam.transforms.window import TimestampedValue
+from apache_beam.transforms.window import WindowFn
+from apache_beam.typehints import TypeCheckError
+from apache_beam.utils import windowed_value
+from apache_beam.utils.timestamp import MIN_TIMESTAMP
+from apache_beam.utils.timestamp import Timestamp
+from apache_beam.utils.windowed_value import WindowedValue
+
+_LOGGER = logging.getLogger(__name__)
+_LOGGER.setLevel(logging.DEBUG)
+
+K = typing.TypeVar('K')
+V = typing.TypeVar('V')
+
+
+class ReifyWindows(DoFn):
+  """Receives KV pairs, and wraps the values into WindowedValues."""
+  def process(
+      self, element, window=DoFn.WindowParam, timestamp=DoFn.TimestampParam):
+    try:
+      k, v = element
+    except TypeError:
+      raise TypeCheckError(
+          'Input to GroupByKey must be a PCollection with '
+          'elements compatible with KV[A, B]')
+
+    yield (k, windowed_value.WindowedValue(v, timestamp, [window]))
+
+
+class _GroupBundlesByKey(DoFn):
+  def start_bundle(self):
+    self.keys = defaultdict(list)
+
+  def process(self, element):
+    key, windowed_value = element
+    self.keys[key].append(windowed_value)
+
+  def finish_bundle(self):
+    for k, vals in self.keys.items():
+      yield windowed_value.WindowedValue((k, vals),
+                                         MIN_TIMESTAMP, [GlobalWindow()])
+
+
+def read_watermark(watermark_state):
+  try:
+    return watermark_state.read()
+  except ValueError:
+    watermark_state.add(MIN_TIMESTAMP)
+    return watermark_state.read()
+
+
+class TriggerMergeContext(WindowFn.MergeContext):
+  def __init__(
+      self, all_windows, context: 'FnRunnerStatefulTriggerContext', windowing):
+    super(TriggerMergeContext, self).__init__(all_windows)
+    self.trigger_context = context
+    self.windowing = windowing
+    self.merged_away: typing.Dict[BoundedWindow, BoundedWindow] = {}
+
+  def merge(self, to_be_merged, merge_result):
+    _LOGGER.debug("Merging %s into %s", to_be_merged, merge_result)
+    self.trigger_context.merge_state(to_be_merged, merge_result)
+    for window in to_be_merged:
+      if window != merge_result:
+        self.merged_away[window] = merge_result
+        # Clear state associated with PaneInfo since it is
+        # not preserved across merges.
+        self.trigger_context.for_window(window).clear_state(None)
+    self.windowing.triggerfn.on_merge(
+        to_be_merged,
+        merge_result,
+        self.trigger_context.for_window(merge_result))
+
+
+@typehints.with_input_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+@typehints.with_output_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+class GeneralTriggerManagerDoFn(DoFn):
+  """A trigger manager that supports all windowing / triggering cases.
+
+  This implements a DoFn that manages triggering on a per-key basis. All
+  elements for a single key are processed together. Per-key state holds data
+  related to all windows.
+  """
+
+  KNOWN_WINDOWS = SetStateSpec('known_windows', IntervalWindowCoder())
+  FINISHED_WINDOWS = SetStateSpec('finished_windows', IntervalWindowCoder())
+  LAST_KNOWN_TIME = CombiningValueStateSpec('last_known_time', combine_fn=max)
+  LAST_KNOWN_WATERMARK = CombiningValueStateSpec(
+      'last_known_watermark', combine_fn=max)
+
+  # TODO(pabloem) What's the coder for the elements/keys here?
+  WINDOW_ELEMENT_PAIRS = BagStateSpec(
+      'element_bag', TupleCoder([IntervalWindowCoder(), PickleCoder()]))
+  WINDOW_TAG_VALUES = BagStateSpec(
+      'per_window_per_tag_value_state',
+      TupleCoder([IntervalWindowCoder(), StrUtf8Coder(), VarIntCoder()]))
+
+  PROCESSING_TIME_TIMER = TimerSpec(
+      'processing_time_timer', TimeDomain.REAL_TIME)
+  WATERMARK_TIMER = TimerSpec('watermark_timer', TimeDomain.WATERMARK)
+
+  def __init__(self, windowing: Windowing):
+    self.windowing = windowing
+    # Only session windows are merging. Other windows are non-merging.
+    self.merging_windows = isinstance(windowing.windowfn, Sessions)
+
+  def process(
+      self,
+      element: typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]],
+      element_bag: BagRuntimeState = DoFn.StateParam(WINDOW_ELEMENT_PAIRS),  # type: ignore
+      time_state: AccumulatingRuntimeState = DoFn.StateParam(LAST_KNOWN_TIME),  # type: ignore
+      watermark_state: AccumulatingRuntimeState = DoFn.StateParam(  # type: ignore
+          LAST_KNOWN_WATERMARK),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      windows_state: SetRuntimeState = DoFn.StateParam(KNOWN_WINDOWS),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER),
+      *args, **kwargs):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        current_time_state=time_state,
+        watermark_state=watermark_state,
+        elements_bag_state=element_bag,
+        window_tag_values_bag_state=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    key, windowed_values = element
+    watermark = read_watermark(watermark_state)
+
+    windows_to_elements = collections.defaultdict(list)
+    for wv in windowed_values:
+      for window in wv.windows:
+        # ignore expired windows
+        if watermark > window.end + self.windowing.allowed_lateness:
+          continue
+        if window in finished_windows_state.read():
+          continue
+        windows_to_elements[window].append(
+            TimestampedValue(wv.value, wv.timestamp))
+
+    # Processing merging of windows
+    if self.merging_windows:
+      old_windows = set(windows_state.read())
+      all_windows = old_windows.union(list(windows_to_elements))
+      if all_windows != old_windows:
+        merge_context = TriggerMergeContext(
+            all_windows, context, self.windowing)
+        self.windowing.windowfn.merge(merge_context)
+
+        merged_windows_to_elements = collections.defaultdict(list)
+        for window, values in windows_to_elements.items():
+          while window in merge_context.merged_away:
+            window = merge_context.merged_away[window]
+          merged_windows_to_elements[window].extend(values)
+        windows_to_elements = merged_windows_to_elements
+
+      if old_windows != all_windows:
+        for w in windows_to_elements:
+          windows_state.add(w)
+    # Done processing merging of windows
+
+    seen_windows = set()
+    for w in windows_to_elements:
+      window_context = context.for_window(w)
+      seen_windows.add(w)
+      for value_w_timestamp in windows_to_elements[w]:
+        _LOGGER.debug(value_w_timestamp)
+        element_bag.add((w, value_w_timestamp))
+        self.windowing.triggerfn.on_element(windowed_values, w, window_context)
+
+    return self._trigger_fire(
+        key, TimeDomain.WATERMARK, watermark, None, context, seen_windows)
+
+  def _trigger_fire(

Review comment:
       done.
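A hypothetical sketch of how the DoFns quoted in this file could be composed into a trigger-managed grouping step. The composite transform the PR actually exposes may be wired differently; the import path below simply mirrors the file path above, and `_SketchGroupByKeyAndTrigger` is an illustrative name, not something defined in the PR.

```python
import apache_beam as beam
from apache_beam.runners.portability.fn_api_runner.trigger_manager import (
    GeneralTriggerManagerDoFn, ReifyWindows, _GroupBundlesByKey)


class _SketchGroupByKeyAndTrigger(beam.PTransform):
  """Illustrative only: groups KV elements and applies the trigger manager."""
  def __init__(self, windowing):
    self._windowing = windowing

  def expand(self, pcoll):
    return (
        pcoll  # elements compatible with KV[A, B]
        # Attach each value's window and timestamp so they survive the
        # global-window regrouping below.
        | 'ReifyWindows' >> beam.ParDo(ReifyWindows())
        # Batch reified values per key within each bundle.
        | 'GroupBundlesByKey' >> beam.ParDo(_GroupBundlesByKey())
        # Stateful DoFn: the runner partitions elements by key, so the state
        # and timers declared on the DoFn are scoped to a single key.
        | 'ManageTriggers' >> beam.ParDo(
            GeneralTriggerManagerDoFn(self._windowing)))
```

Each fired pane then comes out as `(key, [WindowedValue, ...])`, matching the output type hints on `GeneralTriggerManagerDoFn`.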







[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (a57fb3d) into [master](https://codecov.io/gh/apache/beam/commit/4c81134dd9206fd6cb9060a9b973fc111b8fbd64?el=desc) (4c81134) will **increase** coverage by `0.25%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   82.77%   83.03%   +0.25%     
   ==========================================
     Files         466      470       +4     
     Lines       57562    58535     +973     
   ==========================================
   + Hits        47647    48604     +957     
   - Misses       9915     9931      +16     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [...n/apache\_beam/examples/cookbook/bigquery\_schema.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvY29va2Jvb2svYmlncXVlcnlfc2NoZW1hLnB5) | | |
   | [...apache\_beam/examples/complete/juliaset/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvY29tcGxldGUvanVsaWFzZXQvX19pbml0X18ucHk=) | | |
   | [...che\_beam/examples/streaming\_wordcount\_debugging.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvc3RyZWFtaW5nX3dvcmRjb3VudF9kZWJ1Z2dpbmcucHk=) | | |
   | [...testing/benchmarks/nexmark/queries/winning\_bids.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9iZW5jaG1hcmtzL25leG1hcmsvcXVlcmllcy93aW5uaW5nX2JpZHMucHk=) | | |
   | [...s/python/apache\_beam/runners/portability/stager.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9zdGFnZXIucHk=) | | |
   | [...\_beam/testing/benchmarks/nexmark/queries/query0.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9iZW5jaG1hcmtzL25leG1hcmsvcXVlcmllcy9xdWVyeTAucHk=) | | |
   | [sdks/python/apache\_beam/io/hadoopfilesystem.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vaGFkb29wZmlsZXN5c3RlbS5weQ==) | | |
   | [...ks/python/apache\_beam/runners/worker/sideinputs.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy93b3JrZXIvc2lkZWlucHV0cy5weQ==) | | |
   | [...on/apache\_beam/runners/direct/watermark\_manager.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9kaXJlY3Qvd2F0ZXJtYXJrX21hbmFnZXIucHk=) | | |
   | [...python/apache\_beam/typehints/typehints\_test\_py3.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdHlwZWhpbnRzL3R5cGVoaW50c190ZXN0X3B5My5weQ==) | | |
   | ... and [923 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [7c4a21a...cc6607c](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   





[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (964d13a) into [master](https://codecov.io/gh/apache/beam/commit/4c81134dd9206fd6cb9060a9b973fc111b8fbd64?el=desc) (4c81134) will **increase** coverage by `0.26%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   82.77%   83.03%   +0.26%     
   ==========================================
     Files         466      470       +4     
     Lines       57562    58535     +973     
   ==========================================
   + Hits        47647    48607     +960     
   - Misses       9915     9928      +13     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [...\_beam/testing/benchmarks/nexmark/queries/query4.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9iZW5jaG1hcmtzL25leG1hcmsvcXVlcmllcy9xdWVyeTQucHk=) | | |
   | [sdks/python/apache\_beam/portability/utils.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcG9ydGFiaWxpdHkvdXRpbHMucHk=) | | |
   | [...e\_beam/examples/complete/game/hourly\_team\_score.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvY29tcGxldGUvZ2FtZS9ob3VybHlfdGVhbV9zY29yZS5weQ==) | | |
   | [...apache\_beam/runners/dataflow/native\_io/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9kYXRhZmxvdy9uYXRpdmVfaW8vX19pbml0X18ucHk=) | | |
   | [...runners/interactive/display/pcoll\_visualization.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9pbnRlcmFjdGl2ZS9kaXNwbGF5L3Bjb2xsX3Zpc3VhbGl6YXRpb24ucHk=) | | |
   | [...e\_beam/io/gcp/big\_query\_query\_to\_table\_pipeline.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZ2NwL2JpZ19xdWVyeV9xdWVyeV90b190YWJsZV9waXBlbGluZS5weQ==) | | |
   | [...dks/python/apache\_beam/transforms/create\_source.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdHJhbnNmb3Jtcy9jcmVhdGVfc291cmNlLnB5) | | |
   | [sdks/python/apache\_beam/runners/job/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9qb2IvX19pbml0X18ucHk=) | | |
   | [...hon/apache\_beam/io/gcp/datastore/v1new/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZ2NwL2RhdGFzdG9yZS92MW5ldy9fX2luaXRfXy5weQ==) | | |
   | [...beam/portability/api/beam\_artifact\_api\_pb2\_urns.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcG9ydGFiaWxpdHkvYXBpL2JlYW1fYXJ0aWZhY3RfYXBpX3BiMl91cm5zLnB5) | | |
   | ... and [923 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [7c4a21a...964d13a](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   





[GitHub] [beam] pabloem commented on a change in pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
pabloem commented on a change in pull request #13985:
URL: https://github.com/apache/beam/pull/13985#discussion_r604365751



##########
File path: sdks/python/apache_beam/runners/portability/fn_api_runner/trigger_manager.py
##########
@@ -0,0 +1,457 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import collections
+import logging
+import typing
+from collections import defaultdict
+
+from apache_beam import typehints
+from apache_beam.coders import PickleCoder
+from apache_beam.coders import StrUtf8Coder
+from apache_beam.coders import TupleCoder
+from apache_beam.coders import VarIntCoder
+from apache_beam.coders.coders import IntervalWindowCoder
+from apache_beam.transforms import DoFn
+from apache_beam.transforms.core import Windowing
+from apache_beam.transforms.timeutil import TimeDomain
+from apache_beam.transforms.trigger import AccumulationMode
+from apache_beam.transforms.trigger import TriggerContext
+from apache_beam.transforms.trigger import _CombiningValueStateTag
+from apache_beam.transforms.trigger import _StateTag
+from apache_beam.transforms.userstate import AccumulatingRuntimeState
+from apache_beam.transforms.userstate import BagRuntimeState
+from apache_beam.transforms.userstate import BagStateSpec
+from apache_beam.transforms.userstate import CombiningValueStateSpec
+from apache_beam.transforms.userstate import RuntimeTimer
+from apache_beam.transforms.userstate import SetRuntimeState
+from apache_beam.transforms.userstate import SetStateSpec
+from apache_beam.transforms.userstate import TimerSpec
+from apache_beam.transforms.userstate import on_timer
+from apache_beam.transforms.window import BoundedWindow
+from apache_beam.transforms.window import GlobalWindow
+from apache_beam.transforms.window import TimestampedValue
+from apache_beam.transforms.window import WindowFn
+from apache_beam.typehints import TypeCheckError
+from apache_beam.utils import windowed_value
+from apache_beam.utils.timestamp import MIN_TIMESTAMP
+from apache_beam.utils.timestamp import Timestamp
+from apache_beam.utils.windowed_value import WindowedValue
+
+_LOGGER = logging.getLogger(__name__)
+_LOGGER.setLevel(logging.DEBUG)
+
+K = typing.TypeVar('K')
+
+
+class _ReifyWindows(DoFn):
+  """Receives KV pairs, and wraps the values into WindowedValues."""
+  def process(
+      self, element, window=DoFn.WindowParam, timestamp=DoFn.TimestampParam):
+    try:
+      k, v = element
+    except TypeError:
+      raise TypeCheckError(
+          'Input to GroupByKey must be a PCollection with '
+          'elements compatible with KV[A, B]')
+
+    yield (k, windowed_value.WindowedValue(v, timestamp, [window]))
+
+
+class _GroupBundlesByKey(DoFn):
+  def start_bundle(self):
+    self.keys = defaultdict(list)
+
+  def process(self, element):
+    key, windowed_value = element
+    self.keys[key].append(windowed_value)
+
+  def finish_bundle(self):
+    for k, vals in self.keys.items():
+      yield windowed_value.WindowedValue((k, vals),
+                                         MIN_TIMESTAMP, [GlobalWindow()])
+
+
+def read_watermark(watermark_state):
+  try:
+    return watermark_state.read()
+  except ValueError:
+    watermark_state.add(MIN_TIMESTAMP)
+    return watermark_state.read()
+
+
+class TriggerMergeContext(WindowFn.MergeContext):
+  def __init__(
+      self, all_windows, context: 'FnRunnerStatefulTriggerContext', windowing):
+    super(TriggerMergeContext, self).__init__(all_windows)
+    self.trigger_context = context
+    self.windowing = windowing
+    self.merged_away: typing.Dict[BoundedWindow, BoundedWindow] = {}
+
+  def merge(self, to_be_merged, merge_result):
+    _LOGGER.debug("Merging %s into %s", to_be_merged, merge_result)
+    self.trigger_context.merge_windows(to_be_merged, merge_result)
+    for window in to_be_merged:
+      if window != merge_result:
+        self.merged_away[window] = merge_result
+        # Clear state associated with PaneInfo since it is
+        # not preserved across merges.
+        self.trigger_context.for_window(window).clear_state(None)
+    self.windowing.triggerfn.on_merge(
+        to_be_merged,
+        merge_result,
+        self.trigger_context.for_window(merge_result))
+
+
+@typehints.with_input_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+@typehints.with_output_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+class GeneralTriggerManagerDoFn(DoFn):
+  """A trigger manager that supports all windowing / triggering cases.
+
+  This implements a DoFn that manages triggering on a per-key basis. All
+  elements for a single key are processed together. Per-key state holds data
+  related to all windows.
+  """
+
+  # TODO(BEAM-12026) Add support for Global and custom window fns.
+  KNOWN_WINDOWS = SetStateSpec('known_windows', IntervalWindowCoder())
+  FINISHED_WINDOWS = SetStateSpec('finished_windows', IntervalWindowCoder())
+  LAST_KNOWN_TIME = CombiningValueStateSpec('last_known_time', combine_fn=max)
+  LAST_KNOWN_WATERMARK = CombiningValueStateSpec(
+      'last_known_watermark', combine_fn=max)
+
+  # TODO(pabloem) What's the coder for the elements/keys here?
+  WINDOW_ELEMENT_PAIRS = BagStateSpec(
+      'all_elements', TupleCoder([IntervalWindowCoder(), PickleCoder()]))
+  WINDOW_TAG_VALUES = BagStateSpec(
+      'per_window_per_tag_value_state',
+      TupleCoder([IntervalWindowCoder(), StrUtf8Coder(), VarIntCoder()]))
+
+  PROCESSING_TIME_TIMER = TimerSpec(
+      'processing_time_timer', TimeDomain.REAL_TIME)
+  WATERMARK_TIMER = TimerSpec('watermark_timer', TimeDomain.WATERMARK)
+
+  def __init__(self, windowing: Windowing):
+    self.windowing = windowing
+    # Only session windows are merging. Other windows are non-merging.
+    self.merging_windows = self.windowing.windowfn.is_merging()
+
+  def process(
+      self,
+      element: typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]],
+      all_elements: BagRuntimeState = DoFn.StateParam(WINDOW_ELEMENT_PAIRS),  # type: ignore
+      latest_processing_time: AccumulatingRuntimeState = DoFn.StateParam(LAST_KNOWN_TIME),  # type: ignore
+      latest_watermark: AccumulatingRuntimeState = DoFn.StateParam(  # type: ignore
+          LAST_KNOWN_WATERMARK),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      windows_state: SetRuntimeState = DoFn.StateParam(KNOWN_WINDOWS),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER),
+      *args, **kwargs):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        latest_processing_time=latest_processing_time,
+        latest_watermark=latest_watermark,
+        all_elements_state=all_elements,
+        window_tag_values=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    key, windowed_values = element
+    watermark = read_watermark(latest_watermark)
+
+    windows_to_elements = collections.defaultdict(list)
+    for wv in windowed_values:
+      for window in wv.windows:
+        # ignore expired windows
+        if watermark > window.end + self.windowing.allowed_lateness:
+          continue
+        if window in finished_windows_state.read():
+          continue
+        windows_to_elements[window].append(
+            TimestampedValue(wv.value, wv.timestamp))
+
+    # Processing merging of windows
+    if self.merging_windows:
+      old_windows = set(windows_state.read())
+      all_windows = old_windows.union(list(windows_to_elements))
+      if all_windows != old_windows:
+        merge_context = TriggerMergeContext(
+            all_windows, context, self.windowing)
+        self.windowing.windowfn.merge(merge_context)
+
+        merged_windows_to_elements = collections.defaultdict(list)
+        for window, values in windows_to_elements.items():
+          while window in merge_context.merged_away:
+            window = merge_context.merged_away[window]
+          merged_windows_to_elements[window].extend(values)
+        windows_to_elements = merged_windows_to_elements
+
+      for w in windows_to_elements:
+        windows_state.add(w)
+    # Done processing merging of windows
+
+    seen_windows = set()
+    for w in windows_to_elements:
+      window_context = context.for_window(w)
+      seen_windows.add(w)
+      for value_w_timestamp in windows_to_elements[w]:
+        _LOGGER.debug(value_w_timestamp)
+        all_elements.add((w, value_w_timestamp))
+        self.windowing.triggerfn.on_element(windowed_values, w, window_context)
+
+    return self._fire_eligible_windows(
+        key, TimeDomain.WATERMARK, watermark, None, context, seen_windows)
+
+  def _fire_eligible_windows(
+      self,
+      key: K,
+      time_domain,
+      timestamp: Timestamp,
+      timer_tag: typing.Optional[str],
+      context: 'FnRunnerStatefulTriggerContext',
+      windows_of_interest: typing.Optional[typing.Set[BoundedWindow]] = None):
+    windows_to_elements = context.windows_to_elements_map()
+    context.all_elements_state.clear()
+
+    fired_windows = set()
+    _LOGGER.debug(
+        '%s - tag %s - timestamp %s', time_domain, timer_tag, timestamp)
+    for w, elems in windows_to_elements.items():
+      if windows_of_interest is not None and w not in windows_of_interest:
+        # windows_of_interest=None means that we care about all windows.
+        # If we care only about some windows, and this window is not one of
+        # them, then we do not intend to fire this window.
+        continue
+      window_context = context.for_window(w)
+      if self.windowing.triggerfn.should_fire(time_domain,
+                                              timestamp,
+                                              w,
+                                              window_context):
+        finished = self.windowing.triggerfn.on_fire(
+            timestamp, w, window_context)
+        _LOGGER.debug('Firing on window %s. Finished: %s', w, finished)
+        fired_windows.add(w)
+        if finished:
+          context.finished_windows_state.add(w)
+        # TODO(pabloem): Format the output: e.g. pane info
+        elems = [WindowedValue(e.value, e.timestamp, (w, )) for e in elems]
+        yield (key, elems)
+
+    finished_windows: typing.Set[BoundedWindow] = set(
+        context.finished_windows_state.read())
+    # Add elements that were not fired back into state.
+    for w, elems in windows_to_elements.items():
+      for e in elems:
+        if (w in finished_windows or
+            (w in fired_windows and
+             self.windowing.accumulation_mode == AccumulationMode.DISCARDING)):
+          continue
+        context.all_elements_state.add((w, e))
+
+  @on_timer(PROCESSING_TIME_TIMER)
+  def processing_time_trigger(
+      self,
+      key=DoFn.KeyParam,
+      timer_tag=DoFn.DynamicTimerTagParam,
+      timestamp=DoFn.TimestampParam,
+      latest_processing_time=DoFn.StateParam(LAST_KNOWN_TIME),
+      all_elements=DoFn.StateParam(WINDOW_ELEMENT_PAIRS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER)):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        latest_processing_time=latest_processing_time,
+        latest_watermark=None,
+        all_elements_state=all_elements,
+        window_tag_values=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    result = self._fire_eligible_windows(
+        key, TimeDomain.REAL_TIME, timestamp, timer_tag, context)
+    latest_processing_time.add(timestamp)
+    return result
+
+  @on_timer(WATERMARK_TIMER)
+  def watermark_trigger(
+      self,
+      key=DoFn.KeyParam,
+      timer_tag=DoFn.DynamicTimerTagParam,
+      timestamp=DoFn.TimestampParam,
+      latest_watermark=DoFn.StateParam(LAST_KNOWN_WATERMARK),
+      all_elements=DoFn.StateParam(WINDOW_ELEMENT_PAIRS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER)):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        latest_processing_time=None,
+        latest_watermark=latest_watermark,
+        all_elements_state=all_elements,
+        window_tag_values=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    result = self._fire_eligible_windows(
+        key, TimeDomain.WATERMARK, timestamp, timer_tag, context)
+    latest_watermark.add(timestamp)
+    return result
+
+
+class FnRunnerStatefulTriggerContext(TriggerContext):
+  def __init__(
+      self,
+      processing_time_timer: RuntimeTimer,
+      watermark_timer: RuntimeTimer,
+      latest_processing_time: typing.Optional[AccumulatingRuntimeState],
+      latest_watermark: typing.Optional[AccumulatingRuntimeState],
+      all_elements_state: BagRuntimeState,
+      window_tag_values: BagRuntimeState,
+      finished_windows_state: SetRuntimeState):
+    self.timers = {
+        TimeDomain.REAL_TIME: processing_time_timer,
+        TimeDomain.WATERMARK: watermark_timer
+    }
+    self.current_times = {
+        TimeDomain.REAL_TIME: latest_processing_time,
+        TimeDomain.WATERMARK: latest_watermark
+    }
+    self.all_elements_state = all_elements_state
+    self.window_tag_values = window_tag_values
+    self.finished_windows_state = finished_windows_state
+
+  def windows_to_elements_map(
+      self
+  ) -> typing.Dict[BoundedWindow, typing.List[windowed_value.WindowedValue]]:
+    window_element_pairs: typing.Iterable[typing.Tuple[
+        BoundedWindow,
+        windowed_value.WindowedValue]] = self.all_elements_state.read()
+    result: typing.Dict[BoundedWindow,
+                        typing.List[windowed_value.WindowedValue]] = {}
+    for w, e in window_element_pairs:
+      if w not in result:
+        result[w] = []
+      result[w].append(e)
+    return result
+
+  def for_window(self, window):
+    return PerWindowTriggerContext(window, self)
+
+  def get_current_time(self):
+    return self.current_times[TimeDomain.REAL_TIME].read()
+
+  def set_timer(self, name, time_domain, timestamp):
+    _LOGGER.debug('Setting timer (%s, %s) at %s', time_domain, name, timestamp)
+    self.timers[time_domain].set(timestamp, dynamic_timer_tag=name)
+
+  def clear_timer(self, name, time_domain):
+    _LOGGER.debug('Clearing timer (%s, %s)', time_domain, name)
+    self.timers[time_domain].clear(dynamic_timer_tag=name)
+
+  def merge_windows(self, to_be_merged, merge_result):
+    all_triplets = list(self.window_tag_values.read())
+    # Collect all the triplets for the window we are merging away, and tag them
+    # with the new window (merge_result).
+    merging_away_triplets = [(merge_result, t[1], t[2]) for t in all_triplets

Review comment:
       Done. Thanks @y1chi !
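The `merge_windows` bookkeeping quoted above can be pictured on plain Python lists. This is a sketch of the idea only: the exact filtering condition is cut off in the diff, so the condition below is an assumption. Triplets belonging to windows that are merged away get re-tagged onto `merge_result`; everything else is kept unchanged.

```python
# Sketch only: (window, tag, value) triplets re-tagged onto merge_result.
def retag_triplets(all_triplets, to_be_merged, merge_result):
  retagged = [(merge_result, tag, value)
              for (window, tag, value) in all_triplets
              if window in to_be_merged and window != merge_result]
  kept = [(window, tag, value)
          for (window, tag, value) in all_triplets
          if window not in to_be_merged or window == merge_result]
  return kept + retagged


# Toy example with strings standing in for windows:
triplets = [('w1', 'count', 2), ('w2', 'count', 3), ('w3', 'count', 5)]
print(retag_triplets(triplets, to_be_merged={'w1', 'w2'}, merge_result='w3'))
# -> [('w3', 'count', 5), ('w3', 'count', 2), ('w3', 'count', 3)]
```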







[GitHub] [beam] pabloem commented on a change in pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
pabloem commented on a change in pull request #13985:
URL: https://github.com/apache/beam/pull/13985#discussion_r599149072



##########
File path: sdks/python/apache_beam/runners/portability/fn_api_runner/trigger_manager.py
##########
@@ -0,0 +1,459 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import collections
+import logging
+import typing
+from collections import defaultdict
+
+from apache_beam import typehints
+from apache_beam.coders import PickleCoder
+from apache_beam.coders import StrUtf8Coder
+from apache_beam.coders import TupleCoder
+from apache_beam.coders import VarIntCoder
+from apache_beam.coders.coders import IntervalWindowCoder
+from apache_beam.transforms import DoFn
+from apache_beam.transforms.core import Windowing
+from apache_beam.transforms.timeutil import TimeDomain
+from apache_beam.transforms.trigger import AccumulationMode
+from apache_beam.transforms.trigger import TriggerContext
+from apache_beam.transforms.trigger import _CombiningValueStateTag
+from apache_beam.transforms.trigger import _StateTag
+from apache_beam.transforms.userstate import AccumulatingRuntimeState
+from apache_beam.transforms.userstate import BagRuntimeState
+from apache_beam.transforms.userstate import BagStateSpec
+from apache_beam.transforms.userstate import CombiningValueStateSpec
+from apache_beam.transforms.userstate import RuntimeTimer
+from apache_beam.transforms.userstate import SetRuntimeState
+from apache_beam.transforms.userstate import SetStateSpec
+from apache_beam.transforms.userstate import TimerSpec
+from apache_beam.transforms.userstate import on_timer
+from apache_beam.transforms.window import BoundedWindow
+from apache_beam.transforms.window import GlobalWindow
+from apache_beam.transforms.window import Sessions
+from apache_beam.transforms.window import TimestampedValue
+from apache_beam.transforms.window import WindowFn
+from apache_beam.typehints import TypeCheckError
+from apache_beam.utils import windowed_value
+from apache_beam.utils.timestamp import MIN_TIMESTAMP
+from apache_beam.utils.timestamp import Timestamp
+from apache_beam.utils.windowed_value import WindowedValue
+
+_LOGGER = logging.getLogger(__name__)
+_LOGGER.setLevel(logging.DEBUG)
+
+K = typing.TypeVar('K')
+V = typing.TypeVar('V')

Review comment:
       removed V. K is used.

##########
File path: sdks/python/apache_beam/runners/portability/fn_api_runner/trigger_manager.py
##########
@@ -0,0 +1,459 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import collections
+import logging
+import typing
+from collections import defaultdict
+
+from apache_beam import typehints
+from apache_beam.coders import PickleCoder
+from apache_beam.coders import StrUtf8Coder
+from apache_beam.coders import TupleCoder
+from apache_beam.coders import VarIntCoder
+from apache_beam.coders.coders import IntervalWindowCoder
+from apache_beam.transforms import DoFn
+from apache_beam.transforms.core import Windowing
+from apache_beam.transforms.timeutil import TimeDomain
+from apache_beam.transforms.trigger import AccumulationMode
+from apache_beam.transforms.trigger import TriggerContext
+from apache_beam.transforms.trigger import _CombiningValueStateTag
+from apache_beam.transforms.trigger import _StateTag
+from apache_beam.transforms.userstate import AccumulatingRuntimeState
+from apache_beam.transforms.userstate import BagRuntimeState
+from apache_beam.transforms.userstate import BagStateSpec
+from apache_beam.transforms.userstate import CombiningValueStateSpec
+from apache_beam.transforms.userstate import RuntimeTimer
+from apache_beam.transforms.userstate import SetRuntimeState
+from apache_beam.transforms.userstate import SetStateSpec
+from apache_beam.transforms.userstate import TimerSpec
+from apache_beam.transforms.userstate import on_timer
+from apache_beam.transforms.window import BoundedWindow
+from apache_beam.transforms.window import GlobalWindow
+from apache_beam.transforms.window import Sessions
+from apache_beam.transforms.window import TimestampedValue
+from apache_beam.transforms.window import WindowFn
+from apache_beam.typehints import TypeCheckError
+from apache_beam.utils import windowed_value
+from apache_beam.utils.timestamp import MIN_TIMESTAMP
+from apache_beam.utils.timestamp import Timestamp
+from apache_beam.utils.windowed_value import WindowedValue
+
+_LOGGER = logging.getLogger(__name__)
+_LOGGER.setLevel(logging.DEBUG)
+
+K = typing.TypeVar('K')
+V = typing.TypeVar('V')
+
+
+class ReifyWindows(DoFn):
+  """Receives KV pairs, and wraps the values into WindowedValues."""
+  def process(
+      self, element, window=DoFn.WindowParam, timestamp=DoFn.TimestampParam):
+    try:
+      k, v = element
+    except TypeError:
+      raise TypeCheckError(
+          'Input to GroupByKey must be a PCollection with '
+          'elements compatible with KV[A, B]')
+
+    yield (k, windowed_value.WindowedValue(v, timestamp, [window]))
+
+
+class _GroupBundlesByKey(DoFn):
+  def start_bundle(self):
+    self.keys = defaultdict(list)
+
+  def process(self, element):
+    key, windowed_value = element
+    self.keys[key].append(windowed_value)
+
+  def finish_bundle(self):
+    for k, vals in self.keys.items():
+      yield windowed_value.WindowedValue((k, vals),
+                                         MIN_TIMESTAMP, [GlobalWindow()])
+
+
+def read_watermark(watermark_state):
+  try:
+    return watermark_state.read()
+  except ValueError:
+    watermark_state.add(MIN_TIMESTAMP)
+    return watermark_state.read()
+
+
+class TriggerMergeContext(WindowFn.MergeContext):
+  def __init__(
+      self, all_windows, context: 'FnRunnerStatefulTriggerContext', windowing):
+    super(TriggerMergeContext, self).__init__(all_windows)
+    self.trigger_context = context
+    self.windowing = windowing
+    self.merged_away: typing.Dict[BoundedWindow, BoundedWindow] = {}
+
+  def merge(self, to_be_merged, merge_result):
+    _LOGGER.debug("Merging %s into %s", to_be_merged, merge_result)
+    self.trigger_context.merge_state(to_be_merged, merge_result)
+    for window in to_be_merged:
+      if window != merge_result:
+        self.merged_away[window] = merge_result
+        # Clear state associated with PaneInfo since it is
+        # not preserved across merges.
+        self.trigger_context.for_window(window).clear_state(None)
+    self.windowing.triggerfn.on_merge(
+        to_be_merged,
+        merge_result,
+        self.trigger_context.for_window(merge_result))
+
+
+@typehints.with_input_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+@typehints.with_output_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+class GeneralTriggerManagerDoFn(DoFn):
+  """A trigger manager that supports all windowing / triggering cases.
+
+  This implements a DoFn that manages triggering on a per-key basis. All
+  elements for a single key are processed together. Per-key state holds data
+  related to all windows.
+  """
+
+  KNOWN_WINDOWS = SetStateSpec('known_windows', IntervalWindowCoder())
+  FINISHED_WINDOWS = SetStateSpec('finished_windows', IntervalWindowCoder())
+  LAST_KNOWN_TIME = CombiningValueStateSpec('last_known_time', combine_fn=max)
+  LAST_KNOWN_WATERMARK = CombiningValueStateSpec(
+      'last_known_watermark', combine_fn=max)
+
+  # TODO(pabloem) What's the coder for the elements/keys here?
+  WINDOW_ELEMENT_PAIRS = BagStateSpec(
+      'element_bag', TupleCoder([IntervalWindowCoder(), PickleCoder()]))
+  WINDOW_TAG_VALUES = BagStateSpec(
+      'per_window_per_tag_value_state',
+      TupleCoder([IntervalWindowCoder(), StrUtf8Coder(), VarIntCoder()]))
+
+  PROCESSING_TIME_TIMER = TimerSpec(
+      'processing_time_timer', TimeDomain.REAL_TIME)
+  WATERMARK_TIMER = TimerSpec('watermark_timer', TimeDomain.WATERMARK)
+
+  def __init__(self, windowing: Windowing):
+    self.windowing = windowing
+    # Only session windows are merging. Other windows are non-merging.
+    self.merging_windows = isinstance(windowing.windowfn, Sessions)
+
+  def process(
+      self,
+      element: typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]],
+      element_bag: BagRuntimeState = DoFn.StateParam(WINDOW_ELEMENT_PAIRS),  # type: ignore
+      time_state: AccumulatingRuntimeState = DoFn.StateParam(LAST_KNOWN_TIME),  # type: ignore
+      watermark_state: AccumulatingRuntimeState = DoFn.StateParam(  # type: ignore
+          LAST_KNOWN_WATERMARK),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      windows_state: SetRuntimeState = DoFn.StateParam(KNOWN_WINDOWS),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER),
+      *args, **kwargs):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        current_time_state=time_state,
+        watermark_state=watermark_state,
+        elements_bag_state=element_bag,
+        window_tag_values_bag_state=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    key, windowed_values = element
+    watermark = read_watermark(watermark_state)
+
+    windows_to_elements = collections.defaultdict(list)
+    for wv in windowed_values:
+      for window in wv.windows:
+        # ignore expired windows
+        if watermark > window.end + self.windowing.allowed_lateness:
+          continue
+        if window in finished_windows_state.read():
+          continue
+        windows_to_elements[window].append(
+            TimestampedValue(wv.value, wv.timestamp))
+
+    # Processing merging of windows
+    if self.merging_windows:
+      old_windows = set(windows_state.read())
+      all_windows = old_windows.union(list(windows_to_elements))
+      if all_windows != old_windows:
+        merge_context = TriggerMergeContext(
+            all_windows, context, self.windowing)
+        self.windowing.windowfn.merge(merge_context)
+
+        merged_windows_to_elements = collections.defaultdict(list)
+        for window, values in windows_to_elements.items():
+          while window in merge_context.merged_away:
+            window = merge_context.merged_away[window]
+          merged_windows_to_elements[window].extend(values)
+        windows_to_elements = merged_windows_to_elements
+
+      if old_windows != all_windows:
+        for w in windows_to_elements:
+          windows_state.add(w)
+    # Done processing merging of windows
+
+    seen_windows = set()
+    for w in windows_to_elements:
+      window_context = context.for_window(w)
+      seen_windows.add(w)
+      for value_w_timestamp in windows_to_elements[w]:
+        _LOGGER.debug(value_w_timestamp)
+        element_bag.add((w, value_w_timestamp))
+        self.windowing.triggerfn.on_element(windowed_values, w, window_context)
+
+    return self._trigger_fire(
+        key, TimeDomain.WATERMARK, watermark, None, context, seen_windows)
+
+  def _trigger_fire(

Review comment:
       done.
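The expired-window check near the top of `process` (`watermark > window.end + self.windowing.allowed_lateness`) can be exercised standalone. The numbers below are made up; only the public `IntervalWindow`, `Timestamp`, and `Duration` APIs are assumed.

```python
from apache_beam.transforms.window import IntervalWindow
from apache_beam.utils.timestamp import Duration, Timestamp

window = IntervalWindow(Timestamp(0), Timestamp(60))
allowed_lateness = Duration(seconds=30)

for watermark in (Timestamp(45), Timestamp(89), Timestamp(91)):
  expired = watermark > window.end + allowed_lateness
  # 45 and 89 are within end + lateness (90); 91 is past it, so the trigger
  # manager would skip elements for this window.
  print(watermark, 'expired' if expired else 'accepting')
```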

##########
File path: sdks/python/apache_beam/runners/portability/fn_api_runner/trigger_manager.py
##########
@@ -0,0 +1,459 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import collections
+import logging
+import typing
+from collections import defaultdict
+
+from apache_beam import typehints
+from apache_beam.coders import PickleCoder
+from apache_beam.coders import StrUtf8Coder
+from apache_beam.coders import TupleCoder
+from apache_beam.coders import VarIntCoder
+from apache_beam.coders.coders import IntervalWindowCoder
+from apache_beam.transforms import DoFn
+from apache_beam.transforms.core import Windowing
+from apache_beam.transforms.timeutil import TimeDomain
+from apache_beam.transforms.trigger import AccumulationMode
+from apache_beam.transforms.trigger import TriggerContext
+from apache_beam.transforms.trigger import _CombiningValueStateTag
+from apache_beam.transforms.trigger import _StateTag
+from apache_beam.transforms.userstate import AccumulatingRuntimeState
+from apache_beam.transforms.userstate import BagRuntimeState
+from apache_beam.transforms.userstate import BagStateSpec
+from apache_beam.transforms.userstate import CombiningValueStateSpec
+from apache_beam.transforms.userstate import RuntimeTimer
+from apache_beam.transforms.userstate import SetRuntimeState
+from apache_beam.transforms.userstate import SetStateSpec
+from apache_beam.transforms.userstate import TimerSpec
+from apache_beam.transforms.userstate import on_timer
+from apache_beam.transforms.window import BoundedWindow
+from apache_beam.transforms.window import GlobalWindow
+from apache_beam.transforms.window import Sessions
+from apache_beam.transforms.window import TimestampedValue
+from apache_beam.transforms.window import WindowFn
+from apache_beam.typehints import TypeCheckError
+from apache_beam.utils import windowed_value
+from apache_beam.utils.timestamp import MIN_TIMESTAMP
+from apache_beam.utils.timestamp import Timestamp
+from apache_beam.utils.windowed_value import WindowedValue
+
+_LOGGER = logging.getLogger(__name__)
+_LOGGER.setLevel(logging.DEBUG)
+
+K = typing.TypeVar('K')
+V = typing.TypeVar('V')
+
+
+class ReifyWindows(DoFn):
+  """Receives KV pairs, and wraps the values into WindowedValues."""
+  def process(
+      self, element, window=DoFn.WindowParam, timestamp=DoFn.TimestampParam):
+    try:
+      k, v = element
+    except TypeError:
+      raise TypeCheckError(
+          'Input to GroupByKey must be a PCollection with '
+          'elements compatible with KV[A, B]')
+
+    yield (k, windowed_value.WindowedValue(v, timestamp, [window]))
+
+
+class _GroupBundlesByKey(DoFn):
+  def start_bundle(self):
+    self.keys = defaultdict(list)
+
+  def process(self, element):
+    key, windowed_value = element
+    self.keys[key].append(windowed_value)
+
+  def finish_bundle(self):
+    for k, vals in self.keys.items():
+      yield windowed_value.WindowedValue((k, vals),
+                                         MIN_TIMESTAMP, [GlobalWindow()])
+
+
+def read_watermark(watermark_state):
+  try:
+    return watermark_state.read()
+  except ValueError:
+    watermark_state.add(MIN_TIMESTAMP)
+    return watermark_state.read()
+
+
+class TriggerMergeContext(WindowFn.MergeContext):
+  def __init__(
+      self, all_windows, context: 'FnRunnerStatefulTriggerContext', windowing):
+    super(TriggerMergeContext, self).__init__(all_windows)
+    self.trigger_context = context
+    self.windowing = windowing
+    self.merged_away: typing.Dict[BoundedWindow, BoundedWindow] = {}
+
+  def merge(self, to_be_merged, merge_result):
+    _LOGGER.debug("Merging %s into %s", to_be_merged, merge_result)
+    self.trigger_context.merge_state(to_be_merged, merge_result)
+    for window in to_be_merged:
+      if window != merge_result:
+        self.merged_away[window] = merge_result
+        # Clear state associated with PaneInfo since it is
+        # not preserved across merges.
+        self.trigger_context.for_window(window).clear_state(None)
+    self.windowing.triggerfn.on_merge(
+        to_be_merged,
+        merge_result,
+        self.trigger_context.for_window(merge_result))
+
+
+@typehints.with_input_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+@typehints.with_output_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+class GeneralTriggerManagerDoFn(DoFn):
+  """A trigger manager that supports all windowing / triggering cases.
+
+  This implements a DoFn that manages triggering on a per-key basis. All
+  elements for a single key are processed together. Per-key state holds data
+  related to all windows.
+  """
+
+  KNOWN_WINDOWS = SetStateSpec('known_windows', IntervalWindowCoder())

Review comment:
       Done.

##########
File path: sdks/python/apache_beam/runners/portability/fn_api_runner/trigger_manager.py
##########
@@ -0,0 +1,459 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import collections
+import logging
+import typing
+from collections import defaultdict
+
+from apache_beam import typehints
+from apache_beam.coders import PickleCoder
+from apache_beam.coders import StrUtf8Coder
+from apache_beam.coders import TupleCoder
+from apache_beam.coders import VarIntCoder
+from apache_beam.coders.coders import IntervalWindowCoder
+from apache_beam.transforms import DoFn
+from apache_beam.transforms.core import Windowing
+from apache_beam.transforms.timeutil import TimeDomain
+from apache_beam.transforms.trigger import AccumulationMode
+from apache_beam.transforms.trigger import TriggerContext
+from apache_beam.transforms.trigger import _CombiningValueStateTag
+from apache_beam.transforms.trigger import _StateTag
+from apache_beam.transforms.userstate import AccumulatingRuntimeState
+from apache_beam.transforms.userstate import BagRuntimeState
+from apache_beam.transforms.userstate import BagStateSpec
+from apache_beam.transforms.userstate import CombiningValueStateSpec
+from apache_beam.transforms.userstate import RuntimeTimer
+from apache_beam.transforms.userstate import SetRuntimeState
+from apache_beam.transforms.userstate import SetStateSpec
+from apache_beam.transforms.userstate import TimerSpec
+from apache_beam.transforms.userstate import on_timer
+from apache_beam.transforms.window import BoundedWindow
+from apache_beam.transforms.window import GlobalWindow
+from apache_beam.transforms.window import Sessions
+from apache_beam.transforms.window import TimestampedValue
+from apache_beam.transforms.window import WindowFn
+from apache_beam.typehints import TypeCheckError
+from apache_beam.utils import windowed_value
+from apache_beam.utils.timestamp import MIN_TIMESTAMP
+from apache_beam.utils.timestamp import Timestamp
+from apache_beam.utils.windowed_value import WindowedValue
+
+_LOGGER = logging.getLogger(__name__)
+_LOGGER.setLevel(logging.DEBUG)
+
+K = typing.TypeVar('K')
+V = typing.TypeVar('V')
+
+
+class ReifyWindows(DoFn):
+  """Receives KV pairs, and wraps the values into WindowedValues."""
+  def process(
+      self, element, window=DoFn.WindowParam, timestamp=DoFn.TimestampParam):
+    try:
+      k, v = element
+    except TypeError:
+      raise TypeCheckError(
+          'Input to GroupByKey must be a PCollection with '
+          'elements compatible with KV[A, B]')
+
+    yield (k, windowed_value.WindowedValue(v, timestamp, [window]))
+
+
+class _GroupBundlesByKey(DoFn):
+  def start_bundle(self):
+    self.keys = defaultdict(list)
+
+  def process(self, element):
+    key, windowed_value = element
+    self.keys[key].append(windowed_value)
+
+  def finish_bundle(self):
+    for k, vals in self.keys.items():
+      yield windowed_value.WindowedValue((k, vals),
+                                         MIN_TIMESTAMP, [GlobalWindow()])
+
+
+def read_watermark(watermark_state):
+  try:
+    return watermark_state.read()
+  except ValueError:
+    watermark_state.add(MIN_TIMESTAMP)
+    return watermark_state.read()
+
+
+class TriggerMergeContext(WindowFn.MergeContext):
+  def __init__(
+      self, all_windows, context: 'FnRunnerStatefulTriggerContext', windowing):
+    super(TriggerMergeContext, self).__init__(all_windows)
+    self.trigger_context = context
+    self.windowing = windowing
+    self.merged_away: typing.Dict[BoundedWindow, BoundedWindow] = {}
+
+  def merge(self, to_be_merged, merge_result):
+    _LOGGER.debug("Merging %s into %s", to_be_merged, merge_result)
+    self.trigger_context.merge_state(to_be_merged, merge_result)
+    for window in to_be_merged:
+      if window != merge_result:
+        self.merged_away[window] = merge_result
+        # Clear state associated with PaneInfo since it is
+        # not preserved across merges.
+        self.trigger_context.for_window(window).clear_state(None)
+    self.windowing.triggerfn.on_merge(
+        to_be_merged,
+        merge_result,
+        self.trigger_context.for_window(merge_result))
+
+
+@typehints.with_input_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+@typehints.with_output_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+class GeneralTriggerManagerDoFn(DoFn):
+  """A trigger manager that supports all windowing / triggering cases.
+
+  This implements a DoFn that manages triggering on a per-key basis. All
+  elements for a single key are processed together. Per-key state holds data
+  related to all windows.
+  """
+
+  KNOWN_WINDOWS = SetStateSpec('known_windows', IntervalWindowCoder())
+  FINISHED_WINDOWS = SetStateSpec('finished_windows', IntervalWindowCoder())
+  LAST_KNOWN_TIME = CombiningValueStateSpec('last_known_time', combine_fn=max)
+  LAST_KNOWN_WATERMARK = CombiningValueStateSpec(
+      'last_known_watermark', combine_fn=max)
+
+  # TODO(pabloem) What's the coder for the elements/keys here?
+  WINDOW_ELEMENT_PAIRS = BagStateSpec(
+      'element_bag', TupleCoder([IntervalWindowCoder(), PickleCoder()]))
+  WINDOW_TAG_VALUES = BagStateSpec(
+      'per_window_per_tag_value_state',
+      TupleCoder([IntervalWindowCoder(), StrUtf8Coder(), VarIntCoder()]))
+
+  PROCESSING_TIME_TIMER = TimerSpec(
+      'processing_time_timer', TimeDomain.REAL_TIME)
+  WATERMARK_TIMER = TimerSpec('watermark_timer', TimeDomain.WATERMARK)
+
+  def __init__(self, windowing: Windowing):
+    self.windowing = windowing
+    # Only session windows are merging. Other windows are non-merging.
+    self.merging_windows = isinstance(windowing.windowfn, Sessions)
+
+  def process(
+      self,
+      element: typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]],
+      element_bag: BagRuntimeState = DoFn.StateParam(WINDOW_ELEMENT_PAIRS),  # type: ignore

Review comment:
       done.

##########
File path: sdks/python/apache_beam/runners/portability/fn_api_runner/trigger_manager.py
##########
@@ -0,0 +1,459 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import collections
+import logging
+import typing
+from collections import defaultdict
+
+from apache_beam import typehints
+from apache_beam.coders import PickleCoder
+from apache_beam.coders import StrUtf8Coder
+from apache_beam.coders import TupleCoder
+from apache_beam.coders import VarIntCoder
+from apache_beam.coders.coders import IntervalWindowCoder
+from apache_beam.transforms import DoFn
+from apache_beam.transforms.core import Windowing
+from apache_beam.transforms.timeutil import TimeDomain
+from apache_beam.transforms.trigger import AccumulationMode
+from apache_beam.transforms.trigger import TriggerContext
+from apache_beam.transforms.trigger import _CombiningValueStateTag
+from apache_beam.transforms.trigger import _StateTag
+from apache_beam.transforms.userstate import AccumulatingRuntimeState
+from apache_beam.transforms.userstate import BagRuntimeState
+from apache_beam.transforms.userstate import BagStateSpec
+from apache_beam.transforms.userstate import CombiningValueStateSpec
+from apache_beam.transforms.userstate import RuntimeTimer
+from apache_beam.transforms.userstate import SetRuntimeState
+from apache_beam.transforms.userstate import SetStateSpec
+from apache_beam.transforms.userstate import TimerSpec
+from apache_beam.transforms.userstate import on_timer
+from apache_beam.transforms.window import BoundedWindow
+from apache_beam.transforms.window import GlobalWindow
+from apache_beam.transforms.window import Sessions
+from apache_beam.transforms.window import TimestampedValue
+from apache_beam.transforms.window import WindowFn
+from apache_beam.typehints import TypeCheckError
+from apache_beam.utils import windowed_value
+from apache_beam.utils.timestamp import MIN_TIMESTAMP
+from apache_beam.utils.timestamp import Timestamp
+from apache_beam.utils.windowed_value import WindowedValue
+
+_LOGGER = logging.getLogger(__name__)
+_LOGGER.setLevel(logging.DEBUG)
+
+K = typing.TypeVar('K')
+V = typing.TypeVar('V')
+
+
+class ReifyWindows(DoFn):
+  """Receives KV pairs, and wraps the values into WindowedValues."""
+  def process(
+      self, element, window=DoFn.WindowParam, timestamp=DoFn.TimestampParam):
+    try:
+      k, v = element
+    except TypeError:
+      raise TypeCheckError(
+          'Input to GroupByKey must be a PCollection with '
+          'elements compatible with KV[A, B]')
+
+    yield (k, windowed_value.WindowedValue(v, timestamp, [window]))
+
+
+class _GroupBundlesByKey(DoFn):
+  def start_bundle(self):
+    self.keys = defaultdict(list)
+
+  def process(self, element):
+    key, windowed_value = element
+    self.keys[key].append(windowed_value)
+
+  def finish_bundle(self):
+    for k, vals in self.keys.items():
+      yield windowed_value.WindowedValue((k, vals),
+                                         MIN_TIMESTAMP, [GlobalWindow()])
+
+
+def read_watermark(watermark_state):
+  try:
+    return watermark_state.read()
+  except ValueError:
+    watermark_state.add(MIN_TIMESTAMP)
+    return watermark_state.read()
+
+
+class TriggerMergeContext(WindowFn.MergeContext):
+  def __init__(
+      self, all_windows, context: 'FnRunnerStatefulTriggerContext', windowing):
+    super(TriggerMergeContext, self).__init__(all_windows)
+    self.trigger_context = context
+    self.windowing = windowing
+    self.merged_away: typing.Dict[BoundedWindow, BoundedWindow] = {}
+
+  def merge(self, to_be_merged, merge_result):
+    _LOGGER.debug("Merging %s into %s", to_be_merged, merge_result)
+    self.trigger_context.merge_state(to_be_merged, merge_result)
+    for window in to_be_merged:
+      if window != merge_result:
+        self.merged_away[window] = merge_result
+        # Clear state associated with PaneInfo since it is
+        # not preserved across merges.
+        self.trigger_context.for_window(window).clear_state(None)
+    self.windowing.triggerfn.on_merge(
+        to_be_merged,
+        merge_result,
+        self.trigger_context.for_window(merge_result))
+
+
+@typehints.with_input_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+@typehints.with_output_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+class GeneralTriggerManagerDoFn(DoFn):
+  """A trigger manager that supports all windowing / triggering cases.
+
+  This implements a DoFn that manages triggering on a per-key basis. All
+  elements for a single key are processed together. Per-key state holds data
+  related to all windows.
+  """
+
+  KNOWN_WINDOWS = SetStateSpec('known_windows', IntervalWindowCoder())
+  FINISHED_WINDOWS = SetStateSpec('finished_windows', IntervalWindowCoder())
+  LAST_KNOWN_TIME = CombiningValueStateSpec('last_known_time', combine_fn=max)
+  LAST_KNOWN_WATERMARK = CombiningValueStateSpec(
+      'last_known_watermark', combine_fn=max)
+
+  # TODO(pabloem) What's the coder for the elements/keys here?
+  WINDOW_ELEMENT_PAIRS = BagStateSpec(
+      'element_bag', TupleCoder([IntervalWindowCoder(), PickleCoder()]))
+  WINDOW_TAG_VALUES = BagStateSpec(
+      'per_window_per_tag_value_state',
+      TupleCoder([IntervalWindowCoder(), StrUtf8Coder(), VarIntCoder()]))
+
+  PROCESSING_TIME_TIMER = TimerSpec(
+      'processing_time_timer', TimeDomain.REAL_TIME)
+  WATERMARK_TIMER = TimerSpec('watermark_timer', TimeDomain.WATERMARK)
+
+  def __init__(self, windowing: Windowing):
+    self.windowing = windowing
+    # Only session windows are merging. Other windows are non-merging.
+    self.merging_windows = isinstance(windowing.windowfn, Sessions)
+
+  def process(
+      self,
+      element: typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]],
+      element_bag: BagRuntimeState = DoFn.StateParam(WINDOW_ELEMENT_PAIRS),  # type: ignore
+      time_state: AccumulatingRuntimeState = DoFn.StateParam(LAST_KNOWN_TIME),  # type: ignore
+      watermark_state: AccumulatingRuntimeState = DoFn.StateParam(  # type: ignore
+          LAST_KNOWN_WATERMARK),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      windows_state: SetRuntimeState = DoFn.StateParam(KNOWN_WINDOWS),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER),
+      *args, **kwargs):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        current_time_state=time_state,
+        watermark_state=watermark_state,
+        elements_bag_state=element_bag,
+        window_tag_values_bag_state=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    key, windowed_values = element
+    watermark = read_watermark(watermark_state)
+
+    windows_to_elements = collections.defaultdict(list)
+    for wv in windowed_values:
+      for window in wv.windows:
+        # ignore expired windows
+        if watermark > window.end + self.windowing.allowed_lateness:
+          continue
+        if window in finished_windows_state.read():
+          continue
+        windows_to_elements[window].append(
+            TimestampedValue(wv.value, wv.timestamp))
+
+    # Process merging of windows
+    if self.merging_windows:
+      old_windows = set(windows_state.read())
+      all_windows = old_windows.union(list(windows_to_elements))
+      if all_windows != old_windows:
+        merge_context = TriggerMergeContext(
+            all_windows, context, self.windowing)
+        self.windowing.windowfn.merge(merge_context)
+
+        merged_windows_to_elements = collections.defaultdict(list)
+        for window, values in windows_to_elements.items():
+          while window in merge_context.merged_away:
+            window = merge_context.merged_away[window]
+          merged_windows_to_elements[window].extend(values)
+        windows_to_elements = merged_windows_to_elements
+
+      if old_windows != all_windows:

Review comment:
       removed check.

##########
File path: sdks/python/apache_beam/runners/portability/fn_api_runner/trigger_manager.py
##########
@@ -0,0 +1,459 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import collections
+import logging
+import typing
+from collections import defaultdict
+
+from apache_beam import typehints
+from apache_beam.coders import PickleCoder
+from apache_beam.coders import StrUtf8Coder
+from apache_beam.coders import TupleCoder
+from apache_beam.coders import VarIntCoder
+from apache_beam.coders.coders import IntervalWindowCoder
+from apache_beam.transforms import DoFn
+from apache_beam.transforms.core import Windowing
+from apache_beam.transforms.timeutil import TimeDomain
+from apache_beam.transforms.trigger import AccumulationMode
+from apache_beam.transforms.trigger import TriggerContext
+from apache_beam.transforms.trigger import _CombiningValueStateTag
+from apache_beam.transforms.trigger import _StateTag
+from apache_beam.transforms.userstate import AccumulatingRuntimeState
+from apache_beam.transforms.userstate import BagRuntimeState
+from apache_beam.transforms.userstate import BagStateSpec
+from apache_beam.transforms.userstate import CombiningValueStateSpec
+from apache_beam.transforms.userstate import RuntimeTimer
+from apache_beam.transforms.userstate import SetRuntimeState
+from apache_beam.transforms.userstate import SetStateSpec
+from apache_beam.transforms.userstate import TimerSpec
+from apache_beam.transforms.userstate import on_timer
+from apache_beam.transforms.window import BoundedWindow
+from apache_beam.transforms.window import GlobalWindow
+from apache_beam.transforms.window import Sessions
+from apache_beam.transforms.window import TimestampedValue
+from apache_beam.transforms.window import WindowFn
+from apache_beam.typehints import TypeCheckError
+from apache_beam.utils import windowed_value
+from apache_beam.utils.timestamp import MIN_TIMESTAMP
+from apache_beam.utils.timestamp import Timestamp
+from apache_beam.utils.windowed_value import WindowedValue
+
+_LOGGER = logging.getLogger(__name__)
+_LOGGER.setLevel(logging.DEBUG)
+
+K = typing.TypeVar('K')
+V = typing.TypeVar('V')
+
+
+class ReifyWindows(DoFn):
+  """Receives KV pairs, and wraps the values into WindowedValues."""
+  def process(
+      self, element, window=DoFn.WindowParam, timestamp=DoFn.TimestampParam):
+    try:
+      k, v = element
+    except TypeError:
+      raise TypeCheckError(
+          'Input to GroupByKey must be a PCollection with '
+          'elements compatible with KV[A, B]')
+
+    yield (k, windowed_value.WindowedValue(v, timestamp, [window]))
+
+
+class _GroupBundlesByKey(DoFn):
+  def start_bundle(self):
+    self.keys = defaultdict(list)
+
+  def process(self, element):
+    key, windowed_value = element
+    self.keys[key].append(windowed_value)
+
+  def finish_bundle(self):
+    for k, vals in self.keys.items():
+      yield windowed_value.WindowedValue((k, vals),
+                                         MIN_TIMESTAMP, [GlobalWindow()])
+
+
+def read_watermark(watermark_state):
+  try:
+    return watermark_state.read()
+  except ValueError:
+    watermark_state.add(MIN_TIMESTAMP)
+    return watermark_state.read()
+
+
+class TriggerMergeContext(WindowFn.MergeContext):
+  def __init__(
+      self, all_windows, context: 'FnRunnerStatefulTriggerContext', windowing):
+    super(TriggerMergeContext, self).__init__(all_windows)
+    self.trigger_context = context
+    self.windowing = windowing
+    self.merged_away: typing.Dict[BoundedWindow, BoundedWindow] = {}
+
+  def merge(self, to_be_merged, merge_result):
+    _LOGGER.debug("Merging %s into %s", to_be_merged, merge_result)
+    self.trigger_context.merge_state(to_be_merged, merge_result)
+    for window in to_be_merged:
+      if window != merge_result:
+        self.merged_away[window] = merge_result
+        # Clear state associated with PaneInfo since it is
+        # not preserved across merges.
+        self.trigger_context.for_window(window).clear_state(None)
+    self.windowing.triggerfn.on_merge(
+        to_be_merged,
+        merge_result,
+        self.trigger_context.for_window(merge_result))
+
+
+@typehints.with_input_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+@typehints.with_output_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+class GeneralTriggerManagerDoFn(DoFn):
+  """A trigger manager that supports all windowing / triggering cases.
+
+  This implements a DoFn that manages triggering on a per-key basis. All
+  elements for a single key are processed together. Per-key state holds data
+  related to all windows.
+  """
+
+  KNOWN_WINDOWS = SetStateSpec('known_windows', IntervalWindowCoder())
+  FINISHED_WINDOWS = SetStateSpec('finished_windows', IntervalWindowCoder())
+  LAST_KNOWN_TIME = CombiningValueStateSpec('last_known_time', combine_fn=max)
+  LAST_KNOWN_WATERMARK = CombiningValueStateSpec(
+      'last_known_watermark', combine_fn=max)
+
+  # TODO(pabloem) What's the coder for the elements/keys here?
+  WINDOW_ELEMENT_PAIRS = BagStateSpec(
+      'element_bag', TupleCoder([IntervalWindowCoder(), PickleCoder()]))
+  WINDOW_TAG_VALUES = BagStateSpec(
+      'per_window_per_tag_value_state',
+      TupleCoder([IntervalWindowCoder(), StrUtf8Coder(), VarIntCoder()]))
+
+  PROCESSING_TIME_TIMER = TimerSpec(
+      'processing_time_timer', TimeDomain.REAL_TIME)
+  WATERMARK_TIMER = TimerSpec('watermark_timer', TimeDomain.WATERMARK)
+
+  def __init__(self, windowing: Windowing):
+    self.windowing = windowing
+    # Only session windows are merging. Other windows are non-merging.
+    self.merging_windows = isinstance(windowing.windowfn, Sessions)
+
+  def process(
+      self,
+      element: typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]],
+      element_bag: BagRuntimeState = DoFn.StateParam(WINDOW_ELEMENT_PAIRS),  # type: ignore
+      time_state: AccumulatingRuntimeState = DoFn.StateParam(LAST_KNOWN_TIME),  # type: ignore
+      watermark_state: AccumulatingRuntimeState = DoFn.StateParam(  # type: ignore
+          LAST_KNOWN_WATERMARK),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      windows_state: SetRuntimeState = DoFn.StateParam(KNOWN_WINDOWS),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER),
+      *args, **kwargs):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        current_time_state=time_state,
+        watermark_state=watermark_state,
+        elements_bag_state=element_bag,
+        window_tag_values_bag_state=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    key, windowed_values = element
+    watermark = read_watermark(watermark_state)
+
+    windows_to_elements = collections.defaultdict(list)
+    for wv in windowed_values:
+      for window in wv.windows:
+        # ignore expired windows
+        if watermark > window.end + self.windowing.allowed_lateness:
+          continue
+        if window in finished_windows_state.read():
+          continue
+        windows_to_elements[window].append(
+            TimestampedValue(wv.value, wv.timestamp))
+
+    # Process merging of windows
+    if self.merging_windows:
+      old_windows = set(windows_state.read())
+      all_windows = old_windows.union(list(windows_to_elements))
+      if all_windows != old_windows:
+        merge_context = TriggerMergeContext(
+            all_windows, context, self.windowing)
+        self.windowing.windowfn.merge(merge_context)
+
+        merged_windows_to_elements = collections.defaultdict(list)
+        for window, values in windows_to_elements.items():
+          while window in merge_context.merged_away:
+            window = merge_context.merged_away[window]
+          merged_windows_to_elements[window].extend(values)
+        windows_to_elements = merged_windows_to_elements
+
+      if old_windows != all_windows:
+        for w in windows_to_elements:
+          windows_state.add(w)
+    # Done processing merging of windows
+
+    seen_windows = set()
+    for w in windows_to_elements:
+      window_context = context.for_window(w)
+      seen_windows.add(w)
+      for value_w_timestamp in windows_to_elements[w]:
+        _LOGGER.debug(value_w_timestamp)
+        element_bag.add((w, value_w_timestamp))
+        self.windowing.triggerfn.on_element(windowed_values, w, window_context)
+
+    return self._trigger_fire(
+        key, TimeDomain.WATERMARK, watermark, None, context, seen_windows)
+
+  def _trigger_fire(
+      self,
+      key: K,
+      time_domain,
+      timestamp: Timestamp,
+      timer_tag: typing.Optional[str],
+      context: 'FnRunnerStatefulTriggerContext',
+      windows_of_interest: typing.Optional[typing.Set[BoundedWindow]] = None):
+    windows_to_elements = context.windows_to_elements_map()
+    context.elements_bag_state.clear()
+
+    fired_windows = set()
+    finished_windows = set()
+    _LOGGER.debug(
+        '%s - tag %s - timestamp %s', time_domain, timer_tag, timestamp)
+    for w, elems in windows_to_elements.items():
+      if windows_of_interest is not None and w not in windows_of_interest:
+        # windows_of_interest=None means that we care about all windows.
+        # If we care only about some windows, and this window is not one of
+        # them, then we do not intend to fire this window.
+        continue
+      window_context = context.for_window(w)
+      if self.windowing.triggerfn.should_fire(time_domain,
+                                              timestamp,
+                                              w,
+                                              window_context):
+        finished = self.windowing.triggerfn.on_fire(
+            timestamp, w, window_context)
+        _LOGGER.debug('Firing on window %s. Finished: %s', w, finished)
+        fired_windows.add(w)
+        if finished:
+          context.finished_windows_state.add(w)
+          finished_windows.add(w)
+        # TODO(pabloem): Format the output: e.g. pane info
+        elems = [WindowedValue(e.value, e.timestamp, (w, )) for e in elems]
+        yield (key, elems)
+
+    # Add elements that were not fired back into state.
+    for w, elems in windows_to_elements.items():
+      for e in elems:
+        if (w in finished_windows or
+            (w in fired_windows and
+             self.windowing.accumulation_mode == AccumulationMode.DISCARDING)):
+          continue
+        context.elements_bag_state.add((w, e))
+
+  @on_timer(PROCESSING_TIME_TIMER)
+  def processing_time_trigger(
+      self,
+      key=DoFn.KeyParam,
+      timer_tag=DoFn.DynamicTimerTagParam,
+      timestamp=DoFn.TimestampParam,
+      processing_time_state=DoFn.StateParam(LAST_KNOWN_TIME),
+      element_bag=DoFn.StateParam(WINDOW_ELEMENT_PAIRS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER)):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        current_time_state=processing_time_state,
+        watermark_state=None,
+        elements_bag_state=element_bag,
+        window_tag_values_bag_state=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    result = self._trigger_fire(
+        key, TimeDomain.REAL_TIME, timestamp, timer_tag, context)
+    processing_time_state.add(timestamp)
+    return result
+
+  @on_timer(WATERMARK_TIMER)
+  def watermark_trigger(
+      self,
+      key=DoFn.KeyParam,
+      timer_tag=DoFn.DynamicTimerTagParam,
+      timestamp=DoFn.TimestampParam,
+      watermark_state=DoFn.StateParam(LAST_KNOWN_WATERMARK),
+      element_bag=DoFn.StateParam(WINDOW_ELEMENT_PAIRS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER)):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        current_time_state=None,
+        watermark_state=watermark_state,
+        elements_bag_state=element_bag,
+        window_tag_values_bag_state=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    result = self._trigger_fire(
+        key, TimeDomain.WATERMARK, timestamp, timer_tag, context)
+    watermark_state.add(timestamp)
+    return result
+
+
+class FnRunnerStatefulTriggerContext(TriggerContext):
+  def __init__(
+      self,
+      processing_time_timer: RuntimeTimer,
+      watermark_timer: RuntimeTimer,
+      current_time_state: typing.Optional[AccumulatingRuntimeState],
+      watermark_state: typing.Optional[AccumulatingRuntimeState],
+      elements_bag_state: BagRuntimeState,
+      window_tag_values_bag_state: BagRuntimeState,
+      finished_windows_state: SetRuntimeState):
+    self.timers = {
+        TimeDomain.REAL_TIME: processing_time_timer,
+        TimeDomain.WATERMARK: watermark_timer
+    }
+    self.current_times = {
+        TimeDomain.REAL_TIME: current_time_state,
+        TimeDomain.WATERMARK: watermark_state
+    }
+    self.elements_bag_state = elements_bag_state
+    self.window_tag_values_bag_state = window_tag_values_bag_state
+    self.finished_windows_state = finished_windows_state
+
+  def windows_to_elements_map(
+      self
+  ) -> typing.Dict[BoundedWindow, typing.List[windowed_value.WindowedValue]]:
+    window_element_pairs: typing.Iterable[typing.Tuple[
+        BoundedWindow,
+        windowed_value.WindowedValue]] = self.elements_bag_state.read()
+    result: typing.Dict[BoundedWindow,
+                        typing.List[windowed_value.WindowedValue]] = {}
+    for w, e in window_element_pairs:
+      if w not in result:
+        result[w] = []
+      result[w].append(e)
+    return result
+
+  def for_window(self, window):
+    return PerWindowTriggerContext(window, self)
+
+  def get_current_time(self):
+    return self.current_times[TimeDomain.REAL_TIME].read()
+
+  def set_timer(self, name, time_domain, timestamp):
+    _LOGGER.debug('Setting timer (%s, %s) at %s', time_domain, name, timestamp)
+    self.timers[time_domain].set(timestamp, dynamic_timer_tag=name)
+
+  def clear_timer(self, name, time_domain):
+    _LOGGER.debug('Clearing timer (%s, %s)', time_domain, name)
+    self.timers[time_domain].clear(dynamic_timer_tag=name)
+
+  def merge_state(self, to_be_merged, merge_result):

Review comment:
       done.

##########
File path: sdks/python/apache_beam/runners/portability/fn_api_runner/trigger_manager.py
##########
@@ -0,0 +1,459 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import collections
+import logging
+import typing
+from collections import defaultdict
+
+from apache_beam import typehints
+from apache_beam.coders import PickleCoder
+from apache_beam.coders import StrUtf8Coder
+from apache_beam.coders import TupleCoder
+from apache_beam.coders import VarIntCoder
+from apache_beam.coders.coders import IntervalWindowCoder
+from apache_beam.transforms import DoFn
+from apache_beam.transforms.core import Windowing
+from apache_beam.transforms.timeutil import TimeDomain
+from apache_beam.transforms.trigger import AccumulationMode
+from apache_beam.transforms.trigger import TriggerContext
+from apache_beam.transforms.trigger import _CombiningValueStateTag
+from apache_beam.transforms.trigger import _StateTag
+from apache_beam.transforms.userstate import AccumulatingRuntimeState
+from apache_beam.transforms.userstate import BagRuntimeState
+from apache_beam.transforms.userstate import BagStateSpec
+from apache_beam.transforms.userstate import CombiningValueStateSpec
+from apache_beam.transforms.userstate import RuntimeTimer
+from apache_beam.transforms.userstate import SetRuntimeState
+from apache_beam.transforms.userstate import SetStateSpec
+from apache_beam.transforms.userstate import TimerSpec
+from apache_beam.transforms.userstate import on_timer
+from apache_beam.transforms.window import BoundedWindow
+from apache_beam.transforms.window import GlobalWindow
+from apache_beam.transforms.window import Sessions
+from apache_beam.transforms.window import TimestampedValue
+from apache_beam.transforms.window import WindowFn
+from apache_beam.typehints import TypeCheckError
+from apache_beam.utils import windowed_value
+from apache_beam.utils.timestamp import MIN_TIMESTAMP
+from apache_beam.utils.timestamp import Timestamp
+from apache_beam.utils.windowed_value import WindowedValue
+
+_LOGGER = logging.getLogger(__name__)
+_LOGGER.setLevel(logging.DEBUG)
+
+K = typing.TypeVar('K')
+V = typing.TypeVar('V')
+
+
+class ReifyWindows(DoFn):
+  """Receives KV pairs, and wraps the values into WindowedValues."""
+  def process(
+      self, element, window=DoFn.WindowParam, timestamp=DoFn.TimestampParam):
+    try:
+      k, v = element
+    except TypeError:
+      raise TypeCheckError(
+          'Input to GroupByKey must be a PCollection with '
+          'elements compatible with KV[A, B]')
+
+    yield (k, windowed_value.WindowedValue(v, timestamp, [window]))
+
+
+class _GroupBundlesByKey(DoFn):
+  def start_bundle(self):
+    self.keys = defaultdict(list)
+
+  def process(self, element):
+    key, windowed_value = element
+    self.keys[key].append(windowed_value)
+
+  def finish_bundle(self):
+    for k, vals in self.keys.items():
+      yield windowed_value.WindowedValue((k, vals),
+                                         MIN_TIMESTAMP, [GlobalWindow()])
+
+
+def read_watermark(watermark_state):
+  try:
+    return watermark_state.read()
+  except ValueError:
+    watermark_state.add(MIN_TIMESTAMP)
+    return watermark_state.read()
+
+
+class TriggerMergeContext(WindowFn.MergeContext):
+  def __init__(
+      self, all_windows, context: 'FnRunnerStatefulTriggerContext', windowing):
+    super(TriggerMergeContext, self).__init__(all_windows)
+    self.trigger_context = context
+    self.windowing = windowing
+    self.merged_away: typing.Dict[BoundedWindow, BoundedWindow] = {}
+
+  def merge(self, to_be_merged, merge_result):
+    _LOGGER.debug("Merging %s into %s", to_be_merged, merge_result)
+    self.trigger_context.merge_state(to_be_merged, merge_result)
+    for window in to_be_merged:
+      if window != merge_result:
+        self.merged_away[window] = merge_result
+        # Clear state associated with PaneInfo since it is
+        # not preserved across merges.
+        self.trigger_context.for_window(window).clear_state(None)
+    self.windowing.triggerfn.on_merge(
+        to_be_merged,
+        merge_result,
+        self.trigger_context.for_window(merge_result))
+
+
+@typehints.with_input_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+@typehints.with_output_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+class GeneralTriggerManagerDoFn(DoFn):
+  """A trigger manager that supports all windowing / triggering cases.
+
+  This implements a DoFn that manages triggering on a per-key basis. All
+  elements for a single key are processed together. Per-key state holds data
+  related to all windows.
+  """
+
+  KNOWN_WINDOWS = SetStateSpec('known_windows', IntervalWindowCoder())
+  FINISHED_WINDOWS = SetStateSpec('finished_windows', IntervalWindowCoder())
+  LAST_KNOWN_TIME = CombiningValueStateSpec('last_known_time', combine_fn=max)
+  LAST_KNOWN_WATERMARK = CombiningValueStateSpec(
+      'last_known_watermark', combine_fn=max)
+
+  # TODO(pabloem) What's the coder for the elements/keys here?
+  WINDOW_ELEMENT_PAIRS = BagStateSpec(
+      'element_bag', TupleCoder([IntervalWindowCoder(), PickleCoder()]))
+  WINDOW_TAG_VALUES = BagStateSpec(
+      'per_window_per_tag_value_state',
+      TupleCoder([IntervalWindowCoder(), StrUtf8Coder(), VarIntCoder()]))
+
+  PROCESSING_TIME_TIMER = TimerSpec(
+      'processing_time_timer', TimeDomain.REAL_TIME)
+  WATERMARK_TIMER = TimerSpec('watermark_timer', TimeDomain.WATERMARK)
+
+  def __init__(self, windowing: Windowing):
+    self.windowing = windowing
+    # Only session windows are merging. Other windows are non-merging.
+    self.merging_windows = isinstance(windowing.windowfn, Sessions)
+
+  def process(
+      self,
+      element: typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]],
+      element_bag: BagRuntimeState = DoFn.StateParam(WINDOW_ELEMENT_PAIRS),  # type: ignore
+      time_state: AccumulatingRuntimeState = DoFn.StateParam(LAST_KNOWN_TIME),  # type: ignore
+      watermark_state: AccumulatingRuntimeState = DoFn.StateParam(  # type: ignore
+          LAST_KNOWN_WATERMARK),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      windows_state: SetRuntimeState = DoFn.StateParam(KNOWN_WINDOWS),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER),
+      *args, **kwargs):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        current_time_state=time_state,
+        watermark_state=watermark_state,
+        elements_bag_state=element_bag,
+        window_tag_values_bag_state=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    key, windowed_values = element
+    watermark = read_watermark(watermark_state)
+
+    windows_to_elements = collections.defaultdict(list)
+    for wv in windowed_values:
+      for window in wv.windows:
+        # ignore expired windows
+        if watermark > window.end + self.windowing.allowed_lateness:
+          continue
+        if window in finished_windows_state.read():
+          continue
+        windows_to_elements[window].append(
+            TimestampedValue(wv.value, wv.timestamp))
+
+    # Process merging of windows
+    if self.merging_windows:
+      old_windows = set(windows_state.read())
+      all_windows = old_windows.union(list(windows_to_elements))
+      if all_windows != old_windows:
+        merge_context = TriggerMergeContext(
+            all_windows, context, self.windowing)
+        self.windowing.windowfn.merge(merge_context)
+
+        merged_windows_to_elements = collections.defaultdict(list)
+        for window, values in windows_to_elements.items():
+          while window in merge_context.merged_away:
+            window = merge_context.merged_away[window]
+          merged_windows_to_elements[window].extend(values)
+        windows_to_elements = merged_windows_to_elements
+
+      if old_windows != all_windows:
+        for w in windows_to_elements:
+          windows_state.add(w)
+    # Done processing merging of windows
+
+    seen_windows = set()
+    for w in windows_to_elements:
+      window_context = context.for_window(w)
+      seen_windows.add(w)
+      for value_w_timestamp in windows_to_elements[w]:
+        _LOGGER.debug(value_w_timestamp)
+        element_bag.add((w, value_w_timestamp))
+        self.windowing.triggerfn.on_element(windowed_values, w, window_context)
+
+    return self._trigger_fire(
+        key, TimeDomain.WATERMARK, watermark, None, context, seen_windows)
+
+  def _trigger_fire(
+      self,
+      key: K,
+      time_domain,
+      timestamp: Timestamp,
+      timer_tag: typing.Optional[str],
+      context: 'FnRunnerStatefulTriggerContext',
+      windows_of_interest: typing.Optional[typing.Set[BoundedWindow]] = None):
+    windows_to_elements = context.windows_to_elements_map()
+    context.elements_bag_state.clear()
+
+    fired_windows = set()
+    finished_windows = set()
+    _LOGGER.debug(
+        '%s - tag %s - timestamp %s', time_domain, timer_tag, timestamp)
+    for w, elems in windows_to_elements.items():
+      if windows_of_interest is not None and w not in windows_of_interest:
+        # windows_of_interest=None means that we care about all windows.
+        # If we care only about some windows, and this window is not one of
+        # them, then we do not intend to fire this window.
+        continue
+      window_context = context.for_window(w)
+      if self.windowing.triggerfn.should_fire(time_domain,
+                                              timestamp,
+                                              w,
+                                              window_context):
+        finished = self.windowing.triggerfn.on_fire(
+            timestamp, w, window_context)
+        _LOGGER.debug('Firing on window %s. Finished: %s', w, finished)
+        fired_windows.add(w)
+        if finished:
+          context.finished_windows_state.add(w)
+          finished_windows.add(w)
+        # TODO(pabloem): Format the output: e.g. pane info
+        elems = [WindowedValue(e.value, e.timestamp, (w, )) for e in elems]
+        yield (key, elems)
+
+    # Add elements that were not fired back into state.
+    for w, elems in windows_to_elements.items():
+      for e in elems:
+        if (w in finished_windows or
+            (w in fired_windows and
+             self.windowing.accumulation_mode == AccumulationMode.DISCARDING)):
+          continue
+        context.elements_bag_state.add((w, e))
+
+  @on_timer(PROCESSING_TIME_TIMER)
+  def processing_time_trigger(
+      self,
+      key=DoFn.KeyParam,
+      timer_tag=DoFn.DynamicTimerTagParam,
+      timestamp=DoFn.TimestampParam,
+      processing_time_state=DoFn.StateParam(LAST_KNOWN_TIME),
+      element_bag=DoFn.StateParam(WINDOW_ELEMENT_PAIRS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER)):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        current_time_state=processing_time_state,
+        watermark_state=None,
+        elements_bag_state=element_bag,
+        window_tag_values_bag_state=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    result = self._trigger_fire(
+        key, TimeDomain.REAL_TIME, timestamp, timer_tag, context)
+    processing_time_state.add(timestamp)
+    return result
+
+  @on_timer(WATERMARK_TIMER)
+  def watermark_trigger(
+      self,
+      key=DoFn.KeyParam,
+      timer_tag=DoFn.DynamicTimerTagParam,
+      timestamp=DoFn.TimestampParam,
+      watermark_state=DoFn.StateParam(LAST_KNOWN_WATERMARK),
+      element_bag=DoFn.StateParam(WINDOW_ELEMENT_PAIRS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER)):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        current_time_state=None,
+        watermark_state=watermark_state,
+        elements_bag_state=element_bag,
+        window_tag_values_bag_state=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    result = self._trigger_fire(
+        key, TimeDomain.WATERMARK, timestamp, timer_tag, context)
+    watermark_state.add(timestamp)
+    return result
+
+
+class FnRunnerStatefulTriggerContext(TriggerContext):
+  def __init__(
+      self,
+      processing_time_timer: RuntimeTimer,
+      watermark_timer: RuntimeTimer,
+      current_time_state: typing.Optional[AccumulatingRuntimeState],
+      watermark_state: typing.Optional[AccumulatingRuntimeState],
+      elements_bag_state: BagRuntimeState,
+      window_tag_values_bag_state: BagRuntimeState,
+      finished_windows_state: SetRuntimeState):
+    self.timers = {
+        TimeDomain.REAL_TIME: processing_time_timer,
+        TimeDomain.WATERMARK: watermark_timer
+    }
+    self.current_times = {
+        TimeDomain.REAL_TIME: current_time_state,
+        TimeDomain.WATERMARK: watermark_state
+    }
+    self.elements_bag_state = elements_bag_state
+    self.window_tag_values_bag_state = window_tag_values_bag_state
+    self.finished_windows_state = finished_windows_state
+
+  def windows_to_elements_map(
+      self
+  ) -> typing.Dict[BoundedWindow, typing.List[windowed_value.WindowedValue]]:
+    window_element_pairs: typing.Iterable[typing.Tuple[
+        BoundedWindow,
+        windowed_value.WindowedValue]] = self.elements_bag_state.read()
+    result: typing.Dict[BoundedWindow,
+                        typing.List[windowed_value.WindowedValue]] = {}
+    for w, e in window_element_pairs:
+      if w not in result:
+        result[w] = []
+      result[w].append(e)
+    return result
+
+  def for_window(self, window):
+    return PerWindowTriggerContext(window, self)
+
+  def get_current_time(self):
+    return self.current_times[TimeDomain.REAL_TIME].read()
+
+  def set_timer(self, name, time_domain, timestamp):
+    _LOGGER.debug('Setting timer (%s, %s) at %s', time_domain, name, timestamp)
+    self.timers[time_domain].set(timestamp, dynamic_timer_tag=name)
+
+  def clear_timer(self, name, time_domain):
+    _LOGGER.debug('Clearing timer (%s, %s)', time_domain, name)
+    self.timers[time_domain].clear(dynamic_timer_tag=name)
+
+  def merge_state(self, to_be_merged, merge_result):
+    all_triplets = self.window_tag_values_bag_state.read()

Review comment:
       That's correct! The triplets are used as context for a trigger_fn. Consider the following trigger:
   
   ```
   AfterEach(
     AfterCount(3),
     AfterCount(2),
     AfterWatermark())
   ```
   
   And let's consider input data like so:  [1, 2, 3, 4, 5, 6, WATERMARK PASSES]
   
   The triplets would work like so:
   
   | Input | Triplets after input | Notes |
   |---------|-------------------------|------|
   |  1   |   `[('count', '[window]', 1)]`  | Triggers have not been matched. Elements being counted. |
   |  2   |   `[('count', '[window]', 1), ('count', '[window]', 1)]`  | Next element counted |
   |  3   |   `[('matched', '[window]', 1)]`  | Count reached 3, so we fire and clean up after firing. Now one counter is matched. |
    |  4   |   `[('matched', '[window]', 1), ('count', '[window]', 1)]`  | The first counter is already matched; this element starts counting toward the second `AfterCount`. |
   |  5   |   `[('matched', '[window]', 1), ('matched', '[window]', 1)]`  | Count reached 2, so we fire and clean up after firing. Now one more counter is matched. |
   |  6   |   `[('matched', '[window]', 1), ('matched', '[window]', 1)]`  | Watermark triggerfn only keeps state for early/late fires. |
   |  WATERMARK   |   `[]`  | Clean up and fire last trigger. |
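    
    For concreteness, here is a minimal sketch (not part of this PR) of how the trigger above would be handed to the trigger manager. `FixedWindows(10)` is an arbitrary stand-in for `[window]` in the table, and the `trigger_manager` import assumes the module path added in this PR:
    
    ```python
    from apache_beam.runners.portability.fn_api_runner import trigger_manager
    from apache_beam.transforms.core import Windowing
    from apache_beam.transforms.trigger import AccumulationMode
    from apache_beam.transforms.trigger import AfterCount
    from apache_beam.transforms.trigger import AfterEach
    from apache_beam.transforms.trigger import AfterWatermark
    from apache_beam.transforms.window import FixedWindows
    
    # The composite trigger from the table: fire after 3 elements, then after
    # 2 more, then once the watermark passes the end of the window.
    windowing = Windowing(
        FixedWindows(10),
        triggerfn=AfterEach(AfterCount(3), AfterCount(2), AfterWatermark()),
        accumulation_mode=AccumulationMode.DISCARDING)
    
    # The per-window, per-tag counters shown in the table live in this DoFn's
    # WINDOW_TAG_VALUES bag state while it drives the trigger.
    fn = trigger_manager.GeneralTriggerManagerDoFn(windowing)
    ```
    
    (`Sessions` would work the same way; that is the merging case the DoFn special-cases via `merging_windows`.)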
   
   Let me know if that helps.

##########
File path: sdks/python/apache_beam/runners/portability/fn_api_runner/trigger_manager.py
##########
@@ -0,0 +1,459 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import collections
+import logging
+import typing
+from collections import defaultdict
+
+from apache_beam import typehints
+from apache_beam.coders import PickleCoder
+from apache_beam.coders import StrUtf8Coder
+from apache_beam.coders import TupleCoder
+from apache_beam.coders import VarIntCoder
+from apache_beam.coders.coders import IntervalWindowCoder
+from apache_beam.transforms import DoFn
+from apache_beam.transforms.core import Windowing
+from apache_beam.transforms.timeutil import TimeDomain
+from apache_beam.transforms.trigger import AccumulationMode
+from apache_beam.transforms.trigger import TriggerContext
+from apache_beam.transforms.trigger import _CombiningValueStateTag
+from apache_beam.transforms.trigger import _StateTag
+from apache_beam.transforms.userstate import AccumulatingRuntimeState
+from apache_beam.transforms.userstate import BagRuntimeState
+from apache_beam.transforms.userstate import BagStateSpec
+from apache_beam.transforms.userstate import CombiningValueStateSpec
+from apache_beam.transforms.userstate import RuntimeTimer
+from apache_beam.transforms.userstate import SetRuntimeState
+from apache_beam.transforms.userstate import SetStateSpec
+from apache_beam.transforms.userstate import TimerSpec
+from apache_beam.transforms.userstate import on_timer
+from apache_beam.transforms.window import BoundedWindow
+from apache_beam.transforms.window import GlobalWindow
+from apache_beam.transforms.window import Sessions
+from apache_beam.transforms.window import TimestampedValue
+from apache_beam.transforms.window import WindowFn
+from apache_beam.typehints import TypeCheckError
+from apache_beam.utils import windowed_value
+from apache_beam.utils.timestamp import MIN_TIMESTAMP
+from apache_beam.utils.timestamp import Timestamp
+from apache_beam.utils.windowed_value import WindowedValue
+
+_LOGGER = logging.getLogger(__name__)
+_LOGGER.setLevel(logging.DEBUG)
+
+K = typing.TypeVar('K')
+V = typing.TypeVar('V')
+
+
+class ReifyWindows(DoFn):
+  """Receives KV pairs, and wraps the values into WindowedValues."""
+  def process(
+      self, element, window=DoFn.WindowParam, timestamp=DoFn.TimestampParam):
+    try:
+      k, v = element
+    except TypeError:
+      raise TypeCheckError(
+          'Input to GroupByKey must be a PCollection with '
+          'elements compatible with KV[A, B]')
+
+    yield (k, windowed_value.WindowedValue(v, timestamp, [window]))
+
+
+class _GroupBundlesByKey(DoFn):
+  def start_bundle(self):
+    self.keys = defaultdict(list)
+
+  def process(self, element):
+    key, windowed_value = element
+    self.keys[key].append(windowed_value)
+
+  def finish_bundle(self):
+    for k, vals in self.keys.items():
+      yield windowed_value.WindowedValue((k, vals),
+                                         MIN_TIMESTAMP, [GlobalWindow()])
+
+
+def read_watermark(watermark_state):
+  try:
+    return watermark_state.read()
+  except ValueError:
+    watermark_state.add(MIN_TIMESTAMP)
+    return watermark_state.read()
+
+
+class TriggerMergeContext(WindowFn.MergeContext):
+  def __init__(
+      self, all_windows, context: 'FnRunnerStatefulTriggerContext', windowing):
+    super(TriggerMergeContext, self).__init__(all_windows)
+    self.trigger_context = context
+    self.windowing = windowing
+    self.merged_away: typing.Dict[BoundedWindow, BoundedWindow] = {}
+
+  def merge(self, to_be_merged, merge_result):
+    _LOGGER.debug("Merging %s into %s", to_be_merged, merge_result)
+    self.trigger_context.merge_state(to_be_merged, merge_result)
+    for window in to_be_merged:
+      if window != merge_result:
+        self.merged_away[window] = merge_result
+        # Clear state associated with PaneInfo since it is
+        # not preserved across merges.
+        self.trigger_context.for_window(window).clear_state(None)
+    self.windowing.triggerfn.on_merge(
+        to_be_merged,
+        merge_result,
+        self.trigger_context.for_window(merge_result))
+
+
+@typehints.with_input_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+@typehints.with_output_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+class GeneralTriggerManagerDoFn(DoFn):
+  """A trigger manager that supports all windowing / triggering cases.
+
+  This implements a DoFn that manages triggering on a per-key basis. All
+  elements for a single key are processed together. Per-key state holds data
+  related to all windows.
+  """
+
+  KNOWN_WINDOWS = SetStateSpec('known_windows', IntervalWindowCoder())
+  FINISHED_WINDOWS = SetStateSpec('finished_windows', IntervalWindowCoder())
+  LAST_KNOWN_TIME = CombiningValueStateSpec('last_known_time', combine_fn=max)
+  LAST_KNOWN_WATERMARK = CombiningValueStateSpec(
+      'last_known_watermark', combine_fn=max)
+
+  # TODO(pabloem) What's the coder for the elements/keys here?
+  WINDOW_ELEMENT_PAIRS = BagStateSpec(
+      'element_bag', TupleCoder([IntervalWindowCoder(), PickleCoder()]))
+  WINDOW_TAG_VALUES = BagStateSpec(
+      'per_window_per_tag_value_state',
+      TupleCoder([IntervalWindowCoder(), StrUtf8Coder(), VarIntCoder()]))
+
+  PROCESSING_TIME_TIMER = TimerSpec(
+      'processing_time_timer', TimeDomain.REAL_TIME)
+  WATERMARK_TIMER = TimerSpec('watermark_timer', TimeDomain.WATERMARK)
+
+  def __init__(self, windowing: Windowing):
+    self.windowing = windowing
+    # Only session windows are merging. Other windows are non-merging.
+    self.merging_windows = isinstance(windowing.windowfn, Sessions)
+
+  def process(
+      self,
+      element: typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]],
+      element_bag: BagRuntimeState = DoFn.StateParam(WINDOW_ELEMENT_PAIRS),  # type: ignore
+      time_state: AccumulatingRuntimeState = DoFn.StateParam(LAST_KNOWN_TIME),  # type: ignore
+      watermark_state: AccumulatingRuntimeState = DoFn.StateParam(  # type: ignore
+          LAST_KNOWN_WATERMARK),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      windows_state: SetRuntimeState = DoFn.StateParam(KNOWN_WINDOWS),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER),
+      *args, **kwargs):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        current_time_state=time_state,
+        watermark_state=watermark_state,
+        elements_bag_state=element_bag,
+        window_tag_values_bag_state=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    key, windowed_values = element
+    watermark = read_watermark(watermark_state)
+
+    windows_to_elements = collections.defaultdict(list)
+    for wv in windowed_values:
+      for window in wv.windows:
+        # ignore expired windows
+        if watermark > window.end + self.windowing.allowed_lateness:
+          continue
+        if window in finished_windows_state.read():
+          continue
+        windows_to_elements[window].append(
+            TimestampedValue(wv.value, wv.timestamp))
+
+    # Processing merging of windows
+    if self.merging_windows:
+      old_windows = set(windows_state.read())
+      all_windows = old_windows.union(list(windows_to_elements))
+      if all_windows != old_windows:
+        merge_context = TriggerMergeContext(
+            all_windows, context, self.windowing)
+        self.windowing.windowfn.merge(merge_context)
+
+        merged_windows_to_elements = collections.defaultdict(list)
+        for window, values in windows_to_elements.items():
+          while window in merge_context.merged_away:
+            window = merge_context.merged_away[window]
+          merged_windows_to_elements[window].extend(values)
+        windows_to_elements = merged_windows_to_elements
+
+      if old_windows != all_windows:
+        for w in windows_to_elements:
+          windows_state.add(w)
+    # Done processing merging of windows
+
+    seen_windows = set()
+    for w in windows_to_elements:
+      window_context = context.for_window(w)
+      seen_windows.add(w)
+      for value_w_timestamp in windows_to_elements[w]:
+        _LOGGER.debug(value_w_timestamp)
+        element_bag.add((w, value_w_timestamp))
+        self.windowing.triggerfn.on_element(windowed_values, w, window_context)
+
+    return self._trigger_fire(
+        key, TimeDomain.WATERMARK, watermark, None, context, seen_windows)
+
+  def _trigger_fire(
+      self,
+      key: K,
+      time_domain,
+      timestamp: Timestamp,
+      timer_tag: typing.Optional[str],
+      context: 'FnRunnerStatefulTriggerContext',
+      windows_of_interest: typing.Optional[typing.Set[BoundedWindow]] = None):
+    windows_to_elements = context.windows_to_elements_map()
+    context.elements_bag_state.clear()
+
+    fired_windows = set()
+    finished_windows = set()
+    _LOGGER.debug(
+        '%s - tag %s - timestamp %s', time_domain, timer_tag, timestamp)
+    for w, elems in windows_to_elements.items():
+      if windows_of_interest is not None and w not in windows_of_interest:
+        # windows_of_interest=None means that we care about all windows.
+        # If we care only about some windows, and this window is not one of
+        # them, then we do not intend to fire this window.
+        continue
+      window_context = context.for_window(w)
+      if self.windowing.triggerfn.should_fire(time_domain,
+                                              timestamp,
+                                              w,
+                                              window_context):
+        finished = self.windowing.triggerfn.on_fire(
+            timestamp, w, window_context)
+        _LOGGER.debug('Firing on window %s. Finished: %s', w, finished)
+        fired_windows.add(w)
+        if finished:
+          context.finished_windows_state.add(w)
+          finished_windows.add(w)
+        # TODO(pabloem): Format the output: e.g. pane info
+        elems = [WindowedValue(e.value, e.timestamp, (w, )) for e in elems]
+        yield (key, elems)
+
+    # Add elements that were not fired back into state.
+    for w, elems in windows_to_elements.items():
+      for e in elems:
+        if (w in finished_windows or
+            (w in fired_windows and
+             self.windowing.accumulation_mode == AccumulationMode.DISCARDING)):
+          continue
+        context.elements_bag_state.add((w, e))
+
+  @on_timer(PROCESSING_TIME_TIMER)

Review comment:
       that's correct, for now.
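
   [Editor's note] The exchange above concerns the `isinstance(windowing.windowfn, Sessions)` check quoted earlier. As a minimal, illustrative sketch (not part of this PR), the same decision could be expressed through `WindowFn.is_merging()`, which would also cover custom merging WindowFns; `uses_merging_windows` below is a hypothetical helper name.

   ```python
   from apache_beam.transforms.core import Windowing
   from apache_beam.transforms.window import FixedWindows
   from apache_beam.transforms.window import Sessions


   def uses_merging_windows(windowing: Windowing) -> bool:
     # Sessions (and any custom merging WindowFn) report is_merging() == True;
     # NonMergingWindowFn subclasses such as FixedWindows report False.
     return windowing.windowfn.is_merging()


   print(uses_merging_windows(Windowing(Sessions(10))))      # True
   print(uses_merging_windows(Windowing(FixedWindows(60))))  # False
   ```
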

##########
File path: sdks/python/apache_beam/runners/portability/fn_api_runner/trigger_manager.py
##########
@@ -0,0 +1,459 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import collections
+import logging
+import typing
+from collections import defaultdict
+
+from apache_beam import typehints
+from apache_beam.coders import PickleCoder
+from apache_beam.coders import StrUtf8Coder
+from apache_beam.coders import TupleCoder
+from apache_beam.coders import VarIntCoder
+from apache_beam.coders.coders import IntervalWindowCoder
+from apache_beam.transforms import DoFn
+from apache_beam.transforms.core import Windowing
+from apache_beam.transforms.timeutil import TimeDomain
+from apache_beam.transforms.trigger import AccumulationMode
+from apache_beam.transforms.trigger import TriggerContext
+from apache_beam.transforms.trigger import _CombiningValueStateTag
+from apache_beam.transforms.trigger import _StateTag
+from apache_beam.transforms.userstate import AccumulatingRuntimeState
+from apache_beam.transforms.userstate import BagRuntimeState
+from apache_beam.transforms.userstate import BagStateSpec
+from apache_beam.transforms.userstate import CombiningValueStateSpec
+from apache_beam.transforms.userstate import RuntimeTimer
+from apache_beam.transforms.userstate import SetRuntimeState
+from apache_beam.transforms.userstate import SetStateSpec
+from apache_beam.transforms.userstate import TimerSpec
+from apache_beam.transforms.userstate import on_timer
+from apache_beam.transforms.window import BoundedWindow
+from apache_beam.transforms.window import GlobalWindow
+from apache_beam.transforms.window import Sessions
+from apache_beam.transforms.window import TimestampedValue
+from apache_beam.transforms.window import WindowFn
+from apache_beam.typehints import TypeCheckError
+from apache_beam.utils import windowed_value
+from apache_beam.utils.timestamp import MIN_TIMESTAMP
+from apache_beam.utils.timestamp import Timestamp
+from apache_beam.utils.windowed_value import WindowedValue
+
+_LOGGER = logging.getLogger(__name__)
+_LOGGER.setLevel(logging.DEBUG)
+
+K = typing.TypeVar('K')
+V = typing.TypeVar('V')
+
+
+class ReifyWindows(DoFn):
+  """Receives KV pairs, and wraps the values into WindowedValues."""
+  def process(
+      self, element, window=DoFn.WindowParam, timestamp=DoFn.TimestampParam):
+    try:
+      k, v = element
+    except TypeError:
+      raise TypeCheckError(
+          'Input to GroupByKey must be a PCollection with '
+          'elements compatible with KV[A, B]')
+
+    yield (k, windowed_value.WindowedValue(v, timestamp, [window]))
+
+
+class _GroupBundlesByKey(DoFn):
+  def start_bundle(self):
+    self.keys = defaultdict(list)
+
+  def process(self, element):
+    key, windowed_value = element
+    self.keys[key].append(windowed_value)
+
+  def finish_bundle(self):
+    for k, vals in self.keys.items():
+      yield windowed_value.WindowedValue((k, vals),
+                                         MIN_TIMESTAMP, [GlobalWindow()])
+
+
+def read_watermark(watermark_state):
+  try:
+    return watermark_state.read()
+  except ValueError:
+    watermark_state.add(MIN_TIMESTAMP)
+    return watermark_state.read()
+
+
+class TriggerMergeContext(WindowFn.MergeContext):
+  def __init__(
+      self, all_windows, context: 'FnRunnerStatefulTriggerContext', windowing):
+    super(TriggerMergeContext, self).__init__(all_windows)
+    self.trigger_context = context
+    self.windowing = windowing
+    self.merged_away: typing.Dict[BoundedWindow, BoundedWindow] = {}
+
+  def merge(self, to_be_merged, merge_result):
+    _LOGGER.debug("Merging %s into %s", to_be_merged, merge_result)
+    self.trigger_context.merge_state(to_be_merged, merge_result)
+    for window in to_be_merged:
+      if window != merge_result:
+        self.merged_away[window] = merge_result
+        # Clear state associated with PaneInfo since it is
+        # not preserved across merges.
+        self.trigger_context.for_window(window).clear_state(None)
+    self.windowing.triggerfn.on_merge(
+        to_be_merged,
+        merge_result,
+        self.trigger_context.for_window(merge_result))
+
+
+@typehints.with_input_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+@typehints.with_output_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+class GeneralTriggerManagerDoFn(DoFn):
+  """A trigger manager that supports all windowing / triggering cases.
+
+  This implements a DoFn that manages triggering on a per-key basis. All
+  elements for a single key are processed together. Per-key state holds data
+  related to all windows.
+  """
+
+  KNOWN_WINDOWS = SetStateSpec('known_windows', IntervalWindowCoder())
+  FINISHED_WINDOWS = SetStateSpec('finished_windows', IntervalWindowCoder())
+  LAST_KNOWN_TIME = CombiningValueStateSpec('last_known_time', combine_fn=max)
+  LAST_KNOWN_WATERMARK = CombiningValueStateSpec(
+      'last_known_watermark', combine_fn=max)
+
+  # TODO(pabloem) What's the coder for the elements/keys here?
+  WINDOW_ELEMENT_PAIRS = BagStateSpec(
+      'element_bag', TupleCoder([IntervalWindowCoder(), PickleCoder()]))
+  WINDOW_TAG_VALUES = BagStateSpec(
+      'per_window_per_tag_value_state',
+      TupleCoder([IntervalWindowCoder(), StrUtf8Coder(), VarIntCoder()]))
+
+  PROCESSING_TIME_TIMER = TimerSpec(
+      'processing_time_timer', TimeDomain.REAL_TIME)
+  WATERMARK_TIMER = TimerSpec('watermark_timer', TimeDomain.WATERMARK)
+
+  def __init__(self, windowing: Windowing):
+    self.windowing = windowing
+    # Only Sessions windows are handled as merging here; all other built-in
+    # windows are treated as non-merging.
+    self.merging_windows = isinstance(windowing.windowfn, Sessions)
+
+  def process(
+      self,
+      element: typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]],
+      element_bag: BagRuntimeState = DoFn.StateParam(WINDOW_ELEMENT_PAIRS),  # type: ignore
+      time_state: AccumulatingRuntimeState = DoFn.StateParam(LAST_KNOWN_TIME),  # type: ignore
+      watermark_state: AccumulatingRuntimeState = DoFn.StateParam(  # type: ignore
+          LAST_KNOWN_WATERMARK),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      windows_state: SetRuntimeState = DoFn.StateParam(KNOWN_WINDOWS),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER),
+      *args, **kwargs):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        current_time_state=time_state,
+        watermark_state=watermark_state,
+        elements_bag_state=element_bag,
+        window_tag_values_bag_state=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    key, windowed_values = element
+    watermark = read_watermark(watermark_state)
+
+    windows_to_elements = collections.defaultdict(list)
+    for wv in windowed_values:
+      for window in wv.windows:
+        # ignore expired windows
+        if watermark > window.end + self.windowing.allowed_lateness:
+          continue
+        if window in finished_windows_state.read():
+          continue
+        windows_to_elements[window].append(
+            TimestampedValue(wv.value, wv.timestamp))
+
+    # Processing merging of windows
+    if self.merging_windows:
+      old_windows = set(windows_state.read())
+      all_windows = old_windows.union(list(windows_to_elements))
+      if all_windows != old_windows:
+        merge_context = TriggerMergeContext(
+            all_windows, context, self.windowing)
+        self.windowing.windowfn.merge(merge_context)
+
+        merged_windows_to_elements = collections.defaultdict(list)
+        for window, values in windows_to_elements.items():
+          while window in merge_context.merged_away:
+            window = merge_context.merged_away[window]
+          merged_windows_to_elements[window].extend(values)
+        windows_to_elements = merged_windows_to_elements
+
+      if old_windows != all_windows:
+        for w in windows_to_elements:
+          windows_state.add(w)
+    # Done processing merging of windows
+
+    seen_windows = set()
+    for w in windows_to_elements:
+      window_context = context.for_window(w)
+      seen_windows.add(w)
+      for value_w_timestamp in windows_to_elements[w]:
+        _LOGGER.debug(value_w_timestamp)
+        element_bag.add((w, value_w_timestamp))
+        self.windowing.triggerfn.on_element(windowed_values, w, window_context)
+
+    return self._trigger_fire(
+        key, TimeDomain.WATERMARK, watermark, None, context, seen_windows)
+
+  def _trigger_fire(
+      self,
+      key: K,
+      time_domain,
+      timestamp: Timestamp,
+      timer_tag: typing.Optional[str],
+      context: 'FnRunnerStatefulTriggerContext',
+      windows_of_interest: typing.Optional[typing.Set[BoundedWindow]] = None):
+    windows_to_elements = context.windows_to_elements_map()
+    context.elements_bag_state.clear()
+
+    fired_windows = set()
+    finished_windows = set()
+    _LOGGER.debug(
+        '%s - tag %s - timestamp %s', time_domain, timer_tag, timestamp)
+    for w, elems in windows_to_elements.items():
+      if windows_of_interest is not None and w not in windows_of_interest:
+        # windows_of_interest=None means that we care about all windows.
+        # If we care only about some windows, and this window is not one of
+        # them, then we do not intend to fire this window.
+        continue
+      window_context = context.for_window(w)
+      if self.windowing.triggerfn.should_fire(time_domain,
+                                              timestamp,
+                                              w,
+                                              window_context):
+        finished = self.windowing.triggerfn.on_fire(
+            timestamp, w, window_context)
+        _LOGGER.debug('Firing on window %s. Finished: %s', w, finished)
+        fired_windows.add(w)
+        if finished:
+          context.finished_windows_state.add(w)
+          finished_windows.add(w)

Review comment:
       done, and reading afterwards.
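
   [Editor's note] For readers skimming the quoted diff: the DoFns above are meant to be chained so that values are reified with their windows, rewindowed into the global window, bundled per key, and then handed to the stateful trigger manager. The sketch below is one plausible wiring for illustration only; the helper name `apply_trigger_manager` is hypothetical and the PR's own composite transform may wire things differently.

   ```python
   import apache_beam as beam
   from apache_beam.runners.portability.fn_api_runner.trigger_manager import (
       GeneralTriggerManagerDoFn, ReifyWindows, _GroupBundlesByKey)
   from apache_beam.transforms import window


   def apply_trigger_manager(pcoll, windowing):
     """Hypothetical wiring sketch; not the transform added by this PR."""
     return (
         pcoll
         | 'ReifyWindows' >> beam.ParDo(ReifyWindows())
         | 'GlobalWindows' >> beam.WindowInto(window.GlobalWindows())
         | 'GroupBundlesByKey' >> beam.ParDo(_GroupBundlesByKey())
         | 'TriggerManager' >> beam.ParDo(GeneralTriggerManagerDoFn(windowing)))
   ```
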

##########
File path: sdks/python/apache_beam/runners/portability/fn_api_runner/trigger_manager.py
##########
@@ -0,0 +1,459 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import collections
+import logging
+import typing
+from collections import defaultdict
+
+from apache_beam import typehints
+from apache_beam.coders import PickleCoder
+from apache_beam.coders import StrUtf8Coder
+from apache_beam.coders import TupleCoder
+from apache_beam.coders import VarIntCoder
+from apache_beam.coders.coders import IntervalWindowCoder
+from apache_beam.transforms import DoFn
+from apache_beam.transforms.core import Windowing
+from apache_beam.transforms.timeutil import TimeDomain
+from apache_beam.transforms.trigger import AccumulationMode
+from apache_beam.transforms.trigger import TriggerContext
+from apache_beam.transforms.trigger import _CombiningValueStateTag
+from apache_beam.transforms.trigger import _StateTag
+from apache_beam.transforms.userstate import AccumulatingRuntimeState
+from apache_beam.transforms.userstate import BagRuntimeState
+from apache_beam.transforms.userstate import BagStateSpec
+from apache_beam.transforms.userstate import CombiningValueStateSpec
+from apache_beam.transforms.userstate import RuntimeTimer
+from apache_beam.transforms.userstate import SetRuntimeState
+from apache_beam.transforms.userstate import SetStateSpec
+from apache_beam.transforms.userstate import TimerSpec
+from apache_beam.transforms.userstate import on_timer
+from apache_beam.transforms.window import BoundedWindow
+from apache_beam.transforms.window import GlobalWindow
+from apache_beam.transforms.window import Sessions
+from apache_beam.transforms.window import TimestampedValue
+from apache_beam.transforms.window import WindowFn
+from apache_beam.typehints import TypeCheckError
+from apache_beam.utils import windowed_value
+from apache_beam.utils.timestamp import MIN_TIMESTAMP
+from apache_beam.utils.timestamp import Timestamp
+from apache_beam.utils.windowed_value import WindowedValue
+
+_LOGGER = logging.getLogger(__name__)
+_LOGGER.setLevel(logging.DEBUG)
+
+K = typing.TypeVar('K')
+V = typing.TypeVar('V')
+
+
+class ReifyWindows(DoFn):
+  """Receives KV pairs, and wraps the values into WindowedValues."""
+  def process(
+      self, element, window=DoFn.WindowParam, timestamp=DoFn.TimestampParam):
+    try:
+      k, v = element
+    except TypeError:
+      raise TypeCheckError(
+          'Input to GroupByKey must be a PCollection with '
+          'elements compatible with KV[A, B]')
+
+    yield (k, windowed_value.WindowedValue(v, timestamp, [window]))
+
+
+class _GroupBundlesByKey(DoFn):
+  def start_bundle(self):
+    self.keys = defaultdict(list)
+
+  def process(self, element):
+    key, windowed_value = element
+    self.keys[key].append(windowed_value)
+
+  def finish_bundle(self):
+    for k, vals in self.keys.items():
+      yield windowed_value.WindowedValue((k, vals),
+                                         MIN_TIMESTAMP, [GlobalWindow()])
+
+
+def read_watermark(watermark_state):
+  try:
+    return watermark_state.read()
+  except ValueError:
+    watermark_state.add(MIN_TIMESTAMP)
+    return watermark_state.read()
+
+
+class TriggerMergeContext(WindowFn.MergeContext):
+  def __init__(
+      self, all_windows, context: 'FnRunnerStatefulTriggerContext', windowing):
+    super(TriggerMergeContext, self).__init__(all_windows)
+    self.trigger_context = context
+    self.windowing = windowing
+    self.merged_away: typing.Dict[BoundedWindow, BoundedWindow] = {}
+
+  def merge(self, to_be_merged, merge_result):
+    _LOGGER.debug("Merging %s into %s", to_be_merged, merge_result)
+    self.trigger_context.merge_state(to_be_merged, merge_result)
+    for window in to_be_merged:
+      if window != merge_result:
+        self.merged_away[window] = merge_result
+        # Clear state associated with PaneInfo since it is
+        # not preserved across merges.
+        self.trigger_context.for_window(window).clear_state(None)
+    self.windowing.triggerfn.on_merge(
+        to_be_merged,
+        merge_result,
+        self.trigger_context.for_window(merge_result))
+
+
+@typehints.with_input_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+@typehints.with_output_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+class GeneralTriggerManagerDoFn(DoFn):
+  """A trigger manager that supports all windowing / triggering cases.
+
+  This implements a DoFn that manages triggering on a per-key basis. All
+  elements for a single key are processed together. Per-key state holds data
+  related to all windows.
+  """
+
+  KNOWN_WINDOWS = SetStateSpec('known_windows', IntervalWindowCoder())
+  FINISHED_WINDOWS = SetStateSpec('finished_windows', IntervalWindowCoder())
+  LAST_KNOWN_TIME = CombiningValueStateSpec('last_known_time', combine_fn=max)
+  LAST_KNOWN_WATERMARK = CombiningValueStateSpec(
+      'last_known_watermark', combine_fn=max)
+
+  # TODO(pabloem) What's the coder for the elements/keys here?
+  WINDOW_ELEMENT_PAIRS = BagStateSpec(
+      'element_bag', TupleCoder([IntervalWindowCoder(), PickleCoder()]))
+  WINDOW_TAG_VALUES = BagStateSpec(
+      'per_window_per_tag_value_state',
+      TupleCoder([IntervalWindowCoder(), StrUtf8Coder(), VarIntCoder()]))
+
+  PROCESSING_TIME_TIMER = TimerSpec(
+      'processing_time_timer', TimeDomain.REAL_TIME)
+  WATERMARK_TIMER = TimerSpec('watermark_timer', TimeDomain.WATERMARK)
+
+  def __init__(self, windowing: Windowing):
+    self.windowing = windowing
+    # Only Sessions windows are handled as merging here; all other built-in
+    # windows are treated as non-merging.
+    self.merging_windows = isinstance(windowing.windowfn, Sessions)
+
+  def process(
+      self,
+      element: typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]],
+      element_bag: BagRuntimeState = DoFn.StateParam(WINDOW_ELEMENT_PAIRS),  # type: ignore
+      time_state: AccumulatingRuntimeState = DoFn.StateParam(LAST_KNOWN_TIME),  # type: ignore
+      watermark_state: AccumulatingRuntimeState = DoFn.StateParam(  # type: ignore
+          LAST_KNOWN_WATERMARK),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      windows_state: SetRuntimeState = DoFn.StateParam(KNOWN_WINDOWS),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER),
+      *args, **kwargs):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        current_time_state=time_state,
+        watermark_state=watermark_state,
+        elements_bag_state=element_bag,
+        window_tag_values_bag_state=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    key, windowed_values = element
+    watermark = read_watermark(watermark_state)
+
+    windows_to_elements = collections.defaultdict(list)
+    for wv in windowed_values:
+      for window in wv.windows:
+        # ignore expired windows
+        if watermark > window.end + self.windowing.allowed_lateness:
+          continue
+        if window in finished_windows_state.read():
+          continue
+        windows_to_elements[window].append(
+            TimestampedValue(wv.value, wv.timestamp))
+
+    # Processing merging of windows
+    if self.merging_windows:
+      old_windows = set(windows_state.read())
+      all_windows = old_windows.union(list(windows_to_elements))
+      if all_windows != old_windows:
+        merge_context = TriggerMergeContext(
+            all_windows, context, self.windowing)
+        self.windowing.windowfn.merge(merge_context)
+
+        merged_windows_to_elements = collections.defaultdict(list)
+        for window, values in windows_to_elements.items():
+          while window in merge_context.merged_away:
+            window = merge_context.merged_away[window]
+          merged_windows_to_elements[window].extend(values)
+        windows_to_elements = merged_windows_to_elements
+
+      if old_windows != all_windows:
+        for w in windows_to_elements:
+          windows_state.add(w)
+    # Done processing merging of windows
+
+    seen_windows = set()
+    for w in windows_to_elements:
+      window_context = context.for_window(w)
+      seen_windows.add(w)
+      for value_w_timestamp in windows_to_elements[w]:
+        _LOGGER.debug(value_w_timestamp)
+        element_bag.add((w, value_w_timestamp))
+        self.windowing.triggerfn.on_element(windowed_values, w, window_context)
+
+    return self._trigger_fire(
+        key, TimeDomain.WATERMARK, watermark, None, context, seen_windows)
+
+  def _trigger_fire(
+      self,
+      key: K,
+      time_domain,
+      timestamp: Timestamp,
+      timer_tag: typing.Optional[str],
+      context: 'FnRunnerStatefulTriggerContext',
+      windows_of_interest: typing.Optional[typing.Set[BoundedWindow]] = None):
+    windows_to_elements = context.windows_to_elements_map()
+    context.elements_bag_state.clear()
+
+    fired_windows = set()
+    finished_windows = set()
+    _LOGGER.debug(
+        '%s - tag %s - timestamp %s', time_domain, timer_tag, timestamp)
+    for w, elems in windows_to_elements.items():
+      if windows_of_interest is not None and w not in windows_of_interest:
+        # windows_of_interest=None means that we care about all windows.
+        # If we care only about some windows, and this window is not one of
+        # them, then we do not intend to fire this window.
+        continue
+      window_context = context.for_window(w)
+      if self.windowing.triggerfn.should_fire(time_domain,
+                                              timestamp,
+                                              w,
+                                              window_context):
+        finished = self.windowing.triggerfn.on_fire(
+            timestamp, w, window_context)
+        _LOGGER.debug('Firing on window %s. Finished: %s', w, finished)
+        fired_windows.add(w)
+        if finished:
+          context.finished_windows_state.add(w)
+          finished_windows.add(w)
+        # TODO(pabloem): Format the output: e.g. pane info
+        elems = [WindowedValue(e.value, e.timestamp, (w, )) for e in elems]
+        yield (key, elems)
+
+    # Add elements that were not fired back into state.
+    for w, elems in windows_to_elements.items():
+      for e in elems:
+        if (w in finished_windows or
+            (w in fired_windows and
+             self.windowing.accumulation_mode == AccumulationMode.DISCARDING)):
+          continue
+        context.elements_bag_state.add((w, e))
+
+  @on_timer(PROCESSING_TIME_TIMER)
+  def processing_time_trigger(
+      self,
+      key=DoFn.KeyParam,
+      timer_tag=DoFn.DynamicTimerTagParam,
+      timestamp=DoFn.TimestampParam,
+      processing_time_state=DoFn.StateParam(LAST_KNOWN_TIME),
+      element_bag=DoFn.StateParam(WINDOW_ELEMENT_PAIRS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER)):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        current_time_state=processing_time_state,
+        watermark_state=None,
+        elements_bag_state=element_bag,
+        window_tag_values_bag_state=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    result = self._trigger_fire(
+        key, TimeDomain.REAL_TIME, timestamp, timer_tag, context)
+    processing_time_state.add(timestamp)
+    return result
+
+  @on_timer(WATERMARK_TIMER)
+  def watermark_trigger(
+      self,
+      key=DoFn.KeyParam,
+      timer_tag=DoFn.DynamicTimerTagParam,
+      timestamp=DoFn.TimestampParam,
+      watermark_state=DoFn.StateParam(LAST_KNOWN_WATERMARK),
+      element_bag=DoFn.StateParam(WINDOW_ELEMENT_PAIRS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER)):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        current_time_state=None,
+        watermark_state=watermark_state,
+        elements_bag_state=element_bag,
+        window_tag_values_bag_state=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    result = self._trigger_fire(
+        key, TimeDomain.WATERMARK, timestamp, timer_tag, context)
+    watermark_state.add(timestamp)
+    return result
+
+
+class FnRunnerStatefulTriggerContext(TriggerContext):
+  def __init__(
+      self,
+      processing_time_timer: RuntimeTimer,
+      watermark_timer: RuntimeTimer,
+      current_time_state: typing.Optional[AccumulatingRuntimeState],
+      watermark_state: typing.Optional[AccumulatingRuntimeState],
+      elements_bag_state: BagRuntimeState,
+      window_tag_values_bag_state: BagRuntimeState,
+      finished_windows_state: SetRuntimeState):
+    self.timers = {
+        TimeDomain.REAL_TIME: processing_time_timer,
+        TimeDomain.WATERMARK: watermark_timer
+    }
+    self.current_times = {
+        TimeDomain.REAL_TIME: current_time_state,
+        TimeDomain.WATERMARK: watermark_state
+    }
+    self.elements_bag_state = elements_bag_state
+    self.window_tag_values_bag_state = window_tag_values_bag_state
+    self.finished_windows_state = finished_windows_state
+
+  def windows_to_elements_map(
+      self
+  ) -> typing.Dict[BoundedWindow, typing.List[windowed_value.WindowedValue]]:
+    window_element_pairs: typing.Iterable[typing.Tuple[
+        BoundedWindow,
+        windowed_value.WindowedValue]] = self.elements_bag_state.read()
+    result: typing.Dict[BoundedWindow,
+                        typing.List[windowed_value.WindowedValue]] = {}
+    for w, e in window_element_pairs:
+      if w not in result:
+        result[w] = []
+      result[w].append(e)
+    return result
+
+  def for_window(self, window):
+    return PerWindowTriggerContext(window, self)
+
+  def get_current_time(self):
+    return self.current_times[TimeDomain.REAL_TIME].read()
+
+  def set_timer(self, name, time_domain, timestamp):
+    _LOGGER.debug('Setting timer (%s, %s) at %s', time_domain, name, timestamp)
+    self.timers[time_domain].set(timestamp, dynamic_timer_tag=name)
+
+  def clear_timer(self, name, time_domain):
+    _LOGGER.debug('Clearing timer (%s, %s)', time_domain, name)
+    self.timers[time_domain].clear(dynamic_timer_tag=name)
+
+  def merge_state(self, to_be_merged, merge_result):
+    all_triplets = self.window_tag_values_bag_state.read()
+    # Collect all the triplets for the windows that are being merged away,
+    # and tag them with the new window (merge_result).
+    merging_away_triplets = [(merge_result, t[1], t[2]) for t in all_triplets
+                             if t[0] in to_be_merged]
+
+    # Collect all of the other triplets, and join them with the newly-tagged
+    # set of triplets.
+    resulting_triplets = [
+        t for t in all_triplets if t[0] not in to_be_merged

Review comment:
       it is not. fixed that issue. Thanks!
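
   [Editor's note] The comment thread above is about the (window, tag, value) triplet handling in `merge_state`. As a standalone illustration of the relabeling that step performs (plain tuples instead of Beam runtime state; `relabel_triplets` is a hypothetical name, not part of the PR): triplets tagged with a merged-away window are re-tagged with the merge result, everything else is kept unchanged.

   ```python
   from typing import Iterable, List, Tuple

   # (window, tag, value) triplets; windows are plain strings for illustration.
   Triplet = Tuple[str, str, int]


   def relabel_triplets(
       all_triplets: Iterable[Triplet],
       to_be_merged: Iterable[str],
       merge_result: str) -> List[Triplet]:
     all_triplets = list(all_triplets)
     merged_away = set(to_be_merged)
     # Triplets tagged with a merged-away window are re-tagged with the result.
     retagged = [(merge_result, tag, value) for (w, tag, value) in all_triplets
                 if w in merged_away]
     # Everything else is kept as-is.
     untouched = [t for t in all_triplets if t[0] not in merged_away]
     return untouched + retagged


   print(relabel_triplets(
       [('w1', 'count', 2), ('w2', 'count', 3), ('w9', 'count', 7)],
       to_be_merged=['w1', 'w2'],
       merge_result='w1_w2'))
   # -> [('w9', 'count', 7), ('w1_w2', 'count', 2), ('w1_w2', 'count', 3)]
   ```
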




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (b874eed) into [master](https://codecov.io/gh/apache/beam/commit/4c81134dd9206fd6cb9060a9b973fc111b8fbd64?el=desc) (4c81134) will **increase** coverage by `0.22%`.
   > The diff coverage is `90.28%`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   82.77%   83.00%   +0.22%     
   ==========================================
     Files         466      470       +4     
     Lines       57562    58463     +901     
   ==========================================
   + Hits        47647    48526     +879     
   - Misses       9915     9937      +22     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [sdks/python/apache\_beam/transforms/trigger.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdHJhbnNmb3Jtcy90cmlnZ2VyLnB5) | `88.34% <75.00%> (-0.02%)` | :arrow_down: |
   | [...nners/portability/fn\_api\_runner/trigger\_manager.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9mbl9hcGlfcnVubmVyL3RyaWdnZXJfbWFuYWdlci5weQ==) | `90.58% <90.58%> (ø)` | |
   | [...dks/python/apache\_beam/options/pipeline\_options.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vb3B0aW9ucy9waXBlbGluZV9vcHRpb25zLnB5) | `94.60% <100.00%> (ø)` | |
   | [sdks/python/apache\_beam/utils/interactive\_utils.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdXRpbHMvaW50ZXJhY3RpdmVfdXRpbHMucHk=) | `88.09% <0.00%> (-7.15%)` | :arrow_down: |
   | [...\_beam/runners/portability/sdk\_container\_builder.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9zZGtfY29udGFpbmVyX2J1aWxkZXIucHk=) | `40.00% <0.00%> (-3.28%)` | :arrow_down: |
   | [sdks/python/apache\_beam/dataframe/frames.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZGF0YWZyYW1lL2ZyYW1lcy5weQ==) | `90.80% <0.00%> (-1.18%)` | :arrow_down: |
   | [sdks/python/apache\_beam/internal/metrics/metric.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW50ZXJuYWwvbWV0cmljcy9tZXRyaWMucHk=) | `86.45% <0.00%> (-1.05%)` | :arrow_down: |
   | [...ache\_beam/runners/portability/local\_job\_service.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9sb2NhbF9qb2Jfc2VydmljZS5weQ==) | `80.53% <0.00%> (-0.64%)` | :arrow_down: |
   | [...nners/portability/fn\_api\_runner/worker\_handlers.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9mbl9hcGlfcnVubmVyL3dvcmtlcl9oYW5kbGVycy5weQ==) | `79.61% <0.00%> (-0.39%)` | :arrow_down: |
   | [...ks/python/apache\_beam/runners/worker/sdk\_worker.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy93b3JrZXIvc2RrX3dvcmtlci5weQ==) | `89.69% <0.00%> (-0.16%)` | :arrow_down: |
   | ... and [37 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [7c4a21a...b874eed](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] pabloem commented on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
pabloem commented on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-800471557


   @y1chi would you be able to take a look? : )


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (568db95) into [master](https://codecov.io/gh/apache/beam/commit/4c81134dd9206fd6cb9060a9b973fc111b8fbd64?el=desc) (4c81134) will **increase** coverage by `0.25%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   82.77%   83.03%   +0.25%     
   ==========================================
     Files         466      470       +4     
     Lines       57562    58534     +972     
   ==========================================
   + Hits        47647    48603     +956     
   - Misses       9915     9931      +16     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [sdks/python/apache\_beam/io/gcp/gcsio.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZ2NwL2djc2lvLnB5) | | |
   | [...apache\_beam/runners/dataflow/internal/apiclient.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9kYXRhZmxvdy9pbnRlcm5hbC9hcGljbGllbnQucHk=) | | |
   | [sdks/python/apache\_beam/io/external/gcp/pubsub.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZXh0ZXJuYWwvZ2NwL3B1YnN1Yi5weQ==) | | |
   | [sdks/python/apache\_beam/io/external/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZXh0ZXJuYWwvX19pbml0X18ucHk=) | | |
   | [sdks/python/apache\_beam/examples/avro\_bitcoin.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvYXZyb19iaXRjb2luLnB5) | | |
   | [...eam/runners/interactive/options/capture\_control.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9pbnRlcmFjdGl2ZS9vcHRpb25zL2NhcHR1cmVfY29udHJvbC5weQ==) | | |
   | [...am/examples/snippets/transforms/aggregation/max.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvc25pcHBldHMvdHJhbnNmb3Jtcy9hZ2dyZWdhdGlvbi9tYXgucHk=) | | |
   | [...ks/python/apache\_beam/runners/worker/opcounters.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy93b3JrZXIvb3Bjb3VudGVycy5weQ==) | | |
   | [sdks/python/apache\_beam/dataframe/io.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZGF0YWZyYW1lL2lvLnB5) | | |
   | [...che\_beam/portability/api/beam\_provision\_api\_pb2.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcG9ydGFiaWxpdHkvYXBpL2JlYW1fcHJvdmlzaW9uX2FwaV9wYjIucHk=) | | |
   | ... and [923 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [7c4a21a...568db95](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (568db95) into [master](https://codecov.io/gh/apache/beam/commit/4c81134dd9206fd6cb9060a9b973fc111b8fbd64?el=desc) (4c81134) will **increase** coverage by `0.25%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   82.77%   83.03%   +0.25%     
   ==========================================
     Files         466      470       +4     
     Lines       57562    58534     +972     
   ==========================================
   + Hits        47647    48603     +956     
   - Misses       9915     9931      +16     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [sdks/python/apache\_beam/io/localfilesystem.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vbG9jYWxmaWxlc3lzdGVtLnB5) | | |
   | [sdks/python/apache\_beam/transforms/ptransform.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdHJhbnNmb3Jtcy9wdHJhbnNmb3JtLnB5) | | |
   | [...amples/snippets/transforms/aggregation/distinct.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvc25pcHBldHMvdHJhbnNmb3Jtcy9hZ2dyZWdhdGlvbi9kaXN0aW5jdC5weQ==) | | |
   | [...beam/testing/benchmarks/nexmark/queries/query11.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9iZW5jaG1hcmtzL25leG1hcmsvcXVlcmllcy9xdWVyeTExLnB5) | | |
   | [...ython/apache\_beam/portability/api/endpoints\_pb2.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcG9ydGFiaWxpdHkvYXBpL2VuZHBvaW50c19wYjIucHk=) | | |
   | [sdks/python/apache\_beam/io/iobase.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vaW9iYXNlLnB5) | | |
   | [...eam/portability/api/beam\_provision\_api\_pb2\_grpc.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcG9ydGFiaWxpdHkvYXBpL2JlYW1fcHJvdmlzaW9uX2FwaV9wYjJfZ3JwYy5weQ==) | | |
   | [sdks/python/apache\_beam/utils/retry.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdXRpbHMvcmV0cnkucHk=) | | |
   | [...dks/python/apache\_beam/transforms/external\_java.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdHJhbnNmb3Jtcy9leHRlcm5hbF9qYXZhLnB5) | | |
   | [...on/apache\_beam/runners/worker/statesampler\_slow.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy93b3JrZXIvc3RhdGVzYW1wbGVyX3Nsb3cucHk=) | | |
   | ... and [923 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [7c4a21a...964d13a](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (2452188) into [master](https://codecov.io/gh/apache/beam/commit/4c81134dd9206fd6cb9060a9b973fc111b8fbd64?el=desc) (4c81134) will **increase** coverage by `0.28%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   82.77%   83.06%   +0.28%     
   ==========================================
     Files         466      470       +4     
     Lines       57562    58590    +1028     
   ==========================================
   + Hits        47647    48666    +1019     
   - Misses       9915     9924       +9     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [sdks/python/apache\_beam/metrics/cells.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vbWV0cmljcy9jZWxscy5weQ==) | | |
   | [.../python/apache\_beam/testing/benchmarks/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9iZW5jaG1hcmtzL19faW5pdF9fLnB5) | | |
   | [.../python/apache\_beam/examples/kafkataxi/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMva2Fma2F0YXhpL19faW5pdF9fLnB5) | | |
   | [.../examples/snippets/transforms/elementwise/pardo.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvc25pcHBldHMvdHJhbnNmb3Jtcy9lbGVtZW50d2lzZS9wYXJkby5weQ==) | | |
   | [sdks/python/apache\_beam/testing/test\_utils.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy90ZXN0X3V0aWxzLnB5) | | |
   | [...beam/runners/interactive/background\_caching\_job.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9pbnRlcmFjdGl2ZS9iYWNrZ3JvdW5kX2NhY2hpbmdfam9iLnB5) | | |
   | [.../apache\_beam/runners/dataflow/internal/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9kYXRhZmxvdy9pbnRlcm5hbC9fX2luaXRfXy5weQ==) | | |
   | [...\_beam/testing/benchmarks/nexmark/queries/query0.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9iZW5jaG1hcmtzL25leG1hcmsvcXVlcmllcy9xdWVyeTAucHk=) | | |
   | [sdks/python/apache\_beam/io/jdbc.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vamRiYy5weQ==) | | |
   | [sdks/python/apache\_beam/io/external/kafka.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZXh0ZXJuYWwva2Fma2EucHk=) | | |
   | ... and [923 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [7c4a21a...2452188](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (a57fb3d) into [master](https://codecov.io/gh/apache/beam/commit/4c81134dd9206fd6cb9060a9b973fc111b8fbd64?el=desc) (4c81134) will **increase** coverage by `0.25%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   82.77%   83.03%   +0.25%     
   ==========================================
     Files         466      470       +4     
     Lines       57562    58535     +973     
   ==========================================
   + Hits        47647    48604     +957     
   - Misses       9915     9931      +16     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [sdks/python/apache\_beam/io/snowflake.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vc25vd2ZsYWtlLnB5) | | |
   | [...s/python/apache\_beam/testing/datatype\_inference.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9kYXRhdHlwZV9pbmZlcmVuY2UucHk=) | | |
   | [...s/python/apache\_beam/testing/synthetic\_pipeline.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9zeW50aGV0aWNfcGlwZWxpbmUucHk=) | | |
   | [...\_beam/io/gcp/datastore/v1new/adaptive\_throttler.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZ2NwL2RhdGFzdG9yZS92MW5ldy9hZGFwdGl2ZV90aHJvdHRsZXIucHk=) | | |
   | [...\_beam/testing/benchmarks/nexmark/queries/query5.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9iZW5jaG1hcmtzL25leG1hcmsvcXVlcmllcy9xdWVyeTUucHk=) | | |
   | [sdks/python/apache\_beam/runners/worker/logger.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy93b3JrZXIvbG9nZ2VyLnB5) | | |
   | [...python/apache\_beam/examples/wordcount\_debugging.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvd29yZGNvdW50X2RlYnVnZ2luZy5weQ==) | | |
   | [...cp/internal/clients/storage/storage\_v1\_messages.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZ2NwL2ludGVybmFsL2NsaWVudHMvc3RvcmFnZS9zdG9yYWdlX3YxX21lc3NhZ2VzLnB5) | | |
   | [.../python/apache\_beam/io/gcp/bigquery\_io\_metadata.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZ2NwL2JpZ3F1ZXJ5X2lvX21ldGFkYXRhLnB5) | | |
   | [...on/apache\_beam/io/gcp/bigquery\_io\_read\_pipeline.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZ2NwL2JpZ3F1ZXJ5X2lvX3JlYWRfcGlwZWxpbmUucHk=) | | |
   | ... and [923 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [7c4a21a...a57fb3d](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (12920ec) into [master](https://codecov.io/gh/apache/beam/commit/67dc5fe49bd0dbc33ccccf7cde5749dc17adec08?el=desc) (67dc5fe) will **increase** coverage by `0.05%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   83.39%   83.45%   +0.05%     
   ==========================================
     Files         469      470       +1     
     Lines       58721    58952     +231     
   ==========================================
   + Hits        48972    49196     +224     
   - Misses       9749     9756       +7     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [...cp/internal/clients/storage/storage\_v1\_messages.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZ2NwL2ludGVybmFsL2NsaWVudHMvc3RvcmFnZS9zdG9yYWdlX3YxX21lc3NhZ2VzLnB5) | | |
   | [...hon/apache\_beam/examples/complete/game/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvY29tcGxldGUvZ2FtZS9fX2luaXRfXy5weQ==) | | |
   | [...he\_beam/runners/interactive/pipeline\_instrument.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9pbnRlcmFjdGl2ZS9waXBlbGluZV9pbnN0cnVtZW50LnB5) | | |
   | [...dks/python/apache\_beam/transforms/external\_java.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdHJhbnNmb3Jtcy9leHRlcm5hbF9qYXZhLnB5) | | |
   | [.../srcs/sdks/python/apache\_beam/examples/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvX19pbml0X18ucHk=) | | |
   | [.../apache\_beam/runners/direct/transform\_evaluator.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9kaXJlY3QvdHJhbnNmb3JtX2V2YWx1YXRvci5weQ==) | | |
   | [.../runners/portability/fn\_api\_runner/translations.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9mbl9hcGlfcnVubmVyL3RyYW5zbGF0aW9ucy5weQ==) | | |
   | [...ks/python/apache\_beam/internal/metrics/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW50ZXJuYWwvbWV0cmljcy9fX2luaXRfXy5weQ==) | | |
   | [.../python/apache\_beam/examples/kafkataxi/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMva2Fma2F0YXhpL19faW5pdF9fLnB5) | | |
   | [...sdks/python/apache\_beam/dataframe/partitionings.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZGF0YWZyYW1lL3BhcnRpdGlvbmluZ3MucHk=) | | |
   | ... and [929 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [67dc5fe...12920ec](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (2452188) into [master](https://codecov.io/gh/apache/beam/commit/4c81134dd9206fd6cb9060a9b973fc111b8fbd64?el=desc) (4c81134) will **increase** coverage by `0.28%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   82.77%   83.06%   +0.28%     
   ==========================================
     Files         466      470       +4     
     Lines       57562    58590    +1028     
   ==========================================
   + Hits        47647    48666    +1019     
   - Misses       9915     9924       +9     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [...dks/python/apache\_beam/runners/pipeline\_context.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9waXBlbGluZV9jb250ZXh0LnB5) | | |
   | [...dks/python/apache\_beam/io/gcp/gce\_metadata\_util.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZ2NwL2djZV9tZXRhZGF0YV91dGlsLnB5) | | |
   | [...ks/python/apache\_beam/internal/metrics/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW50ZXJuYWwvbWV0cmljcy9fX2luaXRfXy5weQ==) | | |
   | [...s/python/apache\_beam/testing/pipeline\_verifiers.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9waXBlbGluZV92ZXJpZmllcnMucHk=) | | |
   | [sdks/python/apache\_beam/io/parquetio.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vcGFycXVldGlvLnB5) | | |
   | [sdks/python/apache\_beam/examples/wordcount.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvd29yZGNvdW50LnB5) | | |
   | [...python/apache\_beam/examples/complete/distribopt.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvY29tcGxldGUvZGlzdHJpYm9wdC5weQ==) | | |
   | [...ache\_beam/examples/cookbook/datastore\_wordcount.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvY29va2Jvb2svZGF0YXN0b3JlX3dvcmRjb3VudC5weQ==) | | |
   | [sdks/python/apache\_beam/io/external/kafka.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZXh0ZXJuYWwva2Fma2EucHk=) | | |
   | [...\_beam/testing/benchmarks/nexmark/queries/query5.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9iZW5jaG1hcmtzL25leG1hcmsvcXVlcmllcy9xdWVyeTUucHk=) | | |
   | ... and [923 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [7c4a21a...2452188](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (4ec1a8f) into [master](https://codecov.io/gh/apache/beam/commit/4c81134dd9206fd6cb9060a9b973fc111b8fbd64?el=desc) (4c81134) will **increase** coverage by `0.19%`.
   > The diff coverage is `92.37%`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   82.77%   82.96%   +0.19%     
   ==========================================
     Files         466      470       +4     
     Lines       57562    58455     +893     
   ==========================================
   + Hits        47647    48499     +852     
   - Misses       9915     9956      +41     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [sdks/python/apache\_beam/transforms/trigger.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdHJhbnNmb3Jtcy90cmlnZ2VyLnB5) | `88.34% <75.00%> (-0.02%)` | :arrow_down: |
   | [...nners/portability/fn\_api\_runner/trigger\_manager.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9mbl9hcGlfcnVubmVyL3RyaWdnZXJfbWFuYWdlci5weQ==) | `92.92% <92.92%> (ø)` | |
   | [...dks/python/apache\_beam/options/pipeline\_options.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vb3B0aW9ucy9waXBlbGluZV9vcHRpb25zLnB5) | `94.60% <100.00%> (ø)` | |
   | [...\_beam/runners/portability/sdk\_container\_builder.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9zZGtfY29udGFpbmVyX2J1aWxkZXIucHk=) | `40.00% <0.00%> (-3.28%)` | :arrow_down: |
   | [sdks/python/apache\_beam/internal/metrics/metric.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW50ZXJuYWwvbWV0cmljcy9tZXRyaWMucHk=) | `86.45% <0.00%> (-1.05%)` | :arrow_down: |
   | [sdks/python/apache\_beam/dataframe/frames.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZGF0YWZyYW1lL2ZyYW1lcy5weQ==) | `91.07% <0.00%> (-0.90%)` | :arrow_down: |
   | [...ache\_beam/runners/portability/local\_job\_service.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9sb2NhbF9qb2Jfc2VydmljZS5weQ==) | `80.53% <0.00%> (-0.64%)` | :arrow_down: |
   | [...ks/python/apache\_beam/runners/worker/sdk\_worker.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy93b3JrZXIvc2RrX3dvcmtlci5weQ==) | `89.54% <0.00%> (-0.32%)` | :arrow_down: |
   | [sdks/python/apache\_beam/dataframe/expressions.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZGF0YWZyYW1lL2V4cHJlc3Npb25zLnB5) | `91.02% <0.00%> (-0.14%)` | :arrow_down: |
   | [sdks/python/apache\_beam/dataframe/schemas.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZGF0YWZyYW1lL3NjaGVtYXMucHk=) | `96.87% <0.00%> (-0.12%)` | :arrow_down: |
   | ... and [33 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [7c4a21a...b874eed](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (568db95) into [master](https://codecov.io/gh/apache/beam/commit/4c81134dd9206fd6cb9060a9b973fc111b8fbd64?el=desc) (4c81134) will **increase** coverage by `0.25%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   82.77%   83.03%   +0.25%     
   ==========================================
     Files         466      470       +4     
     Lines       57562    58534     +972     
   ==========================================
   + Hits        47647    48603     +956     
   - Misses       9915     9931      +16     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [...e\_beam/portability/api/beam\_interactive\_api\_pb2.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcG9ydGFiaWxpdHkvYXBpL2JlYW1faW50ZXJhY3RpdmVfYXBpX3BiMi5weQ==) | | |
   | [sdks/python/apache\_beam/dataframe/partitionings.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZGF0YWZyYW1lL3BhcnRpdGlvbmluZ3MucHk=) | | |
   | [.../apache\_beam/options/pipeline\_options\_validator.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vb3B0aW9ucy9waXBlbGluZV9vcHRpb25zX3ZhbGlkYXRvci5weQ==) | | |
   | [...hon/apache\_beam/runners/direct/test\_stream\_impl.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9kaXJlY3QvdGVzdF9zdHJlYW1faW1wbC5weQ==) | | |
   | [...che\_beam/runners/interactive/interactive\_runner.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9pbnRlcmFjdGl2ZS9pbnRlcmFjdGl2ZV9ydW5uZXIucHk=) | | |
   | [...n/apache\_beam/runners/direct/test\_direct\_runner.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9kaXJlY3QvdGVzdF9kaXJlY3RfcnVubmVyLnB5) | | |
   | [.../python/apache\_beam/transforms/periodicsequence.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdHJhbnNmb3Jtcy9wZXJpb2RpY3NlcXVlbmNlLnB5) | | |
   | [...he\_beam/examples/cookbook/multiple\_output\_pardo.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvY29va2Jvb2svbXVsdGlwbGVfb3V0cHV0X3BhcmRvLnB5) | | |
   | [...thon/apache\_beam/runners/worker/operation\_specs.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy93b3JrZXIvb3BlcmF0aW9uX3NwZWNzLnB5) | | |
   | [...beam/runners/interactive/background\_caching\_job.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9pbnRlcmFjdGl2ZS9iYWNrZ3JvdW5kX2NhY2hpbmdfam9iLnB5) | | |
   | ... and [923 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [7c4a21a...70b0383](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (568db95) into [master](https://codecov.io/gh/apache/beam/commit/4c81134dd9206fd6cb9060a9b973fc111b8fbd64?el=desc) (4c81134) will **increase** coverage by `0.25%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   82.77%   83.03%   +0.25%     
   ==========================================
     Files         466      470       +4     
     Lines       57562    58534     +972     
   ==========================================
   + Hits        47647    48603     +956     
   - Misses       9915     9931      +16     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [...runners/interactive/options/interactive\_options.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9pbnRlcmFjdGl2ZS9vcHRpb25zL2ludGVyYWN0aXZlX29wdGlvbnMucHk=) | | |
   | [...eam/runners/portability/fn\_api\_runner/execution.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9mbl9hcGlfcnVubmVyL2V4ZWN1dGlvbi5weQ==) | | |
   | [sdks/python/apache\_beam/metrics/metric.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vbWV0cmljcy9tZXRyaWMucHk=) | | |
   | [...python/apache\_beam/io/gcp/datastore/v1new/types.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZ2NwL2RhdGFzdG9yZS92MW5ldy90eXBlcy5weQ==) | | |
   | [...\_beam/testing/benchmarks/nexmark/queries/query5.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9iZW5jaG1hcmtzL25leG1hcmsvcXVlcmllcy9xdWVyeTUucHk=) | | |
   | [...s/snippets/transforms/aggregation/combinevalues.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvc25pcHBldHMvdHJhbnNmb3Jtcy9hZ2dyZWdhdGlvbi9jb21iaW5ldmFsdWVzLnB5) | | |
   | [...ive/messaging/interactive\_environment\_inspector.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9pbnRlcmFjdGl2ZS9tZXNzYWdpbmcvaW50ZXJhY3RpdmVfZW52aXJvbm1lbnRfaW5zcGVjdG9yLnB5) | | |
   | [sdks/python/apache\_beam/dataframe/frames.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZGF0YWZyYW1lL2ZyYW1lcy5weQ==) | | |
   | [sdks/python/apache\_beam/transforms/sideinputs.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdHJhbnNmb3Jtcy9zaWRlaW5wdXRzLnB5) | | |
   | [...s/python/apache\_beam/examples/wordcount\_minimal.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvd29yZGNvdW50X21pbmltYWwucHk=) | | |
   | ... and [923 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [7c4a21a...568db95](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] y1chi commented on a change in pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
y1chi commented on a change in pull request #13985:
URL: https://github.com/apache/beam/pull/13985#discussion_r601727284



##########
File path: sdks/python/apache_beam/runners/portability/fn_api_runner/trigger_manager.py
##########
@@ -0,0 +1,457 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import collections
+import logging
+import typing
+from collections import defaultdict
+
+from apache_beam import typehints
+from apache_beam.coders import PickleCoder
+from apache_beam.coders import StrUtf8Coder
+from apache_beam.coders import TupleCoder
+from apache_beam.coders import VarIntCoder
+from apache_beam.coders.coders import IntervalWindowCoder
+from apache_beam.transforms import DoFn
+from apache_beam.transforms.core import Windowing
+from apache_beam.transforms.timeutil import TimeDomain
+from apache_beam.transforms.trigger import AccumulationMode
+from apache_beam.transforms.trigger import TriggerContext
+from apache_beam.transforms.trigger import _CombiningValueStateTag
+from apache_beam.transforms.trigger import _StateTag
+from apache_beam.transforms.userstate import AccumulatingRuntimeState
+from apache_beam.transforms.userstate import BagRuntimeState
+from apache_beam.transforms.userstate import BagStateSpec
+from apache_beam.transforms.userstate import CombiningValueStateSpec
+from apache_beam.transforms.userstate import RuntimeTimer
+from apache_beam.transforms.userstate import SetRuntimeState
+from apache_beam.transforms.userstate import SetStateSpec
+from apache_beam.transforms.userstate import TimerSpec
+from apache_beam.transforms.userstate import on_timer
+from apache_beam.transforms.window import BoundedWindow
+from apache_beam.transforms.window import GlobalWindow
+from apache_beam.transforms.window import TimestampedValue
+from apache_beam.transforms.window import WindowFn
+from apache_beam.typehints import TypeCheckError
+from apache_beam.utils import windowed_value
+from apache_beam.utils.timestamp import MIN_TIMESTAMP
+from apache_beam.utils.timestamp import Timestamp
+from apache_beam.utils.windowed_value import WindowedValue
+
+_LOGGER = logging.getLogger(__name__)
+_LOGGER.setLevel(logging.DEBUG)
+
+K = typing.TypeVar('K')
+
+
+class _ReifyWindows(DoFn):
+  """Receives KV pairs, and wraps the values into WindowedValues."""
+  def process(
+      self, element, window=DoFn.WindowParam, timestamp=DoFn.TimestampParam):
+    try:
+      k, v = element
+    except TypeError:
+      raise TypeCheckError(
+          'Input to GroupByKey must be a PCollection with '
+          'elements compatible with KV[A, B]')
+
+    yield (k, windowed_value.WindowedValue(v, timestamp, [window]))
+
+
+class _GroupBundlesByKey(DoFn):
+  def start_bundle(self):
+    self.keys = defaultdict(list)
+
+  def process(self, element):
+    key, windowed_value = element
+    self.keys[key].append(windowed_value)
+
+  def finish_bundle(self):
+    for k, vals in self.keys.items():
+      yield windowed_value.WindowedValue((k, vals),
+                                         MIN_TIMESTAMP, [GlobalWindow()])
+
+
+def read_watermark(watermark_state):
+  try:
+    return watermark_state.read()
+  except ValueError:
+    watermark_state.add(MIN_TIMESTAMP)
+    return watermark_state.read()
+
+
+class TriggerMergeContext(WindowFn.MergeContext):
+  def __init__(
+      self, all_windows, context: 'FnRunnerStatefulTriggerContext', windowing):
+    super(TriggerMergeContext, self).__init__(all_windows)
+    self.trigger_context = context
+    self.windowing = windowing
+    self.merged_away: typing.Dict[BoundedWindow, BoundedWindow] = {}
+
+  def merge(self, to_be_merged, merge_result):
+    _LOGGER.debug("Merging %s into %s", to_be_merged, merge_result)
+    self.trigger_context.merge_windows(to_be_merged, merge_result)
+    for window in to_be_merged:
+      if window != merge_result:
+        self.merged_away[window] = merge_result
+        # Clear state associated with PaneInfo since it is
+        # not preserved across merges.
+        self.trigger_context.for_window(window).clear_state(None)
+    self.windowing.triggerfn.on_merge(
+        to_be_merged,
+        merge_result,
+        self.trigger_context.for_window(merge_result))
+
+
+@typehints.with_input_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+@typehints.with_output_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+class GeneralTriggerManagerDoFn(DoFn):
+  """A trigger manager that supports all windowing / triggering cases.
+
+  This implements a DoFn that manages triggering on a per-key basis. All
+  elements for a single key are processed together. Per-key state holds data
+  related to all windows.
+  """
+
+  # TODO(BEAM-12026) Add support for Global and custom window fns.
+  KNOWN_WINDOWS = SetStateSpec('known_windows', IntervalWindowCoder())
+  FINISHED_WINDOWS = SetStateSpec('finished_windows', IntervalWindowCoder())
+  LAST_KNOWN_TIME = CombiningValueStateSpec('last_known_time', combine_fn=max)
+  LAST_KNOWN_WATERMARK = CombiningValueStateSpec(
+      'last_known_watermark', combine_fn=max)
+
+  # TODO(pabloem) What's the coder for the elements/keys here?
+  WINDOW_ELEMENT_PAIRS = BagStateSpec(
+      'all_elements', TupleCoder([IntervalWindowCoder(), PickleCoder()]))
+  WINDOW_TAG_VALUES = BagStateSpec(
+      'per_window_per_tag_value_state',
+      TupleCoder([IntervalWindowCoder(), StrUtf8Coder(), VarIntCoder()]))
+
+  PROCESSING_TIME_TIMER = TimerSpec(
+      'processing_time_timer', TimeDomain.REAL_TIME)
+  WATERMARK_TIMER = TimerSpec('watermark_timer', TimeDomain.WATERMARK)
+
+  def __init__(self, windowing: Windowing):
+    self.windowing = windowing
+    # Only session windows are merging. Other windows are non-merging.
+    self.merging_windows = self.windowing.windowfn.is_merging()
+
+  def process(
+      self,
+      element: typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]],
+      all_elements: BagRuntimeState = DoFn.StateParam(WINDOW_ELEMENT_PAIRS),  # type: ignore
+      latest_processing_time: AccumulatingRuntimeState = DoFn.StateParam(LAST_KNOWN_TIME),  # type: ignore
+      latest_watermark: AccumulatingRuntimeState = DoFn.StateParam(  # type: ignore
+          LAST_KNOWN_WATERMARK),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      windows_state: SetRuntimeState = DoFn.StateParam(KNOWN_WINDOWS),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER),
+      *args, **kwargs):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        latest_processing_time=latest_processing_time,
+        latest_watermark=latest_watermark,
+        all_elements_state=all_elements,
+        window_tag_values=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    key, windowed_values = element
+    watermark = read_watermark(latest_watermark)
+
+    windows_to_elements = collections.defaultdict(list)
+    for wv in windowed_values:
+      for window in wv.windows:
+        # ignore expired windows
+        if watermark > window.end + self.windowing.allowed_lateness:
+          continue
+        if window in finished_windows_state.read():
+          continue
+        windows_to_elements[window].append(
+            TimestampedValue(wv.value, wv.timestamp))
+
+    # Processing merging of windows
+    if self.merging_windows:
+      old_windows = set(windows_state.read())
+      all_windows = old_windows.union(list(windows_to_elements))
+      if all_windows != old_windows:
+        merge_context = TriggerMergeContext(
+            all_windows, context, self.windowing)
+        self.windowing.windowfn.merge(merge_context)
+
+        merged_windows_to_elements = collections.defaultdict(list)
+        for window, values in windows_to_elements.items():
+          while window in merge_context.merged_away:
+            window = merge_context.merged_away[window]
+          merged_windows_to_elements[window].extend(values)
+        windows_to_elements = merged_windows_to_elements
+
+      for w in windows_to_elements:
+        windows_state.add(w)
+    # Done processing merging of windows
+
+    seen_windows = set()
+    for w in windows_to_elements:
+      window_context = context.for_window(w)
+      seen_windows.add(w)
+      for value_w_timestamp in windows_to_elements[w]:
+        _LOGGER.debug(value_w_timestamp)
+        all_elements.add((w, value_w_timestamp))
+        self.windowing.triggerfn.on_element(windowed_values, w, window_context)
+
+    return self._fire_eligible_windows(
+        key, TimeDomain.WATERMARK, watermark, None, context, seen_windows)
+
+  def _fire_eligible_windows(
+      self,
+      key: K,
+      time_domain,
+      timestamp: Timestamp,
+      timer_tag: typing.Optional[str],
+      context: 'FnRunnerStatefulTriggerContext',
+      windows_of_interest: typing.Optional[typing.Set[BoundedWindow]] = None):
+    windows_to_elements = context.windows_to_elements_map()
+    context.all_elements_state.clear()
+
+    fired_windows = set()
+    _LOGGER.debug(
+        '%s - tag %s - timestamp %s', time_domain, timer_tag, timestamp)
+    for w, elems in windows_to_elements.items():
+      if windows_of_interest is not None and w not in windows_of_interest:
+        # windows_of_interest=None means that we care about all windows.
+        # If we care only about some windows, and this window is not one of
+        # them, then we do not intend to fire this window.
+        continue
+      window_context = context.for_window(w)
+      if self.windowing.triggerfn.should_fire(time_domain,
+                                              timestamp,
+                                              w,
+                                              window_context):
+        finished = self.windowing.triggerfn.on_fire(
+            timestamp, w, window_context)
+        _LOGGER.debug('Firing on window %s. Finished: %s', w, finished)
+        fired_windows.add(w)
+        if finished:
+          context.finished_windows_state.add(w)
+        # TODO(pabloem): Format the output: e.g. pane info
+        elems = [WindowedValue(e.value, e.timestamp, (w, )) for e in elems]
+        yield (key, elems)
+
+    finished_windows: typing.Set[BoundedWindow] = set(
+        context.finished_windows_state.read())
+    # Add elements that were not fired back into state.
+    for w, elems in windows_to_elements.items():
+      for e in elems:
+        if (w in finished_windows or
+            (w in fired_windows and
+             self.windowing.accumulation_mode == AccumulationMode.DISCARDING)):
+          continue
+        context.all_elements_state.add((w, e))
+
+  @on_timer(PROCESSING_TIME_TIMER)
+  def processing_time_trigger(
+      self,
+      key=DoFn.KeyParam,
+      timer_tag=DoFn.DynamicTimerTagParam,
+      timestamp=DoFn.TimestampParam,
+      latest_processing_time=DoFn.StateParam(LAST_KNOWN_TIME),
+      all_elements=DoFn.StateParam(WINDOW_ELEMENT_PAIRS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER)):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        latest_processing_time=latest_processing_time,
+        latest_watermark=None,
+        all_elements_state=all_elements,
+        window_tag_values=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    result = self._fire_eligible_windows(
+        key, TimeDomain.REAL_TIME, timestamp, timer_tag, context)
+    latest_processing_time.add(timestamp)
+    return result
+
+  @on_timer(WATERMARK_TIMER)
+  def watermark_trigger(
+      self,
+      key=DoFn.KeyParam,
+      timer_tag=DoFn.DynamicTimerTagParam,
+      timestamp=DoFn.TimestampParam,
+      latest_watermark=DoFn.StateParam(LAST_KNOWN_WATERMARK),
+      all_elements=DoFn.StateParam(WINDOW_ELEMENT_PAIRS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER)):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        latest_processing_time=None,
+        latest_watermark=latest_watermark,
+        all_elements_state=all_elements,
+        window_tag_values=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    result = self._fire_eligible_windows(
+        key, TimeDomain.WATERMARK, timestamp, timer_tag, context)
+    latest_watermark.add(timestamp)
+    return result
+
+
+class FnRunnerStatefulTriggerContext(TriggerContext):
+  def __init__(
+      self,
+      processing_time_timer: RuntimeTimer,
+      watermark_timer: RuntimeTimer,
+      latest_processing_time: typing.Optional[AccumulatingRuntimeState],
+      latest_watermark: typing.Optional[AccumulatingRuntimeState],
+      all_elements_state: BagRuntimeState,
+      window_tag_values: BagRuntimeState,
+      finished_windows_state: SetRuntimeState):
+    self.timers = {
+        TimeDomain.REAL_TIME: processing_time_timer,
+        TimeDomain.WATERMARK: watermark_timer
+    }
+    self.current_times = {
+        TimeDomain.REAL_TIME: latest_processing_time,
+        TimeDomain.WATERMARK: latest_watermark
+    }
+    self.all_elements_state = all_elements_state
+    self.window_tag_values = window_tag_values
+    self.finished_windows_state = finished_windows_state
+
+  def windows_to_elements_map(
+      self
+  ) -> typing.Dict[BoundedWindow, typing.List[windowed_value.WindowedValue]]:
+    window_element_pairs: typing.Iterable[typing.Tuple[
+        BoundedWindow,
+        windowed_value.WindowedValue]] = self.all_elements_state.read()
+    result: typing.Dict[BoundedWindow,
+                        typing.List[windowed_value.WindowedValue]] = {}
+    for w, e in window_element_pairs:
+      if w not in result:
+        result[w] = []
+      result[w].append(e)
+    return result
+
+  def for_window(self, window):
+    return PerWindowTriggerContext(window, self)
+
+  def get_current_time(self):
+    return self.current_times[TimeDomain.REAL_TIME].read()
+
+  def set_timer(self, name, time_domain, timestamp):
+    _LOGGER.debug('Setting timer (%s, %s) at %s', time_domain, name, timestamp)
+    self.timers[time_domain].set(timestamp, dynamic_timer_tag=name)
+
+  def clear_timer(self, name, time_domain):
+    _LOGGER.debug('Clearing timer (%s, %s)', time_domain, name)
+    self.timers[time_domain].clear(dynamic_timer_tag=name)
+
+  def merge_windows(self, to_be_merged, merge_result):
+    all_triplets = list(self.window_tag_values.read())
+    # Collect all the triplets for the window we are merging away, and tag them
+    # with the new window (merge_result).
+    merging_away_triplets = [(merge_result, t[1], t[2]) for t in all_triplets

Review comment:
       I think it will be more readable if we unpack the triplets, for example:
   ```
   [(merge_result, state_tag, state) for (window, state_tag, state) in all_triplets if window in to_be_merged]
   ```
   WDYT? Same for other list comprehensions in this file.
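   
   For comparison, a minimal self-contained sketch of the two styles (the window names, tag, and counts below are placeholder data, not the actual runtime state objects from this file):
   ```python
   # Placeholder triplets standing in for (window, state_tag, state) entries.
   to_be_merged = {'w1', 'w2'}
   merge_result = 'w1+w2'
   all_triplets = [('w1', 'count', 3), ('w2', 'count', 5), ('w3', 'count', 7)]
   
   # Index-based style, as in the diff above:
   retagged = [(merge_result, t[1], t[2]) for t in all_triplets
               if t[0] in to_be_merged]
   
   # Unpacked style suggested in this comment:
   retagged_unpacked = [(merge_result, state_tag, state)
                        for (window, state_tag, state) in all_triplets
                        if window in to_be_merged]
   
   assert retagged == retagged_unpacked
   ```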




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] pabloem commented on a change in pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
pabloem commented on a change in pull request #13985:
URL: https://github.com/apache/beam/pull/13985#discussion_r599146609



##########
File path: sdks/python/apache_beam/runners/portability/fn_api_runner/trigger_manager.py
##########
@@ -0,0 +1,459 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import collections
+import logging
+import typing
+from collections import defaultdict
+
+from apache_beam import typehints
+from apache_beam.coders import PickleCoder
+from apache_beam.coders import StrUtf8Coder
+from apache_beam.coders import TupleCoder
+from apache_beam.coders import VarIntCoder
+from apache_beam.coders.coders import IntervalWindowCoder
+from apache_beam.transforms import DoFn
+from apache_beam.transforms.core import Windowing
+from apache_beam.transforms.timeutil import TimeDomain
+from apache_beam.transforms.trigger import AccumulationMode
+from apache_beam.transforms.trigger import TriggerContext
+from apache_beam.transforms.trigger import _CombiningValueStateTag
+from apache_beam.transforms.trigger import _StateTag
+from apache_beam.transforms.userstate import AccumulatingRuntimeState
+from apache_beam.transforms.userstate import BagRuntimeState
+from apache_beam.transforms.userstate import BagStateSpec
+from apache_beam.transforms.userstate import CombiningValueStateSpec
+from apache_beam.transforms.userstate import RuntimeTimer
+from apache_beam.transforms.userstate import SetRuntimeState
+from apache_beam.transforms.userstate import SetStateSpec
+from apache_beam.transforms.userstate import TimerSpec
+from apache_beam.transforms.userstate import on_timer
+from apache_beam.transforms.window import BoundedWindow
+from apache_beam.transforms.window import GlobalWindow
+from apache_beam.transforms.window import Sessions
+from apache_beam.transforms.window import TimestampedValue
+from apache_beam.transforms.window import WindowFn
+from apache_beam.typehints import TypeCheckError
+from apache_beam.utils import windowed_value
+from apache_beam.utils.timestamp import MIN_TIMESTAMP
+from apache_beam.utils.timestamp import Timestamp
+from apache_beam.utils.windowed_value import WindowedValue
+
+_LOGGER = logging.getLogger(__name__)
+_LOGGER.setLevel(logging.DEBUG)
+
+K = typing.TypeVar('K')
+V = typing.TypeVar('V')
+
+
+class ReifyWindows(DoFn):
+  """Receives KV pairs, and wraps the values into WindowedValues."""
+  def process(
+      self, element, window=DoFn.WindowParam, timestamp=DoFn.TimestampParam):
+    try:
+      k, v = element
+    except TypeError:
+      raise TypeCheckError(
+          'Input to GroupByKey must be a PCollection with '
+          'elements compatible with KV[A, B]')
+
+    yield (k, windowed_value.WindowedValue(v, timestamp, [window]))
+
+
+class _GroupBundlesByKey(DoFn):
+  def start_bundle(self):
+    self.keys = defaultdict(list)
+
+  def process(self, element):
+    key, windowed_value = element
+    self.keys[key].append(windowed_value)
+
+  def finish_bundle(self):
+    for k, vals in self.keys.items():
+      yield windowed_value.WindowedValue((k, vals),
+                                         MIN_TIMESTAMP, [GlobalWindow()])
+
+
+def read_watermark(watermark_state):
+  try:
+    return watermark_state.read()
+  except ValueError:
+    watermark_state.add(MIN_TIMESTAMP)
+    return watermark_state.read()
+
+
+class TriggerMergeContext(WindowFn.MergeContext):
+  def __init__(
+      self, all_windows, context: 'FnRunnerStatefulTriggerContext', windowing):
+    super(TriggerMergeContext, self).__init__(all_windows)
+    self.trigger_context = context
+    self.windowing = windowing
+    self.merged_away: typing.Dict[BoundedWindow, BoundedWindow] = {}
+
+  def merge(self, to_be_merged, merge_result):
+    _LOGGER.debug("Merging %s into %s", to_be_merged, merge_result)
+    self.trigger_context.merge_state(to_be_merged, merge_result)
+    for window in to_be_merged:
+      if window != merge_result:
+        self.merged_away[window] = merge_result
+        # Clear state associated with PaneInfo since it is
+        # not preserved across merges.
+        self.trigger_context.for_window(window).clear_state(None)
+    self.windowing.triggerfn.on_merge(
+        to_be_merged,
+        merge_result,
+        self.trigger_context.for_window(merge_result))
+
+
+@typehints.with_input_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+@typehints.with_output_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+class GeneralTriggerManagerDoFn(DoFn):
+  """A trigger manager that supports all windowing / triggering cases.
+
+  This implements a DoFn that manages triggering on a per-key basis. All
+  elements for a single key are processed together. Per-key state holds data
+  related to all windows.
+  """
+
+  KNOWN_WINDOWS = SetStateSpec('known_windows', IntervalWindowCoder())
+  FINISHED_WINDOWS = SetStateSpec('finished_windows', IntervalWindowCoder())
+  LAST_KNOWN_TIME = CombiningValueStateSpec('last_known_time', combine_fn=max)
+  LAST_KNOWN_WATERMARK = CombiningValueStateSpec(
+      'last_known_watermark', combine_fn=max)
+
+  # TODO(pabloem) What's the coder for the elements/keys here?
+  WINDOW_ELEMENT_PAIRS = BagStateSpec(
+      'element_bag', TupleCoder([IntervalWindowCoder(), PickleCoder()]))
+  WINDOW_TAG_VALUES = BagStateSpec(
+      'per_window_per_tag_value_state',
+      TupleCoder([IntervalWindowCoder(), StrUtf8Coder(), VarIntCoder()]))
+
+  PROCESSING_TIME_TIMER = TimerSpec(
+      'processing_time_timer', TimeDomain.REAL_TIME)
+  WATERMARK_TIMER = TimerSpec('watermark_timer', TimeDomain.WATERMARK)
+
+  def __init__(self, windowing: Windowing):
+    self.windowing = windowing
+    # Only session windows are merging. Other windows are non-merging.
+    self.merging_windows = isinstance(windowing.windowfn, Sessions)
+
+  def process(
+      self,
+      element: typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]],
+      element_bag: BagRuntimeState = DoFn.StateParam(WINDOW_ELEMENT_PAIRS),  # type: ignore
+      time_state: AccumulatingRuntimeState = DoFn.StateParam(LAST_KNOWN_TIME),  # type: ignore
+      watermark_state: AccumulatingRuntimeState = DoFn.StateParam(  # type: ignore
+          LAST_KNOWN_WATERMARK),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      windows_state: SetRuntimeState = DoFn.StateParam(KNOWN_WINDOWS),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER),
+      *args, **kwargs):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        current_time_state=time_state,
+        watermark_state=watermark_state,
+        elements_bag_state=element_bag,
+        window_tag_values_bag_state=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    key, windowed_values = element
+    watermark = read_watermark(watermark_state)
+
+    windows_to_elements = collections.defaultdict(list)
+    for wv in windowed_values:
+      for window in wv.windows:
+        # ignore expired windows
+        if watermark > window.end + self.windowing.allowed_lateness:
+          continue
+        if window in finished_windows_state.read():
+          continue
+        windows_to_elements[window].append(
+            TimestampedValue(wv.value, wv.timestamp))
+
+    # Processing merging of windows
+    if self.merging_windows:
+      old_windows = set(windows_state.read())
+      all_windows = old_windows.union(list(windows_to_elements))
+      if all_windows != old_windows:
+        merge_context = TriggerMergeContext(
+            all_windows, context, self.windowing)
+        self.windowing.windowfn.merge(merge_context)
+
+        merged_windows_to_elements = collections.defaultdict(list)
+        for window, values in windows_to_elements.items():
+          while window in merge_context.merged_away:
+            window = merge_context.merged_away[window]
+          merged_windows_to_elements[window].extend(values)
+        windows_to_elements = merged_windows_to_elements
+
+      if old_windows != all_windows:
+        for w in windows_to_elements:
+          windows_state.add(w)
+    # Done processing merging of windows
+
+    seen_windows = set()
+    for w in windows_to_elements:
+      window_context = context.for_window(w)
+      seen_windows.add(w)
+      for value_w_timestamp in windows_to_elements[w]:
+        _LOGGER.debug(value_w_timestamp)
+        element_bag.add((w, value_w_timestamp))
+        self.windowing.triggerfn.on_element(windowed_values, w, window_context)
+
+    return self._trigger_fire(
+        key, TimeDomain.WATERMARK, watermark, None, context, seen_windows)
+
+  def _trigger_fire(
+      self,
+      key: K,
+      time_domain,
+      timestamp: Timestamp,
+      timer_tag: typing.Optional[str],
+      context: 'FnRunnerStatefulTriggerContext',
+      windows_of_interest: typing.Optional[typing.Set[BoundedWindow]] = None):
+    windows_to_elements = context.windows_to_elements_map()
+    context.elements_bag_state.clear()
+
+    fired_windows = set()
+    finished_windows = set()
+    _LOGGER.debug(
+        '%s - tag %s - timestamp %s', time_domain, timer_tag, timestamp)
+    for w, elems in windows_to_elements.items():
+      if windows_of_interest is not None and w not in windows_of_interest:
+        # windows_of_interest=None means that we care about all windows.
+        # If we care only about some windows, and this window is not one of
+        # them, then we do not intend to fire this window.
+        continue
+      window_context = context.for_window(w)
+      if self.windowing.triggerfn.should_fire(time_domain,
+                                              timestamp,
+                                              w,
+                                              window_context):
+        finished = self.windowing.triggerfn.on_fire(
+            timestamp, w, window_context)
+        _LOGGER.debug('Firing on window %s. Finished: %s', w, finished)
+        fired_windows.add(w)
+        if finished:
+          context.finished_windows_state.add(w)
+          finished_windows.add(w)
+        # TODO(pabloem): Format the output: e.g. pane info
+        elems = [WindowedValue(e.value, e.timestamp, (w, )) for e in elems]
+        yield (key, elems)
+
+    # Add elements that were not fired back into state.
+    for w, elems in windows_to_elements.items():
+      for e in elems:
+        if (w in finished_windows or
+            (w in fired_windows and
+             self.windowing.accumulation_mode == AccumulationMode.DISCARDING)):
+          continue
+        context.elements_bag_state.add((w, e))
+
+  @on_timer(PROCESSING_TIME_TIMER)
+  def processing_time_trigger(
+      self,
+      key=DoFn.KeyParam,
+      timer_tag=DoFn.DynamicTimerTagParam,
+      timestamp=DoFn.TimestampParam,
+      processing_time_state=DoFn.StateParam(LAST_KNOWN_TIME),
+      element_bag=DoFn.StateParam(WINDOW_ELEMENT_PAIRS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER)):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        current_time_state=processing_time_state,
+        watermark_state=None,
+        elements_bag_state=element_bag,
+        window_tag_values_bag_state=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    result = self._trigger_fire(
+        key, TimeDomain.REAL_TIME, timestamp, timer_tag, context)
+    processing_time_state.add(timestamp)
+    return result
+
+  @on_timer(WATERMARK_TIMER)
+  def watermark_trigger(
+      self,
+      key=DoFn.KeyParam,
+      timer_tag=DoFn.DynamicTimerTagParam,
+      timestamp=DoFn.TimestampParam,
+      watermark_state=DoFn.StateParam(LAST_KNOWN_WATERMARK),
+      element_bag=DoFn.StateParam(WINDOW_ELEMENT_PAIRS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER)):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        current_time_state=None,
+        watermark_state=watermark_state,
+        elements_bag_state=element_bag,
+        window_tag_values_bag_state=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    result = self._trigger_fire(
+        key, TimeDomain.WATERMARK, timestamp, timer_tag, context)
+    watermark_state.add(timestamp)
+    return result
+
+
+class FnRunnerStatefulTriggerContext(TriggerContext):
+  def __init__(
+      self,
+      processing_time_timer: RuntimeTimer,
+      watermark_timer: RuntimeTimer,
+      current_time_state: typing.Optional[AccumulatingRuntimeState],
+      watermark_state: typing.Optional[AccumulatingRuntimeState],
+      elements_bag_state: BagRuntimeState,
+      window_tag_values_bag_state: BagRuntimeState,
+      finished_windows_state: SetRuntimeState):
+    self.timers = {
+        TimeDomain.REAL_TIME: processing_time_timer,
+        TimeDomain.WATERMARK: watermark_timer
+    }
+    self.current_times = {
+        TimeDomain.REAL_TIME: current_time_state,
+        TimeDomain.WATERMARK: watermark_state
+    }
+    self.elements_bag_state = elements_bag_state
+    self.window_tag_values_bag_state = window_tag_values_bag_state
+    self.finished_windows_state = finished_windows_state
+
+  def windows_to_elements_map(
+      self
+  ) -> typing.Dict[BoundedWindow, typing.List[windowed_value.WindowedValue]]:
+    window_element_pairs: typing.Iterable[typing.Tuple[
+        BoundedWindow,
+        windowed_value.WindowedValue]] = self.elements_bag_state.read()
+    result: typing.Dict[BoundedWindow,
+                        typing.List[windowed_value.WindowedValue]] = {}
+    for w, e in window_element_pairs:
+      if w not in result:
+        result[w] = []
+      result[w].append(e)
+    return result
+
+  def for_window(self, window):
+    return PerWindowTriggerContext(window, self)
+
+  def get_current_time(self):
+    return self.current_times[TimeDomain.REAL_TIME].read()
+
+  def set_timer(self, name, time_domain, timestamp):
+    _LOGGER.debug('Setting timer (%s, %s) at %s', time_domain, name, timestamp)
+    self.timers[time_domain].set(timestamp, dynamic_timer_tag=name)
+
+  def clear_timer(self, name, time_domain):
+    _LOGGER.debug('Clearing timer (%s, %s)', time_domain, name)
+    self.timers[time_domain].clear(dynamic_timer_tag=name)
+
+  def merge_state(self, to_be_merged, merge_result):
+    all_triplets = self.window_tag_values_bag_state.read()
+    # Collect all the triplets for the windows being merged away, and tag them
+    # with the new window (merge_result).
+    merging_away_triplets = [(merge_result, t[1], t[2]) for t in all_triplets
+                             if t[0] in to_be_merged]
+
+    # Collect all of the other triplets, and join them with the newly-tagged
+    # set of triplets.
+    resulting_triplets = [
+        t for t in all_triplets if t[0] not in to_be_merged

Review comment:
       it is not reiterable
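
The note appears to target the bag-state read a few lines up: `window_tag_values_bag_state.read()` is consumed by two separate list comprehensions, but the value returned by `.read()` is not guaranteed to survive a second pass. A minimal sketch of the defensive pattern, with a hypothetical `bag_state` handle standing in for the real state object (the helper name and signature are illustrative, not the PR's code):

```python
def retag_merged_triplets(bag_state, to_be_merged, merge_result):
  # Materialize the read once so the triplets can be iterated more than once.
  all_triplets = list(bag_state.read())
  # Triplets whose window is being merged away are re-tagged with merge_result.
  retagged = [(merge_result, tag, value) for (window, tag, value) in all_triplets
              if window in to_be_merged]
  # The remaining triplets are kept as-is and joined with the re-tagged ones.
  untouched = [t for t in all_triplets if t[0] not in to_be_merged]
  return retagged + untouched
```

With the read materialized up front, both comprehensions see the same data regardless of whether the underlying state iterable supports re-iteration.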







[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (12920ec) into [master](https://codecov.io/gh/apache/beam/commit/67dc5fe49bd0dbc33ccccf7cde5749dc17adec08?el=desc) (67dc5fe) will **increase** coverage by `0.05%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   83.39%   83.45%   +0.05%     
   ==========================================
     Files         469      470       +1     
     Lines       58721    58952     +231     
   ==========================================
   + Hits        48972    49196     +224     
   - Misses       9749     9756       +7     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [...am/examples/complete/juliaset/juliaset/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvY29tcGxldGUvanVsaWFzZXQvanVsaWFzZXQvX19pbml0X18ucHk=) | | |
   | [...s/sdks/python/apache\_beam/io/gcp/tests/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZ2NwL3Rlc3RzL19faW5pdF9fLnB5) | | |
   | [...ernal/clients/cloudbuild/cloudbuild\_v1\_messages.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9kYXRhZmxvdy9pbnRlcm5hbC9jbGllbnRzL2Nsb3VkYnVpbGQvY2xvdWRidWlsZF92MV9tZXNzYWdlcy5weQ==) | | |
   | [...pache\_beam/runners/portability/artifact\_service.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9hcnRpZmFjdF9zZXJ2aWNlLnB5) | | |
   | [...ache\_beam/portability/api/beam\_artifact\_api\_pb2.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcG9ydGFiaWxpdHkvYXBpL2JlYW1fYXJ0aWZhY3RfYXBpX3BiMi5weQ==) | | |
   | [...s/sdks/python/apache\_beam/testing/test\_pipeline.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy90ZXN0X3BpcGVsaW5lLnB5) | | |
   | [...8/build/srcs/sdks/python/apache\_beam/io/kinesis.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8va2luZXNpcy5weQ==) | | |
   | [.../examples/snippets/transforms/elementwise/pardo.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvc25pcHBldHMvdHJhbnNmb3Jtcy9lbGVtZW50d2lzZS9wYXJkby5weQ==) | | |
   | [...\_beam/testing/benchmarks/nexmark/queries/query5.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9iZW5jaG1hcmtzL25leG1hcmsvcXVlcmllcy9xdWVyeTUucHk=) | | |
   | [...ild/srcs/sdks/python/apache\_beam/metrics/metric.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vbWV0cmljcy9tZXRyaWMucHk=) | | |
   | ... and [929 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [67dc5fe...12920ec](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   





[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (cc6607c) into [master](https://codecov.io/gh/apache/beam/commit/4c81134dd9206fd6cb9060a9b973fc111b8fbd64?el=desc) (4c81134) will **increase** coverage by `0.26%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   82.77%   83.03%   +0.26%     
   ==========================================
     Files         466      470       +4     
     Lines       57562    58565    +1003     
   ==========================================
   + Hits        47647    48631     +984     
   - Misses       9915     9934      +19     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [...les/complete/juliaset/juliaset/juliaset\_test\_it.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvY29tcGxldGUvanVsaWFzZXQvanVsaWFzZXQvanVsaWFzZXRfdGVzdF9pdC5weQ==) | | |
   | [sdks/python/apache\_beam/io/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vX19pbml0X18ucHk=) | | |
   | [sdks/python/apache\_beam/dataframe/convert.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZGF0YWZyYW1lL2NvbnZlcnQucHk=) | | |
   | [...he\_beam/portability/api/external\_transforms\_pb2.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcG9ydGFiaWxpdHkvYXBpL2V4dGVybmFsX3RyYW5zZm9ybXNfcGIyLnB5) | | |
   | [sdks/python/apache\_beam/io/aws/clients/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vYXdzL2NsaWVudHMvX19pbml0X18ucHk=) | | |
   | [...s/python/apache\_beam/io/gcp/bigquery\_file\_loads.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZ2NwL2JpZ3F1ZXJ5X2ZpbGVfbG9hZHMucHk=) | | |
   | [...beam/runners/portability/fn\_api\_runner/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9mbl9hcGlfcnVubmVyL19faW5pdF9fLnB5) | | |
   | [...am/testing/benchmarks/nexmark/models/field\_name.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9iZW5jaG1hcmtzL25leG1hcmsvbW9kZWxzL2ZpZWxkX25hbWUucHk=) | | |
   | [sdks/python/apache\_beam/examples/sql\_taxi.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvc3FsX3RheGkucHk=) | | |
   | [sdks/python/apache\_beam/utils/histogram.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdXRpbHMvaGlzdG9ncmFtLnB5) | | |
   | ... and [923 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [7c4a21a...cc6607c](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   





[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (cc6607c) into [master](https://codecov.io/gh/apache/beam/commit/4c81134dd9206fd6cb9060a9b973fc111b8fbd64?el=desc) (4c81134) will **increase** coverage by `0.26%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   82.77%   83.03%   +0.26%     
   ==========================================
     Files         466      470       +4     
     Lines       57562    58565    +1003     
   ==========================================
   + Hits        47647    48631     +984     
   - Misses       9915     9934      +19     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [sdks/python/apache\_beam/utils/retry.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdXRpbHMvcmV0cnkucHk=) | | |
   | [...ks/python/apache\_beam/runners/worker/statecache.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy93b3JrZXIvc3RhdGVjYWNoZS5weQ==) | | |
   | [...examples/snippets/transforms/elementwise/values.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvc25pcHBldHMvdHJhbnNmb3Jtcy9lbGVtZW50d2lzZS92YWx1ZXMucHk=) | | |
   | [...nternal/clients/dataflow/dataflow\_v1b3\_messages.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9kYXRhZmxvdy9pbnRlcm5hbC9jbGllbnRzL2RhdGFmbG93L2RhdGFmbG93X3YxYjNfbWVzc2FnZXMucHk=) | | |
   | [sdks/python/apache\_beam/transforms/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdHJhbnNmb3Jtcy9fX2luaXRfXy5weQ==) | | |
   | [sdks/python/apache\_beam/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vX19pbml0X18ucHk=) | | |
   | [...cp/internal/clients/bigquery/bigquery\_v2\_client.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZ2NwL2ludGVybmFsL2NsaWVudHMvYmlncXVlcnkvYmlncXVlcnlfdjJfY2xpZW50LnB5) | | |
   | [sdks/python/apache\_beam/dataframe/expressions.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZGF0YWZyYW1lL2V4cHJlc3Npb25zLnB5) | | |
   | [...dks/python/apache\_beam/ml/gcp/naturallanguageml.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vbWwvZ2NwL25hdHVyYWxsYW5ndWFnZW1sLnB5) | | |
   | [...xamples/snippets/transforms/elementwise/flatmap.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvc25pcHBldHMvdHJhbnNmb3Jtcy9lbGVtZW50d2lzZS9mbGF0bWFwLnB5) | | |
   | ... and [923 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [7c4a21a...cc6607c](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   





[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (568db95) into [master](https://codecov.io/gh/apache/beam/commit/4c81134dd9206fd6cb9060a9b973fc111b8fbd64?el=desc) (4c81134) will **increase** coverage by `0.25%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   82.77%   83.03%   +0.25%     
   ==========================================
     Files         466      470       +4     
     Lines       57562    58534     +972     
   ==========================================
   + Hits        47647    48603     +956     
   - Misses       9915     9931      +16     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [sdks/python/apache\_beam/coders/coder\_impl.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vY29kZXJzL2NvZGVyX2ltcGwucHk=) | | |
   | [...he\_beam/testing/benchmarks/nexmark/nexmark\_util.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9iZW5jaG1hcmtzL25leG1hcmsvbmV4bWFya191dGlsLnB5) | | |
   | [...ks/python/apache\_beam/runners/dataflow/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9kYXRhZmxvdy9fX2luaXRfXy5weQ==) | | |
   | [...on/apache\_beam/examples/complete/juliaset/setup.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvY29tcGxldGUvanVsaWFzZXQvc2V0dXAucHk=) | | |
   | [sdks/python/apache\_beam/dataframe/partitionings.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZGF0YWZyYW1lL3BhcnRpdGlvbmluZ3MucHk=) | | |
   | [.../python/apache\_beam/runners/worker/statesampler.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy93b3JrZXIvc3RhdGVzYW1wbGVyLnB5) | | |
   | [...thon/apache\_beam/io/aws/clients/s3/boto3\_client.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vYXdzL2NsaWVudHMvczMvYm90bzNfY2xpZW50LnB5) | | |
   | [...eam/transforms/py\_dataflow\_distribution\_counter.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdHJhbnNmb3Jtcy9weV9kYXRhZmxvd19kaXN0cmlidXRpb25fY291bnRlci5weQ==) | | |
   | [...ache\_beam/runners/interactive/recording\_manager.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9pbnRlcmFjdGl2ZS9yZWNvcmRpbmdfbWFuYWdlci5weQ==) | | |
   | [...beam/testing/benchmarks/nexmark/models/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9iZW5jaG1hcmtzL25leG1hcmsvbW9kZWxzL19faW5pdF9fLnB5) | | |
   | ... and [923 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [7c4a21a...568db95](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   





[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (568db95) into [master](https://codecov.io/gh/apache/beam/commit/4c81134dd9206fd6cb9060a9b973fc111b8fbd64?el=desc) (4c81134) will **increase** coverage by `0.25%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   82.77%   83.03%   +0.25%     
   ==========================================
     Files         466      470       +4     
     Lines       57562    58534     +972     
   ==========================================
   + Hits        47647    48603     +956     
   - Misses       9915     9931      +16     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [sdks/python/apache\_beam/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vX19pbml0X18ucHk=) | | |
   | [sdks/python/apache\_beam/utils/windowed\_value.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdXRpbHMvd2luZG93ZWRfdmFsdWUucHk=) | | |
   | [...ks/python/apache\_beam/examples/cookbook/filters.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvY29va2Jvb2svZmlsdGVycy5weQ==) | | |
   | [...eam/runners/interactive/options/capture\_control.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9pbnRlcmFjdGl2ZS9vcHRpb25zL2NhcHR1cmVfY29udHJvbC5weQ==) | | |
   | [...he\_beam/io/flink/flink\_streaming\_impulse\_source.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZmxpbmsvZmxpbmtfc3RyZWFtaW5nX2ltcHVsc2Vfc291cmNlLnB5) | | |
   | [sdks/python/apache\_beam/coders/row\_coder.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vY29kZXJzL3Jvd19jb2Rlci5weQ==) | | |
   | [...ks/python/apache\_beam/runners/worker/opcounters.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy93b3JrZXIvb3Bjb3VudGVycy5weQ==) | | |
   | [...apache\_beam/examples/complete/juliaset/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvY29tcGxldGUvanVsaWFzZXQvX19pbml0X18ucHk=) | | |
   | [...s/python/apache\_beam/examples/cookbook/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvY29va2Jvb2svX19pbml0X18ucHk=) | | |
   | [...ython/apache\_beam/io/gcp/datastore/v1new/helper.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZ2NwL2RhdGFzdG9yZS92MW5ldy9oZWxwZXIucHk=) | | |
   | ... and [923 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [7c4a21a...568db95](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   





[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (a57fb3d) into [master](https://codecov.io/gh/apache/beam/commit/4c81134dd9206fd6cb9060a9b973fc111b8fbd64?el=desc) (4c81134) will **increase** coverage by `0.25%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   82.77%   83.03%   +0.25%     
   ==========================================
     Files         466      470       +4     
     Lines       57562    58535     +973     
   ==========================================
   + Hits        47647    48604     +957     
   - Misses       9915     9931      +16     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [...on/apache\_beam/runners/direct/helper\_transforms.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9kaXJlY3QvaGVscGVyX3RyYW5zZm9ybXMucHk=) | | |
   | [...apache\_beam/runners/portability/portable\_runner.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9wb3J0YWJsZV9ydW5uZXIucHk=) | | |
   | [sdks/python/apache\_beam/utils/annotations.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdXRpbHMvYW5ub3RhdGlvbnMucHk=) | | |
   | [sdks/python/apache\_beam/coders/coders.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vY29kZXJzL2NvZGVycy5weQ==) | | |
   | [...cp/internal/clients/bigquery/bigquery\_v2\_client.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZ2NwL2ludGVybmFsL2NsaWVudHMvYmlncXVlcnkvYmlncXVlcnlfdjJfY2xpZW50LnB5) | | |
   | [...hon/apache\_beam/portability/api/schema\_pb2\_grpc.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcG9ydGFiaWxpdHkvYXBpL3NjaGVtYV9wYjJfZ3JwYy5weQ==) | | |
   | [...\_beam/testing/benchmarks/nexmark/queries/query3.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9iZW5jaG1hcmtzL25leG1hcmsvcXVlcmllcy9xdWVyeTMucHk=) | | |
   | [.../python/apache\_beam/typehints/trivial\_inference.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdHlwZWhpbnRzL3RyaXZpYWxfaW5mZXJlbmNlLnB5) | | |
   | [sdks/python/apache\_beam/io/filebasedsource.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZmlsZWJhc2Vkc291cmNlLnB5) | | |
   | [sdks/python/apache\_beam/io/tfrecordio.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vdGZyZWNvcmRpby5weQ==) | | |
   | ... and [923 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [7c4a21a...a57fb3d](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   





[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (2452188) into [master](https://codecov.io/gh/apache/beam/commit/4c81134dd9206fd6cb9060a9b973fc111b8fbd64?el=desc) (4c81134) will **increase** coverage by `0.28%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   82.77%   83.06%   +0.28%     
   ==========================================
     Files         466      470       +4     
     Lines       57562    58590    +1028     
   ==========================================
   + Hits        47647    48666    +1019     
   - Misses       9915     9924       +9     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [...dks/python/apache\_beam/examples/wordcount\_xlang.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvd29yZGNvdW50X3hsYW5nLnB5) | | |
   | [...m/examples/snippets/transforms/aggregation/mean.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvc25pcHBldHMvdHJhbnNmb3Jtcy9hZ2dyZWdhdGlvbi9tZWFuLnB5) | | |
   | [sdks/python/apache\_beam/io/hadoopfilesystem.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vaGFkb29wZmlsZXN5c3RlbS5weQ==) | | |
   | [...\_beam/testing/benchmarks/nexmark/queries/query2.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9iZW5jaG1hcmtzL25leG1hcmsvcXVlcmllcy9xdWVyeTIucHk=) | | |
   | [...ers/dataflow/internal/clients/dataflow/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9kYXRhZmxvdy9pbnRlcm5hbC9jbGllbnRzL2RhdGFmbG93L19faW5pdF9fLnB5) | | |
   | [...pache\_beam/portability/api/beam\_fn\_api\_pb2\_grpc.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcG9ydGFiaWxpdHkvYXBpL2JlYW1fZm5fYXBpX3BiMl9ncnBjLnB5) | | |
   | [...pache\_beam/runners/interactive/interactive\_beam.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9pbnRlcmFjdGl2ZS9pbnRlcmFjdGl2ZV9iZWFtLnB5) | | |
   | [...eam/runners/interactive/options/capture\_control.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9pbnRlcmFjdGl2ZS9vcHRpb25zL2NhcHR1cmVfY29udHJvbC5weQ==) | | |
   | [sdks/python/apache\_beam/tools/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdG9vbHMvX19pbml0X18ucHk=) | | |
   | [...eam/runners/interactive/interactive\_environment.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9pbnRlcmFjdGl2ZS9pbnRlcmFjdGl2ZV9lbnZpcm9ubWVudC5weQ==) | | |
   | ... and [923 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [7c4a21a...2452188](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   





[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (177f098) into [master](https://codecov.io/gh/apache/beam/commit/67dc5fe49bd0dbc33ccccf7cde5749dc17adec08?el=desc) (67dc5fe) will **increase** coverage by `0.06%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   83.39%   83.46%   +0.06%     
   ==========================================
     Files         469      470       +1     
     Lines       58721    59001     +280     
   ==========================================
   + Hits        48972    49246     +274     
   - Misses       9749     9755       +6     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [...s/sdks/python/apache\_beam/internal/gcp/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW50ZXJuYWwvZ2NwL19faW5pdF9fLnB5) | | |
   | [...s/python/apache\_beam/testing/pipeline\_verifiers.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9waXBlbGluZV92ZXJpZmllcnMucHk=) | | |
   | [...sdks/python/apache\_beam/io/aws/clients/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vYXdzL2NsaWVudHMvX19pbml0X18ucHk=) | | |
   | [...ache\_beam/runners/interactive/recording\_manager.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9pbnRlcmFjdGl2ZS9yZWNvcmRpbmdfbWFuYWdlci5weQ==) | | |
   | [...s/sdks/python/apache\_beam/io/gcp/tests/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZ2NwL3Rlc3RzL19faW5pdF9fLnB5) | | |
   | [...\_beam/testing/benchmarks/nexmark/queries/query9.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9iZW5jaG1hcmtzL25leG1hcmsvcXVlcmllcy9xdWVyeTkucHk=) | | |
   | [.../srcs/sdks/python/apache\_beam/dataframe/schemas.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZGF0YWZyYW1lL3NjaGVtYXMucHk=) | | |
   | [.../python/apache\_beam/io/gcp/tests/pubsub\_matcher.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZ2NwL3Rlc3RzL3B1YnN1Yl9tYXRjaGVyLnB5) | | |
   | [.../apache\_beam/testing/benchmarks/nexmark/monitor.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9iZW5jaG1hcmtzL25leG1hcmsvbW9uaXRvci5weQ==) | | |
   | [...rcs/sdks/python/apache\_beam/runners/job/manager.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9qb2IvbWFuYWdlci5weQ==) | | |
   | ... and [929 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [67dc5fe...177f098](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   





[GitHub] [beam] pabloem merged pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
pabloem merged pull request #13985:
URL: https://github.com/apache/beam/pull/13985


   





[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (177f098) into [master](https://codecov.io/gh/apache/beam/commit/67dc5fe49bd0dbc33ccccf7cde5749dc17adec08?el=desc) (67dc5fe) will **increase** coverage by `0.06%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   83.39%   83.46%   +0.06%     
   ==========================================
     Files         469      470       +1     
     Lines       58721    59001     +280     
   ==========================================
   + Hits        48972    49246     +274     
   - Misses       9749     9755       +6     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [...sdks/python/apache\_beam/internal/metrics/metric.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW50ZXJuYWwvbWV0cmljcy9tZXRyaWMucHk=) | | |
   | [...thon/apache\_beam/runners/worker/sdk\_worker\_main.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy93b3JrZXIvc2RrX3dvcmtlcl9tYWluLnB5) | | |
   | [...\_beam/testing/benchmarks/nexmark/queries/query7.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9iZW5jaG1hcmtzL25leG1hcmsvcXVlcmllcy9xdWVyeTcucHk=) | | |
   | [...s/python/apache\_beam/examples/wordcount\_minimal.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvd29yZGNvdW50X21pbmltYWwucHk=) | | |
   | [.../srcs/sdks/python/apache\_beam/examples/sql\_taxi.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvc3FsX3RheGkucHk=) | | |
   | [...dks/python/apache\_beam/io/gcp/gce\_metadata\_util.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZ2NwL2djZV9tZXRhZGF0YV91dGlsLnB5) | | |
   | [...rcs/sdks/python/apache\_beam/testing/test\_stream.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy90ZXN0X3N0cmVhbS5weQ==) | | |
   | [...ild/srcs/sdks/python/apache\_beam/io/gcp/dicomio.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZ2NwL2RpY29taW8ucHk=) | | |
   | [.../apache\_beam/options/pipeline\_options\_validator.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vb3B0aW9ucy9waXBlbGluZV9vcHRpb25zX3ZhbGlkYXRvci5weQ==) | | |
   | [...d/srcs/sdks/python/apache\_beam/runners/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9fX2luaXRfXy5weQ==) | | |
   | ... and [929 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [67dc5fe...177f098](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   





[GitHub] [beam] tvalentyn commented on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
tvalentyn commented on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-800469889


   There is a lot to catch up on for me in the internals of streaming semantics. I am afraid it will take some time before I can provide a meaningful review. If you need this soon, Yichi might be a better reviewer for this change.





[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (2452188) into [master](https://codecov.io/gh/apache/beam/commit/67dc5fe49bd0dbc33ccccf7cde5749dc17adec08?el=desc) (67dc5fe) will **decrease** coverage by `0.33%`.
   > The diff coverage is `n/a`.
   
   > :exclamation: Current head 2452188 differs from pull request most recent head 12920ec. Consider uploading reports for the commit 12920ec to get more accurate results
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   - Coverage   83.39%   83.06%   -0.34%     
   ==========================================
     Files         469      470       +1     
     Lines       58721    58590     -131     
   ==========================================
   - Hits        48972    48666     -306     
   - Misses       9749     9924     +175     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [...hon/apache\_beam/runners/direct/direct\_userstate.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9kaXJlY3QvZGlyZWN0X3VzZXJzdGF0ZS5weQ==) | | |
   | [...ython/apache\_beam/typehints/decorators\_test\_py3.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdHlwZWhpbnRzL2RlY29yYXRvcnNfdGVzdF9weTMucHk=) | | |
   | [...python/apache\_beam/io/gcp/datastore/v1new/types.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZ2NwL2RhdGFzdG9yZS92MW5ldy90eXBlcy5weQ==) | | |
   | [.../python/apache\_beam/examples/kafkataxi/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMva2Fma2F0YXhpL19faW5pdF9fLnB5) | | |
   | [...apache\_beam/portability/api/beam\_runner\_api\_pb2.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcG9ydGFiaWxpdHkvYXBpL2JlYW1fcnVubmVyX2FwaV9wYjIucHk=) | | |
   | [.../python/apache\_beam/portability/api/metrics\_pb2.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcG9ydGFiaWxpdHkvYXBpL21ldHJpY3NfcGIyLnB5) | | |
   | [.../apache\_beam/portability/api/endpoints\_pb2\_grpc.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcG9ydGFiaWxpdHkvYXBpL2VuZHBvaW50c19wYjJfZ3JwYy5weQ==) | | |
   | [...tes/tox/py38/build/srcs/sdks/python/test\_config.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vdGVzdF9jb25maWcucHk=) | | |
   | [...s/python/apache\_beam/typehints/sharded\_key\_type.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdHlwZWhpbnRzL3NoYXJkZWRfa2V5X3R5cGUucHk=) | | |
   | [.../python/apache\_beam/examples/windowed\_wordcount.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvd2luZG93ZWRfd29yZGNvdW50LnB5) | | |
   | ... and [929 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [67dc5fe...12920ec](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   





[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (a57fb3d) into [master](https://codecov.io/gh/apache/beam/commit/4c81134dd9206fd6cb9060a9b973fc111b8fbd64?el=desc) (4c81134) will **increase** coverage by `0.25%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   82.77%   83.03%   +0.25%     
   ==========================================
     Files         466      470       +4     
     Lines       57562    58535     +973     
   ==========================================
   + Hits        47647    48604     +957     
   - Misses       9915     9931      +16     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [...he\_beam/testing/benchmarks/nexmark/nexmark\_util.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9iZW5jaG1hcmtzL25leG1hcmsvbmV4bWFya191dGlsLnB5) | | |
   | [sdks/python/apache\_beam/transforms/cy\_combiners.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdHJhbnNmb3Jtcy9jeV9jb21iaW5lcnMucHk=) | | |
   | [sdks/python/apache\_beam/transforms/stats.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdHJhbnNmb3Jtcy9zdGF0cy5weQ==) | | |
   | [...on/apache\_beam/runners/worker/statesampler\_slow.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy93b3JrZXIvc3RhdGVzYW1wbGVyX3Nsb3cucHk=) | | |
   | [sdks/python/apache\_beam/coders/row\_coder.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vY29kZXJzL3Jvd19jb2Rlci5weQ==) | | |
   | [...beam/testing/benchmarks/nexmark/queries/query12.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9iZW5jaG1hcmtzL25leG1hcmsvcXVlcmllcy9xdWVyeTEyLnB5) | | |
   | [...che\_beam/runners/interactive/interactive\_runner.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9pbnRlcmFjdGl2ZS9pbnRlcmFjdGl2ZV9ydW5uZXIucHk=) | | |
   | [...cp/internal/clients/bigquery/bigquery\_v2\_client.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZ2NwL2ludGVybmFsL2NsaWVudHMvYmlncXVlcnkvYmlncXVlcnlfdjJfY2xpZW50LnB5) | | |
   | [...apache\_beam/runners/dataflow/native\_io/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9kYXRhZmxvdy9uYXRpdmVfaW8vX19pbml0X18ucHk=) | | |
   | [...che\_beam/examples/streaming\_wordcount\_debugging.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvc3RyZWFtaW5nX3dvcmRjb3VudF9kZWJ1Z2dpbmcucHk=) | | |
   | ... and [923 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [7c4a21a...a57fb3d](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] pabloem commented on a change in pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
pabloem commented on a change in pull request #13985:
URL: https://github.com/apache/beam/pull/13985#discussion_r599139233



##########
File path: sdks/python/apache_beam/runners/portability/fn_api_runner/trigger_manager.py
##########
@@ -0,0 +1,459 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import collections
+import logging
+import typing
+from collections import defaultdict
+
+from apache_beam import typehints
+from apache_beam.coders import PickleCoder
+from apache_beam.coders import StrUtf8Coder
+from apache_beam.coders import TupleCoder
+from apache_beam.coders import VarIntCoder
+from apache_beam.coders.coders import IntervalWindowCoder
+from apache_beam.transforms import DoFn
+from apache_beam.transforms.core import Windowing
+from apache_beam.transforms.timeutil import TimeDomain
+from apache_beam.transforms.trigger import AccumulationMode
+from apache_beam.transforms.trigger import TriggerContext
+from apache_beam.transforms.trigger import _CombiningValueStateTag
+from apache_beam.transforms.trigger import _StateTag
+from apache_beam.transforms.userstate import AccumulatingRuntimeState
+from apache_beam.transforms.userstate import BagRuntimeState
+from apache_beam.transforms.userstate import BagStateSpec
+from apache_beam.transforms.userstate import CombiningValueStateSpec
+from apache_beam.transforms.userstate import RuntimeTimer
+from apache_beam.transforms.userstate import SetRuntimeState
+from apache_beam.transforms.userstate import SetStateSpec
+from apache_beam.transforms.userstate import TimerSpec
+from apache_beam.transforms.userstate import on_timer
+from apache_beam.transforms.window import BoundedWindow
+from apache_beam.transforms.window import GlobalWindow
+from apache_beam.transforms.window import Sessions
+from apache_beam.transforms.window import TimestampedValue
+from apache_beam.transforms.window import WindowFn
+from apache_beam.typehints import TypeCheckError
+from apache_beam.utils import windowed_value
+from apache_beam.utils.timestamp import MIN_TIMESTAMP
+from apache_beam.utils.timestamp import Timestamp
+from apache_beam.utils.windowed_value import WindowedValue
+
+_LOGGER = logging.getLogger(__name__)
+_LOGGER.setLevel(logging.DEBUG)
+
+K = typing.TypeVar('K')
+V = typing.TypeVar('V')
+
+
+class ReifyWindows(DoFn):
+  """Receives KV pairs, and wraps the values into WindowedValues."""
+  def process(
+      self, element, window=DoFn.WindowParam, timestamp=DoFn.TimestampParam):
+    try:
+      k, v = element
+    except TypeError:
+      raise TypeCheckError(
+          'Input to GroupByKey must be a PCollection with '
+          'elements compatible with KV[A, B]')
+
+    yield (k, windowed_value.WindowedValue(v, timestamp, [window]))
+
+
+class _GroupBundlesByKey(DoFn):
+  def start_bundle(self):
+    self.keys = defaultdict(list)
+
+  def process(self, element):
+    key, windowed_value = element
+    self.keys[key].append(windowed_value)
+
+  def finish_bundle(self):
+    for k, vals in self.keys.items():
+      yield windowed_value.WindowedValue((k, vals),
+                                         MIN_TIMESTAMP, [GlobalWindow()])
+
+
+def read_watermark(watermark_state):
+  try:
+    return watermark_state.read()
+  except ValueError:
+    watermark_state.add(MIN_TIMESTAMP)
+    return watermark_state.read()
+
+
+class TriggerMergeContext(WindowFn.MergeContext):
+  def __init__(
+      self, all_windows, context: 'FnRunnerStatefulTriggerContext', windowing):

Review comment:
       We use the windowfn in a few places (e.g. when merging two windows).
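
For illustration, here is a minimal, self-contained sketch (not code from this PR) of the callback pattern referred to above: a merging WindowFn such as Sessions drives a WindowFn.MergeContext, and its merge() hook is where TriggerMergeContext consolidates per-window state and calls the trigger's on_merge. The PrintingMergeContext class below is purely hypothetical.

```python
# Illustrative only: shows how Sessions invokes MergeContext.merge() for
# overlapping windows, which is the hook TriggerMergeContext overrides above.
from apache_beam.transforms.window import IntervalWindow, Sessions, WindowFn


class PrintingMergeContext(WindowFn.MergeContext):
  def merge(self, to_be_merged, merge_result):
    # In the real trigger manager, this is where per-window state is merged
    # and the trigger's on_merge is invoked.
    print('merging', list(to_be_merged), '->', merge_result)


windows = [IntervalWindow(0, 10), IntervalWindow(5, 15), IntervalWindow(40, 50)]
# The first two windows overlap and are merged into one; the third stays alone.
Sessions(gap_size=10).merge(PrintingMergeContext(windows))
```

Running this prints a single merge of the two overlapping windows, while the non-overlapping window is left untouched.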

##########
File path: sdks/python/apache_beam/runners/portability/fn_api_runner/trigger_manager.py
##########
@@ -0,0 +1,459 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import collections
+import logging
+import typing
+from collections import defaultdict
+
+from apache_beam import typehints
+from apache_beam.coders import PickleCoder
+from apache_beam.coders import StrUtf8Coder
+from apache_beam.coders import TupleCoder
+from apache_beam.coders import VarIntCoder
+from apache_beam.coders.coders import IntervalWindowCoder
+from apache_beam.transforms import DoFn
+from apache_beam.transforms.core import Windowing
+from apache_beam.transforms.timeutil import TimeDomain
+from apache_beam.transforms.trigger import AccumulationMode
+from apache_beam.transforms.trigger import TriggerContext
+from apache_beam.transforms.trigger import _CombiningValueStateTag
+from apache_beam.transforms.trigger import _StateTag
+from apache_beam.transforms.userstate import AccumulatingRuntimeState
+from apache_beam.transforms.userstate import BagRuntimeState
+from apache_beam.transforms.userstate import BagStateSpec
+from apache_beam.transforms.userstate import CombiningValueStateSpec
+from apache_beam.transforms.userstate import RuntimeTimer
+from apache_beam.transforms.userstate import SetRuntimeState
+from apache_beam.transforms.userstate import SetStateSpec
+from apache_beam.transforms.userstate import TimerSpec
+from apache_beam.transforms.userstate import on_timer
+from apache_beam.transforms.window import BoundedWindow
+from apache_beam.transforms.window import GlobalWindow
+from apache_beam.transforms.window import Sessions
+from apache_beam.transforms.window import TimestampedValue
+from apache_beam.transforms.window import WindowFn
+from apache_beam.typehints import TypeCheckError
+from apache_beam.utils import windowed_value
+from apache_beam.utils.timestamp import MIN_TIMESTAMP
+from apache_beam.utils.timestamp import Timestamp
+from apache_beam.utils.windowed_value import WindowedValue
+
+_LOGGER = logging.getLogger(__name__)
+_LOGGER.setLevel(logging.DEBUG)
+
+K = typing.TypeVar('K')
+V = typing.TypeVar('V')
+
+
+class ReifyWindows(DoFn):
+  """Receives KV pairs, and wraps the values into WindowedValues."""
+  def process(
+      self, element, window=DoFn.WindowParam, timestamp=DoFn.TimestampParam):
+    try:
+      k, v = element
+    except TypeError:
+      raise TypeCheckError(
+          'Input to GroupByKey must be a PCollection with '
+          'elements compatible with KV[A, B]')
+
+    yield (k, windowed_value.WindowedValue(v, timestamp, [window]))
+
+
+class _GroupBundlesByKey(DoFn):
+  def start_bundle(self):
+    self.keys = defaultdict(list)
+
+  def process(self, element):
+    key, windowed_value = element
+    self.keys[key].append(windowed_value)
+
+  def finish_bundle(self):
+    for k, vals in self.keys.items():
+      yield windowed_value.WindowedValue((k, vals),
+                                         MIN_TIMESTAMP, [GlobalWindow()])
+
+
+def read_watermark(watermark_state):
+  try:
+    return watermark_state.read()
+  except ValueError:
+    watermark_state.add(MIN_TIMESTAMP)
+    return watermark_state.read()
+
+
+class TriggerMergeContext(WindowFn.MergeContext):
+  def __init__(
+      self, all_windows, context: 'FnRunnerStatefulTriggerContext', windowing):
+    super(TriggerMergeContext, self).__init__(all_windows)
+    self.trigger_context = context
+    self.windowing = windowing
+    self.merged_away: typing.Dict[BoundedWindow, BoundedWindow] = {}
+
+  def merge(self, to_be_merged, merge_result):
+    _LOGGER.debug("Merging %s into %s", to_be_merged, merge_result)
+    self.trigger_context.merge_state(to_be_merged, merge_result)
+    for window in to_be_merged:
+      if window != merge_result:
+        self.merged_away[window] = merge_result
+        # Clear state associated with PaneInfo since it is
+        # not preserved across merges.
+        self.trigger_context.for_window(window).clear_state(None)
+    self.windowing.triggerfn.on_merge(
+        to_be_merged,
+        merge_result,
+        self.trigger_context.for_window(merge_result))
+
+
+@typehints.with_input_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+@typehints.with_output_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+class GeneralTriggerManagerDoFn(DoFn):
+  """A trigger manager that supports all windowing / triggering cases.
+
+  This implements a DoFn that manages triggering in a per-key basis. All
+  elements for a single key are processed together. Per-key state holds data
+  related to all windows.
+  """
+
+  KNOWN_WINDOWS = SetStateSpec('known_windows', IntervalWindowCoder())
+  FINISHED_WINDOWS = SetStateSpec('finished_windows', IntervalWindowCoder())
+  LAST_KNOWN_TIME = CombiningValueStateSpec('last_known_time', combine_fn=max)
+  LAST_KNOWN_WATERMARK = CombiningValueStateSpec(
+      'last_known_watermark', combine_fn=max)
+
+  # TODO(pabloem) What's the coder for the elements/keys here?
+  WINDOW_ELEMENT_PAIRS = BagStateSpec(
+      'element_bag', TupleCoder([IntervalWindowCoder(), PickleCoder()]))
+  WINDOW_TAG_VALUES = BagStateSpec(
+      'per_window_per_tag_value_state',
+      TupleCoder([IntervalWindowCoder(), StrUtf8Coder(), VarIntCoder()]))
+
+  PROCESSING_TIME_TIMER = TimerSpec(
+      'processing_time_timer', TimeDomain.REAL_TIME)
+  WATERMARK_TIMER = TimerSpec('watermark_timer', TimeDomain.WATERMARK)
+
+  def __init__(self, windowing: Windowing):
+    self.windowing = windowing
+    # Only session windows are merging. Other windows are non-merging.
+    self.merging_windows = isinstance(windowing.windowfn, Sessions)

Review comment:
       done.

##########
File path: sdks/python/apache_beam/runners/portability/fn_api_runner/trigger_manager.py
##########
@@ -0,0 +1,459 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import collections
+import logging
+import typing
+from collections import defaultdict
+
+from apache_beam import typehints
+from apache_beam.coders import PickleCoder
+from apache_beam.coders import StrUtf8Coder
+from apache_beam.coders import TupleCoder
+from apache_beam.coders import VarIntCoder
+from apache_beam.coders.coders import IntervalWindowCoder
+from apache_beam.transforms import DoFn
+from apache_beam.transforms.core import Windowing
+from apache_beam.transforms.timeutil import TimeDomain
+from apache_beam.transforms.trigger import AccumulationMode
+from apache_beam.transforms.trigger import TriggerContext
+from apache_beam.transforms.trigger import _CombiningValueStateTag
+from apache_beam.transforms.trigger import _StateTag
+from apache_beam.transforms.userstate import AccumulatingRuntimeState
+from apache_beam.transforms.userstate import BagRuntimeState
+from apache_beam.transforms.userstate import BagStateSpec
+from apache_beam.transforms.userstate import CombiningValueStateSpec
+from apache_beam.transforms.userstate import RuntimeTimer
+from apache_beam.transforms.userstate import SetRuntimeState
+from apache_beam.transforms.userstate import SetStateSpec
+from apache_beam.transforms.userstate import TimerSpec
+from apache_beam.transforms.userstate import on_timer
+from apache_beam.transforms.window import BoundedWindow
+from apache_beam.transforms.window import GlobalWindow
+from apache_beam.transforms.window import Sessions
+from apache_beam.transforms.window import TimestampedValue
+from apache_beam.transforms.window import WindowFn
+from apache_beam.typehints import TypeCheckError
+from apache_beam.utils import windowed_value
+from apache_beam.utils.timestamp import MIN_TIMESTAMP
+from apache_beam.utils.timestamp import Timestamp
+from apache_beam.utils.windowed_value import WindowedValue
+
+_LOGGER = logging.getLogger(__name__)
+_LOGGER.setLevel(logging.DEBUG)
+
+K = typing.TypeVar('K')
+V = typing.TypeVar('V')
+
+
+class ReifyWindows(DoFn):
+  """Receives KV pairs, and wraps the values into WindowedValues."""
+  def process(
+      self, element, window=DoFn.WindowParam, timestamp=DoFn.TimestampParam):
+    try:
+      k, v = element
+    except TypeError:
+      raise TypeCheckError(
+          'Input to GroupByKey must be a PCollection with '
+          'elements compatible with KV[A, B]')
+
+    yield (k, windowed_value.WindowedValue(v, timestamp, [window]))
+
+
+class _GroupBundlesByKey(DoFn):
+  def start_bundle(self):
+    self.keys = defaultdict(list)
+
+  def process(self, element):
+    key, windowed_value = element
+    self.keys[key].append(windowed_value)
+
+  def finish_bundle(self):
+    for k, vals in self.keys.items():
+      yield windowed_value.WindowedValue((k, vals),
+                                         MIN_TIMESTAMP, [GlobalWindow()])
+
+
+def read_watermark(watermark_state):
+  try:
+    return watermark_state.read()
+  except ValueError:
+    watermark_state.add(MIN_TIMESTAMP)
+    return watermark_state.read()
+
+
+class TriggerMergeContext(WindowFn.MergeContext):
+  def __init__(
+      self, all_windows, context: 'FnRunnerStatefulTriggerContext', windowing):
+    super(TriggerMergeContext, self).__init__(all_windows)
+    self.trigger_context = context
+    self.windowing = windowing
+    self.merged_away: typing.Dict[BoundedWindow, BoundedWindow] = {}
+
+  def merge(self, to_be_merged, merge_result):
+    _LOGGER.debug("Merging %s into %s", to_be_merged, merge_result)
+    self.trigger_context.merge_state(to_be_merged, merge_result)
+    for window in to_be_merged:
+      if window != merge_result:
+        self.merged_away[window] = merge_result
+        # Clear state associated with PaneInfo since it is
+        # not preserved across merges.
+        self.trigger_context.for_window(window).clear_state(None)
+    self.windowing.triggerfn.on_merge(
+        to_be_merged,
+        merge_result,
+        self.trigger_context.for_window(merge_result))
+
+
+@typehints.with_input_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+@typehints.with_output_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+class GeneralTriggerManagerDoFn(DoFn):
+  """A trigger manager that supports all windowing / triggering cases.
+
+  This implements a DoFn that manages triggering in a per-key basis. All
+  elements for a single key are processed together. Per-key state holds data
+  related to all windows.
+  """
+
+  KNOWN_WINDOWS = SetStateSpec('known_windows', IntervalWindowCoder())
+  FINISHED_WINDOWS = SetStateSpec('finished_windows', IntervalWindowCoder())
+  LAST_KNOWN_TIME = CombiningValueStateSpec('last_known_time', combine_fn=max)
+  LAST_KNOWN_WATERMARK = CombiningValueStateSpec(
+      'last_known_watermark', combine_fn=max)
+
+  # TODO(pabloem) What's the coder for the elements/keys here?
+  WINDOW_ELEMENT_PAIRS = BagStateSpec(
+      'element_bag', TupleCoder([IntervalWindowCoder(), PickleCoder()]))
+  WINDOW_TAG_VALUES = BagStateSpec(
+      'per_window_per_tag_value_state',
+      TupleCoder([IntervalWindowCoder(), StrUtf8Coder(), VarIntCoder()]))
+
+  PROCESSING_TIME_TIMER = TimerSpec(
+      'processing_time_timer', TimeDomain.REAL_TIME)
+  WATERMARK_TIMER = TimerSpec('watermark_timer', TimeDomain.WATERMARK)
+
+  def __init__(self, windowing: Windowing):
+    self.windowing = windowing
+    # Only session windows are merging. Other windows are non-merging.
+    self.merging_windows = isinstance(windowing.windowfn, Sessions)
+
+  def process(
+      self,
+      element: typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]],
+      element_bag: BagRuntimeState = DoFn.StateParam(WINDOW_ELEMENT_PAIRS),  # type: ignore
+      time_state: AccumulatingRuntimeState = DoFn.StateParam(LAST_KNOWN_TIME),  # type: ignore
+      watermark_state: AccumulatingRuntimeState = DoFn.StateParam(  # type: ignore
+          LAST_KNOWN_WATERMARK),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      windows_state: SetRuntimeState = DoFn.StateParam(KNOWN_WINDOWS),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER),
+      *args, **kwargs):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        current_time_state=time_state,
+        watermark_state=watermark_state,
+        elements_bag_state=element_bag,
+        window_tag_values_bag_state=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    key, windowed_values = element
+    watermark = read_watermark(watermark_state)
+
+    windows_to_elements = collections.defaultdict(list)
+    for wv in windowed_values:
+      for window in wv.windows:
+        # ignore expired windows
+        if watermark > window.end + self.windowing.allowed_lateness:
+          continue
+        if window in finished_windows_state.read():
+          continue
+        windows_to_elements[window].append(
+            TimestampedValue(wv.value, wv.timestamp))
+
+    # Processing merging of windows
+    if self.merging_windows:
+      old_windows = set(windows_state.read())
+      all_windows = old_windows.union(list(windows_to_elements))
+      if all_windows != old_windows:
+        merge_context = TriggerMergeContext(
+            all_windows, context, self.windowing)
+        self.windowing.windowfn.merge(merge_context)
+
+        merged_windows_to_elements = collections.defaultdict(list)
+        for window, values in windows_to_elements.items():
+          while window in merge_context.merged_away:
+            window = merge_context.merged_away[window]
+          merged_windows_to_elements[window].extend(values)
+        windows_to_elements = merged_windows_to_elements
+
+      if old_windows != all_windows:
+        for w in windows_to_elements:
+          windows_state.add(w)
+    # Done processing merging of windows
+
+    seen_windows = set()
+    for w in windows_to_elements:
+      window_context = context.for_window(w)
+      seen_windows.add(w)
+      for value_w_timestamp in windows_to_elements[w]:
+        _LOGGER.debug(value_w_timestamp)
+        element_bag.add((w, value_w_timestamp))
+        self.windowing.triggerfn.on_element(windowed_values, w, window_context)
+
+    return self._trigger_fire(
+        key, TimeDomain.WATERMARK, watermark, None, context, seen_windows)
+
+  def _trigger_fire(
+      self,
+      key: K,
+      time_domain,
+      timestamp: Timestamp,
+      timer_tag: typing.Optional[str],
+      context: 'FnRunnerStatefulTriggerContext',
+      windows_of_interest: typing.Optional[typing.Set[BoundedWindow]] = None):
+    windows_to_elements = context.windows_to_elements_map()
+    context.elements_bag_state.clear()
+
+    fired_windows = set()
+    finished_windows = set()
+    _LOGGER.debug(
+        '%s - tag %s - timestamp %s', time_domain, timer_tag, timestamp)
+    for w, elems in windows_to_elements.items():
+      if windows_of_interest is not None and w not in windows_of_interest:
+        # windows_of_interest=None means that we care about all windows.
+        # If we care only about some windows, and this window is not one of
+        # them, then we do not intend to fire this window.
+        continue
+      window_context = context.for_window(w)
+      if self.windowing.triggerfn.should_fire(time_domain,
+                                              timestamp,
+                                              w,
+                                              window_context):
+        finished = self.windowing.triggerfn.on_fire(
+            timestamp, w, window_context)
+        _LOGGER.debug('Firing on window %s. Finished: %s', w, finished)
+        fired_windows.add(w)
+        if finished:
+          context.finished_windows_state.add(w)
+          finished_windows.add(w)

Review comment:
       done.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (177f098) into [master](https://codecov.io/gh/apache/beam/commit/67dc5fe49bd0dbc33ccccf7cde5749dc17adec08?el=desc) (67dc5fe) will **increase** coverage by `0.06%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   83.39%   83.46%   +0.06%     
   ==========================================
     Files         469      470       +1     
     Lines       58721    59001     +280     
   ==========================================
   + Hits        48972    49246     +274     
   - Misses       9749     9755       +6     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [.../python/apache\_beam/io/gcp/resource\_identifiers.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZ2NwL3Jlc291cmNlX2lkZW50aWZpZXJzLnB5) | | |
   | [...on/apache\_beam/runners/direct/sdf\_direct\_runner.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9kaXJlY3Qvc2RmX2RpcmVjdF9ydW5uZXIucHk=) | | |
   | [...cs/sdks/python/apache\_beam/dataframe/transforms.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZGF0YWZyYW1lL3RyYW5zZm9ybXMucHk=) | | |
   | [...s/sdks/python/apache\_beam/dataframe/expressions.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZGF0YWZyYW1lL2V4cHJlc3Npb25zLnB5) | | |
   | [...ild/srcs/sdks/python/apache\_beam/utils/sentinel.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdXRpbHMvc2VudGluZWwucHk=) | | |
   | [...thon/apache\_beam/runners/worker/operation\_specs.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy93b3JrZXIvb3BlcmF0aW9uX3NwZWNzLnB5) | | |
   | [...python/apache\_beam/typehints/typehints\_test\_py3.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdHlwZWhpbnRzL3R5cGVoaW50c190ZXN0X3B5My5weQ==) | | |
   | [...ild/srcs/sdks/python/apache\_beam/runners/common.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9jb21tb24ucHk=) | | |
   | [...py38/build/srcs/sdks/python/apache\_beam/version.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdmVyc2lvbi5weQ==) | | |
   | [...python/apache\_beam/examples/wordcount\_debugging.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvd29yZGNvdW50X2RlYnVnZ2luZy5weQ==) | | |
   | ... and [929 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [67dc5fe...177f098](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (4ec1a8f) into [master](https://codecov.io/gh/apache/beam/commit/4c81134dd9206fd6cb9060a9b973fc111b8fbd64?el=desc) (4c81134) will **increase** coverage by `0.19%`.
   > The diff coverage is `92.37%`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   82.77%   82.96%   +0.19%     
   ==========================================
     Files         466      470       +4     
     Lines       57562    58455     +893     
   ==========================================
   + Hits        47647    48499     +852     
   - Misses       9915     9956      +41     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [sdks/python/apache\_beam/transforms/trigger.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdHJhbnNmb3Jtcy90cmlnZ2VyLnB5) | `88.34% <75.00%> (-0.02%)` | :arrow_down: |
   | [...nners/portability/fn\_api\_runner/trigger\_manager.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9mbl9hcGlfcnVubmVyL3RyaWdnZXJfbWFuYWdlci5weQ==) | `92.92% <92.92%> (ø)` | |
   | [...dks/python/apache\_beam/options/pipeline\_options.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vb3B0aW9ucy9waXBlbGluZV9vcHRpb25zLnB5) | `94.60% <100.00%> (ø)` | |
   | [...\_beam/runners/portability/sdk\_container\_builder.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9zZGtfY29udGFpbmVyX2J1aWxkZXIucHk=) | `40.00% <0.00%> (-3.28%)` | :arrow_down: |
   | [sdks/python/apache\_beam/internal/metrics/metric.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW50ZXJuYWwvbWV0cmljcy9tZXRyaWMucHk=) | `86.45% <0.00%> (-1.05%)` | :arrow_down: |
   | [sdks/python/apache\_beam/dataframe/frames.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZGF0YWZyYW1lL2ZyYW1lcy5weQ==) | `91.07% <0.00%> (-0.90%)` | :arrow_down: |
   | [...ache\_beam/runners/portability/local\_job\_service.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9sb2NhbF9qb2Jfc2VydmljZS5weQ==) | `80.53% <0.00%> (-0.64%)` | :arrow_down: |
   | [...ks/python/apache\_beam/runners/worker/sdk\_worker.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy93b3JrZXIvc2RrX3dvcmtlci5weQ==) | `89.54% <0.00%> (-0.32%)` | :arrow_down: |
   | [sdks/python/apache\_beam/dataframe/expressions.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZGF0YWZyYW1lL2V4cHJlc3Npb25zLnB5) | `91.02% <0.00%> (-0.14%)` | :arrow_down: |
   | [sdks/python/apache\_beam/dataframe/schemas.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZGF0YWZyYW1lL3NjaGVtYXMucHk=) | `96.87% <0.00%> (-0.12%)` | :arrow_down: |
   | ... and [33 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [7c4a21a...4ec1a8f](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] y1chi commented on a change in pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
y1chi commented on a change in pull request #13985:
URL: https://github.com/apache/beam/pull/13985#discussion_r595606277



##########
File path: sdks/python/apache_beam/runners/portability/fn_api_runner/trigger_manager.py
##########
@@ -0,0 +1,459 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import collections
+import logging
+import typing
+from collections import defaultdict
+
+from apache_beam import typehints
+from apache_beam.coders import PickleCoder
+from apache_beam.coders import StrUtf8Coder
+from apache_beam.coders import TupleCoder
+from apache_beam.coders import VarIntCoder
+from apache_beam.coders.coders import IntervalWindowCoder
+from apache_beam.transforms import DoFn
+from apache_beam.transforms.core import Windowing
+from apache_beam.transforms.timeutil import TimeDomain
+from apache_beam.transforms.trigger import AccumulationMode
+from apache_beam.transforms.trigger import TriggerContext
+from apache_beam.transforms.trigger import _CombiningValueStateTag
+from apache_beam.transforms.trigger import _StateTag
+from apache_beam.transforms.userstate import AccumulatingRuntimeState
+from apache_beam.transforms.userstate import BagRuntimeState
+from apache_beam.transforms.userstate import BagStateSpec
+from apache_beam.transforms.userstate import CombiningValueStateSpec
+from apache_beam.transforms.userstate import RuntimeTimer
+from apache_beam.transforms.userstate import SetRuntimeState
+from apache_beam.transforms.userstate import SetStateSpec
+from apache_beam.transforms.userstate import TimerSpec
+from apache_beam.transforms.userstate import on_timer
+from apache_beam.transforms.window import BoundedWindow
+from apache_beam.transforms.window import GlobalWindow
+from apache_beam.transforms.window import Sessions
+from apache_beam.transforms.window import TimestampedValue
+from apache_beam.transforms.window import WindowFn
+from apache_beam.typehints import TypeCheckError
+from apache_beam.utils import windowed_value
+from apache_beam.utils.timestamp import MIN_TIMESTAMP
+from apache_beam.utils.timestamp import Timestamp
+from apache_beam.utils.windowed_value import WindowedValue
+
+_LOGGER = logging.getLogger(__name__)
+_LOGGER.setLevel(logging.DEBUG)
+
+K = typing.TypeVar('K')
+V = typing.TypeVar('V')

Review comment:
       Is this used?

##########
File path: sdks/python/apache_beam/runners/portability/fn_api_runner/trigger_manager.py
##########
@@ -0,0 +1,459 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import collections
+import logging
+import typing
+from collections import defaultdict
+
+from apache_beam import typehints
+from apache_beam.coders import PickleCoder
+from apache_beam.coders import StrUtf8Coder
+from apache_beam.coders import TupleCoder
+from apache_beam.coders import VarIntCoder
+from apache_beam.coders.coders import IntervalWindowCoder
+from apache_beam.transforms import DoFn
+from apache_beam.transforms.core import Windowing
+from apache_beam.transforms.timeutil import TimeDomain
+from apache_beam.transforms.trigger import AccumulationMode
+from apache_beam.transforms.trigger import TriggerContext
+from apache_beam.transforms.trigger import _CombiningValueStateTag
+from apache_beam.transforms.trigger import _StateTag
+from apache_beam.transforms.userstate import AccumulatingRuntimeState
+from apache_beam.transforms.userstate import BagRuntimeState
+from apache_beam.transforms.userstate import BagStateSpec
+from apache_beam.transforms.userstate import CombiningValueStateSpec
+from apache_beam.transforms.userstate import RuntimeTimer
+from apache_beam.transforms.userstate import SetRuntimeState
+from apache_beam.transforms.userstate import SetStateSpec
+from apache_beam.transforms.userstate import TimerSpec
+from apache_beam.transforms.userstate import on_timer
+from apache_beam.transforms.window import BoundedWindow
+from apache_beam.transforms.window import GlobalWindow
+from apache_beam.transforms.window import Sessions
+from apache_beam.transforms.window import TimestampedValue
+from apache_beam.transforms.window import WindowFn
+from apache_beam.typehints import TypeCheckError
+from apache_beam.utils import windowed_value
+from apache_beam.utils.timestamp import MIN_TIMESTAMP
+from apache_beam.utils.timestamp import Timestamp
+from apache_beam.utils.windowed_value import WindowedValue
+
+_LOGGER = logging.getLogger(__name__)
+_LOGGER.setLevel(logging.DEBUG)
+
+K = typing.TypeVar('K')
+V = typing.TypeVar('V')
+
+
+class ReifyWindows(DoFn):
+  """Receives KV pairs, and wraps the values into WindowedValues."""
+  def process(
+      self, element, window=DoFn.WindowParam, timestamp=DoFn.TimestampParam):
+    try:
+      k, v = element
+    except TypeError:
+      raise TypeCheckError(
+          'Input to GroupByKey must be a PCollection with '
+          'elements compatible with KV[A, B]')
+
+    yield (k, windowed_value.WindowedValue(v, timestamp, [window]))
+
+
+class _GroupBundlesByKey(DoFn):
+  def start_bundle(self):
+    self.keys = defaultdict(list)
+
+  def process(self, element):
+    key, windowed_value = element
+    self.keys[key].append(windowed_value)
+
+  def finish_bundle(self):
+    for k, vals in self.keys.items():
+      yield windowed_value.WindowedValue((k, vals),
+                                         MIN_TIMESTAMP, [GlobalWindow()])
+
+
+def read_watermark(watermark_state):
+  try:
+    return watermark_state.read()
+  except ValueError:
+    watermark_state.add(MIN_TIMESTAMP)
+    return watermark_state.read()
+
+
+class TriggerMergeContext(WindowFn.MergeContext):
+  def __init__(
+      self, all_windows, context: 'FnRunnerStatefulTriggerContext', windowing):
+    super(TriggerMergeContext, self).__init__(all_windows)
+    self.trigger_context = context
+    self.windowing = windowing
+    self.merged_away: typing.Dict[BoundedWindow, BoundedWindow] = {}
+
+  def merge(self, to_be_merged, merge_result):
+    _LOGGER.debug("Merging %s into %s", to_be_merged, merge_result)
+    self.trigger_context.merge_state(to_be_merged, merge_result)
+    for window in to_be_merged:
+      if window != merge_result:
+        self.merged_away[window] = merge_result
+        # Clear state associated with PaneInfo since it is
+        # not preserved across merges.
+        self.trigger_context.for_window(window).clear_state(None)
+    self.windowing.triggerfn.on_merge(
+        to_be_merged,
+        merge_result,
+        self.trigger_context.for_window(merge_result))
+
+
+@typehints.with_input_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+@typehints.with_output_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+class GeneralTriggerManagerDoFn(DoFn):
+  """A trigger manager that supports all windowing / triggering cases.
+
+  This implements a DoFn that manages triggering in a per-key basis. All
+  elements for a single key are processed together. Per-key state holds data
+  related to all windows.
+  """
+
+  KNOWN_WINDOWS = SetStateSpec('known_windows', IntervalWindowCoder())

Review comment:
       I guess this assumes we are not dealing with GlobalWindow or other custom window types. Maybe add a TODO making clear that other window types are not supported yet? We should also consider what it would take to accept an arbitrary window coder; right now this seems pretty inflexible, since the state specs have to be declared in advance.
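
For illustration, a minimal sketch (an editor's assumption, not code from this PR) of one direction this suggestion could take: look the window coder up from the pipeline's WindowFn via get_window_coder() instead of hardcoding IntervalWindowCoder. The helper name build_state_specs is hypothetical; wiring per-instance specs into DoFn.StateParam, which today expects class-level specs declared up front, is exactly the inflexibility noted above.

```python
# Illustrative sketch: build state specs around whatever window coder the
# pipeline's WindowFn advertises, rather than assuming IntervalWindowCoder.
from apache_beam.coders import PickleCoder, TupleCoder
from apache_beam.transforms.core import Windowing
from apache_beam.transforms.userstate import BagStateSpec, SetStateSpec
from apache_beam.transforms.window import GlobalWindows


def build_state_specs(windowing: Windowing):
  # Every WindowFn knows the coder for its window type, e.g. GlobalWindowCoder
  # for GlobalWindows and IntervalWindowCoder for Sessions/FixedWindows.
  window_coder = windowing.windowfn.get_window_coder()
  return {
      'known_windows': SetStateSpec('known_windows', window_coder),
      'element_bag': BagStateSpec(
          'element_bag', TupleCoder([window_coder, PickleCoder()])),
  }


# Example: specs for a globally-windowed pipeline.
specs = build_state_specs(Windowing(GlobalWindows()))
```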

##########
File path: sdks/python/apache_beam/runners/portability/fn_api_runner/trigger_manager.py
##########
@@ -0,0 +1,459 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import collections
+import logging
+import typing
+from collections import defaultdict
+
+from apache_beam import typehints
+from apache_beam.coders import PickleCoder
+from apache_beam.coders import StrUtf8Coder
+from apache_beam.coders import TupleCoder
+from apache_beam.coders import VarIntCoder
+from apache_beam.coders.coders import IntervalWindowCoder
+from apache_beam.transforms import DoFn
+from apache_beam.transforms.core import Windowing
+from apache_beam.transforms.timeutil import TimeDomain
+from apache_beam.transforms.trigger import AccumulationMode
+from apache_beam.transforms.trigger import TriggerContext
+from apache_beam.transforms.trigger import _CombiningValueStateTag
+from apache_beam.transforms.trigger import _StateTag
+from apache_beam.transforms.userstate import AccumulatingRuntimeState
+from apache_beam.transforms.userstate import BagRuntimeState
+from apache_beam.transforms.userstate import BagStateSpec
+from apache_beam.transforms.userstate import CombiningValueStateSpec
+from apache_beam.transforms.userstate import RuntimeTimer
+from apache_beam.transforms.userstate import SetRuntimeState
+from apache_beam.transforms.userstate import SetStateSpec
+from apache_beam.transforms.userstate import TimerSpec
+from apache_beam.transforms.userstate import on_timer
+from apache_beam.transforms.window import BoundedWindow
+from apache_beam.transforms.window import GlobalWindow
+from apache_beam.transforms.window import Sessions
+from apache_beam.transforms.window import TimestampedValue
+from apache_beam.transforms.window import WindowFn
+from apache_beam.typehints import TypeCheckError
+from apache_beam.utils import windowed_value
+from apache_beam.utils.timestamp import MIN_TIMESTAMP
+from apache_beam.utils.timestamp import Timestamp
+from apache_beam.utils.windowed_value import WindowedValue
+
+_LOGGER = logging.getLogger(__name__)
+_LOGGER.setLevel(logging.DEBUG)
+
+K = typing.TypeVar('K')
+V = typing.TypeVar('V')
+
+
+class ReifyWindows(DoFn):
+  """Receives KV pairs, and wraps the values into WindowedValues."""
+  def process(
+      self, element, window=DoFn.WindowParam, timestamp=DoFn.TimestampParam):
+    try:
+      k, v = element
+    except TypeError:
+      raise TypeCheckError(
+          'Input to GroupByKey must be a PCollection with '
+          'elements compatible with KV[A, B]')
+
+    yield (k, windowed_value.WindowedValue(v, timestamp, [window]))
+
+
+class _GroupBundlesByKey(DoFn):
+  def start_bundle(self):
+    self.keys = defaultdict(list)
+
+  def process(self, element):
+    key, windowed_value = element
+    self.keys[key].append(windowed_value)
+
+  def finish_bundle(self):
+    for k, vals in self.keys.items():
+      yield windowed_value.WindowedValue((k, vals),
+                                         MIN_TIMESTAMP, [GlobalWindow()])
+
+
+def read_watermark(watermark_state):
+  try:
+    return watermark_state.read()
+  except ValueError:
+    watermark_state.add(MIN_TIMESTAMP)
+    return watermark_state.read()
+
+
+class TriggerMergeContext(WindowFn.MergeContext):
+  def __init__(
+      self, all_windows, context: 'FnRunnerStatefulTriggerContext', windowing):
+    super(TriggerMergeContext, self).__init__(all_windows)
+    self.trigger_context = context
+    self.windowing = windowing
+    self.merged_away: typing.Dict[BoundedWindow, BoundedWindow] = {}
+
+  def merge(self, to_be_merged, merge_result):
+    _LOGGER.debug("Merging %s into %s", to_be_merged, merge_result)
+    self.trigger_context.merge_state(to_be_merged, merge_result)
+    for window in to_be_merged:
+      if window != merge_result:
+        self.merged_away[window] = merge_result
+        # Clear state associated with PaneInfo since it is
+        # not preserved across merges.
+        self.trigger_context.for_window(window).clear_state(None)
+    self.windowing.triggerfn.on_merge(
+        to_be_merged,
+        merge_result,
+        self.trigger_context.for_window(merge_result))
+
+
+@typehints.with_input_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+@typehints.with_output_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+class GeneralTriggerManagerDoFn(DoFn):
+  """A trigger manager that supports all windowing / triggering cases.
+
+  This implements a DoFn that manages triggering in a per-key basis. All
+  elements for a single key are processed together. Per-key state holds data
+  related to all windows.
+  """
+
+  KNOWN_WINDOWS = SetStateSpec('known_windows', IntervalWindowCoder())
+  FINISHED_WINDOWS = SetStateSpec('finished_windows', IntervalWindowCoder())
+  LAST_KNOWN_TIME = CombiningValueStateSpec('last_known_time', combine_fn=max)
+  LAST_KNOWN_WATERMARK = CombiningValueStateSpec(
+      'last_known_watermark', combine_fn=max)
+
+  # TODO(pabloem) What's the coder for the elements/keys here?
+  WINDOW_ELEMENT_PAIRS = BagStateSpec(
+      'element_bag', TupleCoder([IntervalWindowCoder(), PickleCoder()]))
+  WINDOW_TAG_VALUES = BagStateSpec(
+      'per_window_per_tag_value_state',
+      TupleCoder([IntervalWindowCoder(), StrUtf8Coder(), VarIntCoder()]))
+
+  PROCESSING_TIME_TIMER = TimerSpec(
+      'processing_time_timer', TimeDomain.REAL_TIME)
+  WATERMARK_TIMER = TimerSpec('watermark_timer', TimeDomain.WATERMARK)
+
+  def __init__(self, windowing: Windowing):
+    self.windowing = windowing
+    # Only session windows are merging. Other windows are non-merging.
+    self.merging_windows = isinstance(windowing.windowfn, Sessions)
+
+  def process(
+      self,
+      element: typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]],
+      element_bag: BagRuntimeState = DoFn.StateParam(WINDOW_ELEMENT_PAIRS),  # type: ignore
+      time_state: AccumulatingRuntimeState = DoFn.StateParam(LAST_KNOWN_TIME),  # type: ignore
+      watermark_state: AccumulatingRuntimeState = DoFn.StateParam(  # type: ignore
+          LAST_KNOWN_WATERMARK),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      windows_state: SetRuntimeState = DoFn.StateParam(KNOWN_WINDOWS),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER),
+      *args, **kwargs):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        current_time_state=time_state,
+        watermark_state=watermark_state,
+        elements_bag_state=element_bag,
+        window_tag_values_bag_state=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    key, windowed_values = element
+    watermark = read_watermark(watermark_state)
+
+    windows_to_elements = collections.defaultdict(list)
+    for wv in windowed_values:
+      for window in wv.windows:
+        # ignore expired windows
+        if watermark > window.end + self.windowing.allowed_lateness:
+          continue
+        if window in finished_windows_state.read():
+          continue
+        windows_to_elements[window].append(
+            TimestampedValue(wv.value, wv.timestamp))
+
+    # Processing merging of windows
+    if self.merging_windows:
+      old_windows = set(windows_state.read())
+      all_windows = old_windows.union(list(windows_to_elements))
+      if all_windows != old_windows:
+        merge_context = TriggerMergeContext(
+            all_windows, context, self.windowing)
+        self.windowing.windowfn.merge(merge_context)
+
+        merged_windows_to_elements = collections.defaultdict(list)
+        for window, values in windows_to_elements.items():
+          while window in merge_context.merged_away:
+            window = merge_context.merged_away[window]
+          merged_windows_to_elements[window].extend(values)
+        windows_to_elements = merged_windows_to_elements
+
+      if old_windows != all_windows:
+        for w in windows_to_elements:
+          windows_state.add(w)
+    # Done processing merging of windows
+
+    seen_windows = set()
+    for w in windows_to_elements:
+      window_context = context.for_window(w)
+      seen_windows.add(w)
+      for value_w_timestamp in windows_to_elements[w]:
+        _LOGGER.debug(value_w_timestamp)
+        element_bag.add((w, value_w_timestamp))
+        self.windowing.triggerfn.on_element(windowed_values, w, window_context)
+
+    return self._trigger_fire(
+        key, TimeDomain.WATERMARK, watermark, None, context, seen_windows)
+
+  def _trigger_fire(
+      self,
+      key: K,
+      time_domain,
+      timestamp: Timestamp,
+      timer_tag: typing.Optional[str],
+      context: 'FnRunnerStatefulTriggerContext',
+      windows_of_interest: typing.Optional[typing.Set[BoundedWindow]] = None):
+    windows_to_elements = context.windows_to_elements_map()
+    context.elements_bag_state.clear()
+
+    fired_windows = set()
+    finished_windows = set()
+    _LOGGER.debug(
+        '%s - tag %s - timestamp %s', time_domain, timer_tag, timestamp)
+    for w, elems in windows_to_elements.items():
+      if windows_of_interest is not None and w not in windows_of_interest:
+        # windows_of_interest=None means that we care about all windows.
+        # If we care only about some windows, and this window is not one of
+        # them, then we do not intend to fire this window.
+        continue
+      window_context = context.for_window(w)
+      if self.windowing.triggerfn.should_fire(time_domain,
+                                              timestamp,
+                                              w,
+                                              window_context):
+        finished = self.windowing.triggerfn.on_fire(
+            timestamp, w, window_context)
+        _LOGGER.debug('Firing on window %s. Finished: %s', w, finished)
+        fired_windows.add(w)
+        if finished:
+          context.finished_windows_state.add(w)
+          finished_windows.add(w)
+        # TODO(pabloem): Format the output: e.g. pane info
+        elems = [WindowedValue(e.value, e.timestamp, (w, )) for e in elems]
+        yield (key, elems)
+
+    # Add elements that were not fired back into state.
+    for w, elems in windows_to_elements.items():
+      for e in elems:
+        if (w in finished_windows or
+            (w in fired_windows and
+             self.windowing.accumulation_mode == AccumulationMode.DISCARDING)):
+          continue
+        context.elements_bag_state.add((w, e))
+
+  @on_timer(PROCESSING_TIME_TIMER)

Review comment:
       Just a side note: I think processing-time timers don't have any effect in batch pipelines as of today, same as in Java. I also had the impression that processing-time triggers do not work in batch, so this might be OK for now.

##########
File path: sdks/python/apache_beam/runners/portability/fn_api_runner/trigger_manager.py
##########
@@ -0,0 +1,459 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import collections
+import logging
+import typing
+from collections import defaultdict
+
+from apache_beam import typehints
+from apache_beam.coders import PickleCoder
+from apache_beam.coders import StrUtf8Coder
+from apache_beam.coders import TupleCoder
+from apache_beam.coders import VarIntCoder
+from apache_beam.coders.coders import IntervalWindowCoder
+from apache_beam.transforms import DoFn
+from apache_beam.transforms.core import Windowing
+from apache_beam.transforms.timeutil import TimeDomain
+from apache_beam.transforms.trigger import AccumulationMode
+from apache_beam.transforms.trigger import TriggerContext
+from apache_beam.transforms.trigger import _CombiningValueStateTag
+from apache_beam.transforms.trigger import _StateTag
+from apache_beam.transforms.userstate import AccumulatingRuntimeState
+from apache_beam.transforms.userstate import BagRuntimeState
+from apache_beam.transforms.userstate import BagStateSpec
+from apache_beam.transforms.userstate import CombiningValueStateSpec
+from apache_beam.transforms.userstate import RuntimeTimer
+from apache_beam.transforms.userstate import SetRuntimeState
+from apache_beam.transforms.userstate import SetStateSpec
+from apache_beam.transforms.userstate import TimerSpec
+from apache_beam.transforms.userstate import on_timer
+from apache_beam.transforms.window import BoundedWindow
+from apache_beam.transforms.window import GlobalWindow
+from apache_beam.transforms.window import Sessions
+from apache_beam.transforms.window import TimestampedValue
+from apache_beam.transforms.window import WindowFn
+from apache_beam.typehints import TypeCheckError
+from apache_beam.utils import windowed_value
+from apache_beam.utils.timestamp import MIN_TIMESTAMP
+from apache_beam.utils.timestamp import Timestamp
+from apache_beam.utils.windowed_value import WindowedValue
+
+_LOGGER = logging.getLogger(__name__)
+_LOGGER.setLevel(logging.DEBUG)
+
+K = typing.TypeVar('K')
+V = typing.TypeVar('V')
+
+
+class ReifyWindows(DoFn):
+  """Receives KV pairs, and wraps the values into WindowedValues."""
+  def process(
+      self, element, window=DoFn.WindowParam, timestamp=DoFn.TimestampParam):
+    try:
+      k, v = element
+    except TypeError:
+      raise TypeCheckError(
+          'Input to GroupByKey must be a PCollection with '
+          'elements compatible with KV[A, B]')
+
+    yield (k, windowed_value.WindowedValue(v, timestamp, [window]))
+
+
+class _GroupBundlesByKey(DoFn):
+  def start_bundle(self):
+    self.keys = defaultdict(list)
+
+  def process(self, element):
+    key, windowed_value = element
+    self.keys[key].append(windowed_value)
+
+  def finish_bundle(self):
+    for k, vals in self.keys.items():
+      yield windowed_value.WindowedValue((k, vals),
+                                         MIN_TIMESTAMP, [GlobalWindow()])
+
+
+def read_watermark(watermark_state):
+  try:
+    return watermark_state.read()
+  except ValueError:
+    watermark_state.add(MIN_TIMESTAMP)
+    return watermark_state.read()
+
+
+class TriggerMergeContext(WindowFn.MergeContext):
+  def __init__(
+      self, all_windows, context: 'FnRunnerStatefulTriggerContext', windowing):
+    super(TriggerMergeContext, self).__init__(all_windows)
+    self.trigger_context = context
+    self.windowing = windowing
+    self.merged_away: typing.Dict[BoundedWindow, BoundedWindow] = {}
+
+  def merge(self, to_be_merged, merge_result):
+    _LOGGER.debug("Merging %s into %s", to_be_merged, merge_result)
+    self.trigger_context.merge_state(to_be_merged, merge_result)
+    for window in to_be_merged:
+      if window != merge_result:
+        self.merged_away[window] = merge_result
+        # Clear state associated with PaneInfo since it is
+        # not preserved across merges.
+        self.trigger_context.for_window(window).clear_state(None)
+    self.windowing.triggerfn.on_merge(
+        to_be_merged,
+        merge_result,
+        self.trigger_context.for_window(merge_result))
+
+
+@typehints.with_input_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+@typehints.with_output_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+class GeneralTriggerManagerDoFn(DoFn):
+  """A trigger manager that supports all windowing / triggering cases.
+
+  This implements a DoFn that manages triggering in a per-key basis. All
+  elements for a single key are processed together. Per-key state holds data
+  related to all windows.
+  """
+
+  KNOWN_WINDOWS = SetStateSpec('known_windows', IntervalWindowCoder())
+  FINISHED_WINDOWS = SetStateSpec('finished_windows', IntervalWindowCoder())
+  LAST_KNOWN_TIME = CombiningValueStateSpec('last_known_time', combine_fn=max)
+  LAST_KNOWN_WATERMARK = CombiningValueStateSpec(
+      'last_known_watermark', combine_fn=max)
+
+  # TODO(pabloem) What's the coder for the elements/keys here?
+  WINDOW_ELEMENT_PAIRS = BagStateSpec(
+      'element_bag', TupleCoder([IntervalWindowCoder(), PickleCoder()]))
+  WINDOW_TAG_VALUES = BagStateSpec(
+      'per_window_per_tag_value_state',
+      TupleCoder([IntervalWindowCoder(), StrUtf8Coder(), VarIntCoder()]))
+
+  PROCESSING_TIME_TIMER = TimerSpec(
+      'processing_time_timer', TimeDomain.REAL_TIME)
+  WATERMARK_TIMER = TimerSpec('watermark_timer', TimeDomain.WATERMARK)
+
+  def __init__(self, windowing: Windowing):
+    self.windowing = windowing
+    # Only session windows are merging. Other windows are non-merging.
+    self.merging_windows = isinstance(windowing.windowfn, Sessions)
+
+  def process(
+      self,
+      element: typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]],
+      element_bag: BagRuntimeState = DoFn.StateParam(WINDOW_ELEMENT_PAIRS),  # type: ignore
+      time_state: AccumulatingRuntimeState = DoFn.StateParam(LAST_KNOWN_TIME),  # type: ignore
+      watermark_state: AccumulatingRuntimeState = DoFn.StateParam(  # type: ignore
+          LAST_KNOWN_WATERMARK),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      windows_state: SetRuntimeState = DoFn.StateParam(KNOWN_WINDOWS),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER),
+      *args, **kwargs):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        current_time_state=time_state,
+        watermark_state=watermark_state,
+        elements_bag_state=element_bag,
+        window_tag_values_bag_state=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    key, windowed_values = element
+    watermark = read_watermark(watermark_state)
+
+    windows_to_elements = collections.defaultdict(list)
+    for wv in windowed_values:
+      for window in wv.windows:
+        # ignore expired windows
+        if watermark > window.end + self.windowing.allowed_lateness:
+          continue
+        if window in finished_windows_state.read():
+          continue
+        windows_to_elements[window].append(
+            TimestampedValue(wv.value, wv.timestamp))
+
+    # Processing merging of windows
+    if self.merging_windows:
+      old_windows = set(windows_state.read())
+      all_windows = old_windows.union(list(windows_to_elements))
+      if all_windows != old_windows:
+        merge_context = TriggerMergeContext(
+            all_windows, context, self.windowing)
+        self.windowing.windowfn.merge(merge_context)
+
+        merged_windows_to_elements = collections.defaultdict(list)
+        for window, values in windows_to_elements.items():
+          while window in merge_context.merged_away:
+            window = merge_context.merged_away[window]
+          merged_windows_to_elements[window].extend(values)
+        windows_to_elements = merged_windows_to_elements
+
+      if old_windows != all_windows:
+        for w in windows_to_elements:
+          windows_state.add(w)
+    # Done processing merging of windows
+
+    seen_windows = set()
+    for w in windows_to_elements:
+      window_context = context.for_window(w)
+      seen_windows.add(w)
+      for value_w_timestamp in windows_to_elements[w]:
+        _LOGGER.debug(value_w_timestamp)
+        element_bag.add((w, value_w_timestamp))
+        self.windowing.triggerfn.on_element(windowed_values, w, window_context)
+
+    return self._trigger_fire(
+        key, TimeDomain.WATERMARK, watermark, None, context, seen_windows)
+
+  def _trigger_fire(
+      self,
+      key: K,
+      time_domain,
+      timestamp: Timestamp,
+      timer_tag: typing.Optional[str],
+      context: 'FnRunnerStatefulTriggerContext',
+      windows_of_interest: typing.Optional[typing.Set[BoundedWindow]] = None):
+    windows_to_elements = context.windows_to_elements_map()
+    context.elements_bag_state.clear()
+
+    fired_windows = set()
+    finished_windows = set()
+    _LOGGER.debug(
+        '%s - tag %s - timestamp %s', time_domain, timer_tag, timestamp)
+    for w, elems in windows_to_elements.items():
+      if windows_of_interest is not None and w not in windows_of_interest:
+        # windows_of_interest=None means that we care about all windows.
+        # If we care only about some windows, and this window is not one of
+        # them, then we do not intend to fire this window.
+        continue
+      window_context = context.for_window(w)
+      if self.windowing.triggerfn.should_fire(time_domain,
+                                              timestamp,
+                                              w,
+                                              window_context):
+        finished = self.windowing.triggerfn.on_fire(
+            timestamp, w, window_context)
+        _LOGGER.debug('Firing on window %s. Finished: %s', w, finished)
+        fired_windows.add(w)
+        if finished:
+          context.finished_windows_state.add(w)
+          finished_windows.add(w)
+        # TODO(pabloem): Format the output: e.g. pane info
+        elems = [WindowedValue(e.value, e.timestamp, (w, )) for e in elems]
+        yield (key, elems)
+
+    # Add elements that were not fired back into state.
+    for w, elems in windows_to_elements.items():
+      for e in elems:
+        if (w in finished_windows or
+            (w in fired_windows and
+             self.windowing.accumulation_mode == AccumulationMode.DISCARDING)):
+          continue
+        context.elements_bag_state.add((w, e))
+
+  @on_timer(PROCESSING_TIME_TIMER)
+  def processing_time_trigger(
+      self,
+      key=DoFn.KeyParam,
+      timer_tag=DoFn.DynamicTimerTagParam,
+      timestamp=DoFn.TimestampParam,
+      processing_time_state=DoFn.StateParam(LAST_KNOWN_TIME),
+      element_bag=DoFn.StateParam(WINDOW_ELEMENT_PAIRS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER)):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        current_time_state=processing_time_state,
+        watermark_state=None,
+        elements_bag_state=element_bag,
+        window_tag_values_bag_state=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    result = self._trigger_fire(
+        key, TimeDomain.REAL_TIME, timestamp, timer_tag, context)
+    processing_time_state.add(timestamp)
+    return result
+
+  @on_timer(WATERMARK_TIMER)
+  def watermark_trigger(
+      self,
+      key=DoFn.KeyParam,
+      timer_tag=DoFn.DynamicTimerTagParam,
+      timestamp=DoFn.TimestampParam,
+      watermark_state=DoFn.StateParam(LAST_KNOWN_WATERMARK),
+      element_bag=DoFn.StateParam(WINDOW_ELEMENT_PAIRS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER)):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        current_time_state=None,
+        watermark_state=watermark_state,
+        elements_bag_state=element_bag,
+        window_tag_values_bag_state=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    result = self._trigger_fire(
+        key, TimeDomain.WATERMARK, timestamp, timer_tag, context)
+    watermark_state.add(timestamp)
+    return result
+
+
+class FnRunnerStatefulTriggerContext(TriggerContext):
+  def __init__(
+      self,
+      processing_time_timer: RuntimeTimer,
+      watermark_timer: RuntimeTimer,
+      current_time_state: typing.Optional[AccumulatingRuntimeState],
+      watermark_state: typing.Optional[AccumulatingRuntimeState],
+      elements_bag_state: BagRuntimeState,
+      window_tag_values_bag_state: BagRuntimeState,
+      finished_windows_state: SetRuntimeState):
+    self.timers = {
+        TimeDomain.REAL_TIME: processing_time_timer,
+        TimeDomain.WATERMARK: watermark_timer
+    }
+    self.current_times = {
+        TimeDomain.REAL_TIME: current_time_state,
+        TimeDomain.WATERMARK: watermark_state
+    }
+    self.elements_bag_state = elements_bag_state
+    self.window_tag_values_bag_state = window_tag_values_bag_state
+    self.finished_windows_state = finished_windows_state
+
+  def windows_to_elements_map(
+      self
+  ) -> typing.Dict[BoundedWindow, typing.List[windowed_value.WindowedValue]]:
+    window_element_pairs: typing.Iterable[typing.Tuple[
+        BoundedWindow,
+        windowed_value.WindowedValue]] = self.elements_bag_state.read()
+    result: typing.Dict[BoundedWindow,
+                        typing.List[windowed_value.WindowedValue]] = {}
+    for w, e in window_element_pairs:
+      if w not in result:
+        result[w] = []
+      result[w].append(e)
+    return result
+
+  def for_window(self, window):
+    return PerWindowTriggerContext(window, self)
+
+  def get_current_time(self):
+    return self.current_times[TimeDomain.REAL_TIME].read()
+
+  def set_timer(self, name, time_domain, timestamp):
+    _LOGGER.debug('Setting timer (%s, %s) at %s', time_domain, name, timestamp)
+    self.timers[time_domain].set(timestamp, dynamic_timer_tag=name)
+
+  def clear_timer(self, name, time_domain):
+    _LOGGER.debug('Clearing timer (%s, %s)', time_domain, name)
+    self.timers[time_domain].clear(dynamic_timer_tag=name)
+
+  def merge_state(self, to_be_merged, merge_result):

Review comment:
       Call it `merge_windows`?

##########
File path: sdks/python/apache_beam/runners/portability/fn_api_runner/trigger_manager.py
##########
@@ -0,0 +1,459 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import collections
+import logging
+import typing
+from collections import defaultdict
+
+from apache_beam import typehints
+from apache_beam.coders import PickleCoder
+from apache_beam.coders import StrUtf8Coder
+from apache_beam.coders import TupleCoder
+from apache_beam.coders import VarIntCoder
+from apache_beam.coders.coders import IntervalWindowCoder
+from apache_beam.transforms import DoFn
+from apache_beam.transforms.core import Windowing
+from apache_beam.transforms.timeutil import TimeDomain
+from apache_beam.transforms.trigger import AccumulationMode
+from apache_beam.transforms.trigger import TriggerContext
+from apache_beam.transforms.trigger import _CombiningValueStateTag
+from apache_beam.transforms.trigger import _StateTag
+from apache_beam.transforms.userstate import AccumulatingRuntimeState
+from apache_beam.transforms.userstate import BagRuntimeState
+from apache_beam.transforms.userstate import BagStateSpec
+from apache_beam.transforms.userstate import CombiningValueStateSpec
+from apache_beam.transforms.userstate import RuntimeTimer
+from apache_beam.transforms.userstate import SetRuntimeState
+from apache_beam.transforms.userstate import SetStateSpec
+from apache_beam.transforms.userstate import TimerSpec
+from apache_beam.transforms.userstate import on_timer
+from apache_beam.transforms.window import BoundedWindow
+from apache_beam.transforms.window import GlobalWindow
+from apache_beam.transforms.window import Sessions
+from apache_beam.transforms.window import TimestampedValue
+from apache_beam.transforms.window import WindowFn
+from apache_beam.typehints import TypeCheckError
+from apache_beam.utils import windowed_value
+from apache_beam.utils.timestamp import MIN_TIMESTAMP
+from apache_beam.utils.timestamp import Timestamp
+from apache_beam.utils.windowed_value import WindowedValue
+
+_LOGGER = logging.getLogger(__name__)
+_LOGGER.setLevel(logging.DEBUG)
+
+K = typing.TypeVar('K')
+V = typing.TypeVar('V')
+
+
+class ReifyWindows(DoFn):
+  """Receives KV pairs, and wraps the values into WindowedValues."""
+  def process(
+      self, element, window=DoFn.WindowParam, timestamp=DoFn.TimestampParam):
+    try:
+      k, v = element
+    except TypeError:
+      raise TypeCheckError(
+          'Input to GroupByKey must be a PCollection with '
+          'elements compatible with KV[A, B]')
+
+    yield (k, windowed_value.WindowedValue(v, timestamp, [window]))
+
+
+class _GroupBundlesByKey(DoFn):
+  def start_bundle(self):
+    self.keys = defaultdict(list)
+
+  def process(self, element):
+    key, windowed_value = element
+    self.keys[key].append(windowed_value)
+
+  def finish_bundle(self):
+    for k, vals in self.keys.items():
+      yield windowed_value.WindowedValue((k, vals),
+                                         MIN_TIMESTAMP, [GlobalWindow()])
+
+
+def read_watermark(watermark_state):
+  try:
+    return watermark_state.read()
+  except ValueError:
+    watermark_state.add(MIN_TIMESTAMP)
+    return watermark_state.read()
+
+
+class TriggerMergeContext(WindowFn.MergeContext):
+  def __init__(
+      self, all_windows, context: 'FnRunnerStatefulTriggerContext', windowing):
+    super(TriggerMergeContext, self).__init__(all_windows)
+    self.trigger_context = context
+    self.windowing = windowing
+    self.merged_away: typing.Dict[BoundedWindow, BoundedWindow] = {}
+
+  def merge(self, to_be_merged, merge_result):
+    _LOGGER.debug("Merging %s into %s", to_be_merged, merge_result)
+    self.trigger_context.merge_state(to_be_merged, merge_result)
+    for window in to_be_merged:
+      if window != merge_result:
+        self.merged_away[window] = merge_result
+        # Clear state associated with PaneInfo since it is
+        # not preserved across merges.
+        self.trigger_context.for_window(window).clear_state(None)
+    self.windowing.triggerfn.on_merge(
+        to_be_merged,
+        merge_result,
+        self.trigger_context.for_window(merge_result))
+
+
+@typehints.with_input_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+@typehints.with_output_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+class GeneralTriggerManagerDoFn(DoFn):
+  """A trigger manager that supports all windowing / triggering cases.
+
+  This implements a DoFn that manages triggering in a per-key basis. All
+  elements for a single key are processed together. Per-key state holds data
+  related to all windows.
+  """
+
+  KNOWN_WINDOWS = SetStateSpec('known_windows', IntervalWindowCoder())
+  FINISHED_WINDOWS = SetStateSpec('finished_windows', IntervalWindowCoder())
+  LAST_KNOWN_TIME = CombiningValueStateSpec('last_known_time', combine_fn=max)
+  LAST_KNOWN_WATERMARK = CombiningValueStateSpec(
+      'last_known_watermark', combine_fn=max)
+
+  # TODO(pabloem) What's the coder for the elements/keys here?
+  WINDOW_ELEMENT_PAIRS = BagStateSpec(
+      'element_bag', TupleCoder([IntervalWindowCoder(), PickleCoder()]))
+  WINDOW_TAG_VALUES = BagStateSpec(
+      'per_window_per_tag_value_state',
+      TupleCoder([IntervalWindowCoder(), StrUtf8Coder(), VarIntCoder()]))
+
+  PROCESSING_TIME_TIMER = TimerSpec(
+      'processing_time_timer', TimeDomain.REAL_TIME)
+  WATERMARK_TIMER = TimerSpec('watermark_timer', TimeDomain.WATERMARK)
+
+  def __init__(self, windowing: Windowing):
+    self.windowing = windowing
+    # Only session windows are merging. Other windows are non-merging.
+    self.merging_windows = isinstance(windowing.windowfn, Sessions)
+
+  def process(
+      self,
+      element: typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]],
+      element_bag: BagRuntimeState = DoFn.StateParam(WINDOW_ELEMENT_PAIRS),  # type: ignore

Review comment:
       I think the parameter names could be more explicit, for example `all_elements`, `latest_processing_time`, etc. The underlying data structure doesn't matter that much since we already have the type hints, so suffixes such as `bag` and `state` can probably be dropped.
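
       Something like the following, just as an illustration (the new names are
       only suggestions, not part of the PR):

       ```python
       # Sketch: same signature as in the PR, inside GeneralTriggerManagerDoFn,
       # with the suggested descriptive names and without the bag/state suffixes.
       def process(
           self,
           element: typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]],
           all_elements: BagRuntimeState = DoFn.StateParam(WINDOW_ELEMENT_PAIRS),  # type: ignore
           latest_processing_time: AccumulatingRuntimeState = DoFn.StateParam(  # type: ignore
               LAST_KNOWN_TIME),
           latest_watermark: AccumulatingRuntimeState = DoFn.StateParam(  # type: ignore
               LAST_KNOWN_WATERMARK),
           window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
           known_windows: SetRuntimeState = DoFn.StateParam(KNOWN_WINDOWS),  # type: ignore
           finished_windows: SetRuntimeState = DoFn.StateParam(FINISHED_WINDOWS),  # type: ignore
           processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
           watermark_timer=DoFn.TimerParam(WATERMARK_TIMER),
           *args, **kwargs):
         ...  # body unchanged
       ```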

##########
File path: sdks/python/apache_beam/runners/portability/fn_api_runner/trigger_manager.py
##########
@@ -0,0 +1,459 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import collections
+import logging
+import typing
+from collections import defaultdict
+
+from apache_beam import typehints
+from apache_beam.coders import PickleCoder
+from apache_beam.coders import StrUtf8Coder
+from apache_beam.coders import TupleCoder
+from apache_beam.coders import VarIntCoder
+from apache_beam.coders.coders import IntervalWindowCoder
+from apache_beam.transforms import DoFn
+from apache_beam.transforms.core import Windowing
+from apache_beam.transforms.timeutil import TimeDomain
+from apache_beam.transforms.trigger import AccumulationMode
+from apache_beam.transforms.trigger import TriggerContext
+from apache_beam.transforms.trigger import _CombiningValueStateTag
+from apache_beam.transforms.trigger import _StateTag
+from apache_beam.transforms.userstate import AccumulatingRuntimeState
+from apache_beam.transforms.userstate import BagRuntimeState
+from apache_beam.transforms.userstate import BagStateSpec
+from apache_beam.transforms.userstate import CombiningValueStateSpec
+from apache_beam.transforms.userstate import RuntimeTimer
+from apache_beam.transforms.userstate import SetRuntimeState
+from apache_beam.transforms.userstate import SetStateSpec
+from apache_beam.transforms.userstate import TimerSpec
+from apache_beam.transforms.userstate import on_timer
+from apache_beam.transforms.window import BoundedWindow
+from apache_beam.transforms.window import GlobalWindow
+from apache_beam.transforms.window import Sessions
+from apache_beam.transforms.window import TimestampedValue
+from apache_beam.transforms.window import WindowFn
+from apache_beam.typehints import TypeCheckError
+from apache_beam.utils import windowed_value
+from apache_beam.utils.timestamp import MIN_TIMESTAMP
+from apache_beam.utils.timestamp import Timestamp
+from apache_beam.utils.windowed_value import WindowedValue
+
+_LOGGER = logging.getLogger(__name__)
+_LOGGER.setLevel(logging.DEBUG)
+
+K = typing.TypeVar('K')
+V = typing.TypeVar('V')
+
+
+class ReifyWindows(DoFn):
+  """Receives KV pairs, and wraps the values into WindowedValues."""
+  def process(
+      self, element, window=DoFn.WindowParam, timestamp=DoFn.TimestampParam):
+    try:
+      k, v = element
+    except TypeError:
+      raise TypeCheckError(
+          'Input to GroupByKey must be a PCollection with '
+          'elements compatible with KV[A, B]')
+
+    yield (k, windowed_value.WindowedValue(v, timestamp, [window]))
+
+
+class _GroupBundlesByKey(DoFn):
+  def start_bundle(self):
+    self.keys = defaultdict(list)
+
+  def process(self, element):
+    key, windowed_value = element
+    self.keys[key].append(windowed_value)
+
+  def finish_bundle(self):
+    for k, vals in self.keys.items():
+      yield windowed_value.WindowedValue((k, vals),
+                                         MIN_TIMESTAMP, [GlobalWindow()])
+
+
+def read_watermark(watermark_state):
+  try:
+    return watermark_state.read()
+  except ValueError:
+    watermark_state.add(MIN_TIMESTAMP)
+    return watermark_state.read()
+
+
+class TriggerMergeContext(WindowFn.MergeContext):
+  def __init__(
+      self, all_windows, context: 'FnRunnerStatefulTriggerContext', windowing):
+    super(TriggerMergeContext, self).__init__(all_windows)
+    self.trigger_context = context
+    self.windowing = windowing
+    self.merged_away: typing.Dict[BoundedWindow, BoundedWindow] = {}
+
+  def merge(self, to_be_merged, merge_result):
+    _LOGGER.debug("Merging %s into %s", to_be_merged, merge_result)
+    self.trigger_context.merge_state(to_be_merged, merge_result)
+    for window in to_be_merged:
+      if window != merge_result:
+        self.merged_away[window] = merge_result
+        # Clear state associated with PaneInfo since it is
+        # not preserved across merges.
+        self.trigger_context.for_window(window).clear_state(None)
+    self.windowing.triggerfn.on_merge(
+        to_be_merged,
+        merge_result,
+        self.trigger_context.for_window(merge_result))
+
+
+@typehints.with_input_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+@typehints.with_output_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+class GeneralTriggerManagerDoFn(DoFn):
+  """A trigger manager that supports all windowing / triggering cases.
+
+  This implements a DoFn that manages triggering in a per-key basis. All
+  elements for a single key are processed together. Per-key state holds data
+  related to all windows.
+  """
+
+  KNOWN_WINDOWS = SetStateSpec('known_windows', IntervalWindowCoder())
+  FINISHED_WINDOWS = SetStateSpec('finished_windows', IntervalWindowCoder())
+  LAST_KNOWN_TIME = CombiningValueStateSpec('last_known_time', combine_fn=max)
+  LAST_KNOWN_WATERMARK = CombiningValueStateSpec(
+      'last_known_watermark', combine_fn=max)
+
+  # TODO(pabloem) What's the coder for the elements/keys here?
+  WINDOW_ELEMENT_PAIRS = BagStateSpec(
+      'element_bag', TupleCoder([IntervalWindowCoder(), PickleCoder()]))
+  WINDOW_TAG_VALUES = BagStateSpec(
+      'per_window_per_tag_value_state',
+      TupleCoder([IntervalWindowCoder(), StrUtf8Coder(), VarIntCoder()]))
+
+  PROCESSING_TIME_TIMER = TimerSpec(
+      'processing_time_timer', TimeDomain.REAL_TIME)
+  WATERMARK_TIMER = TimerSpec('watermark_timer', TimeDomain.WATERMARK)
+
+  def __init__(self, windowing: Windowing):
+    self.windowing = windowing
+    # Only session windows are merging. Other windows are non-merging.
+    self.merging_windows = isinstance(windowing.windowfn, Sessions)
+
+  def process(
+      self,
+      element: typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]],
+      element_bag: BagRuntimeState = DoFn.StateParam(WINDOW_ELEMENT_PAIRS),  # type: ignore
+      time_state: AccumulatingRuntimeState = DoFn.StateParam(LAST_KNOWN_TIME),  # type: ignore
+      watermark_state: AccumulatingRuntimeState = DoFn.StateParam(  # type: ignore
+          LAST_KNOWN_WATERMARK),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      windows_state: SetRuntimeState = DoFn.StateParam(KNOWN_WINDOWS),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER),
+      *args, **kwargs):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        current_time_state=time_state,
+        watermark_state=watermark_state,
+        elements_bag_state=element_bag,
+        window_tag_values_bag_state=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    key, windowed_values = element
+    watermark = read_watermark(watermark_state)
+
+    windows_to_elements = collections.defaultdict(list)
+    for wv in windowed_values:
+      for window in wv.windows:
+        # ignore expired windows
+        if watermark > window.end + self.windowing.allowed_lateness:
+          continue
+        if window in finished_windows_state.read():
+          continue
+        windows_to_elements[window].append(
+            TimestampedValue(wv.value, wv.timestamp))
+
+    # Processing merging of windows
+    if self.merging_windows:
+      old_windows = set(windows_state.read())
+      all_windows = old_windows.union(list(windows_to_elements))
+      if all_windows != old_windows:
+        merge_context = TriggerMergeContext(
+            all_windows, context, self.windowing)
+        self.windowing.windowfn.merge(merge_context)
+
+        merged_windows_to_elements = collections.defaultdict(list)
+        for window, values in windows_to_elements.items():
+          while window in merge_context.merged_away:
+            window = merge_context.merged_away[window]
+          merged_windows_to_elements[window].extend(values)
+        windows_to_elements = merged_windows_to_elements
+
+      if old_windows != all_windows:
+        for w in windows_to_elements:
+          windows_state.add(w)
+    # Done processing merging of windows
+
+    seen_windows = set()
+    for w in windows_to_elements:
+      window_context = context.for_window(w)
+      seen_windows.add(w)
+      for value_w_timestamp in windows_to_elements[w]:
+        _LOGGER.debug(value_w_timestamp)
+        element_bag.add((w, value_w_timestamp))
+        self.windowing.triggerfn.on_element(windowed_values, w, window_context)
+
+    return self._trigger_fire(
+        key, TimeDomain.WATERMARK, watermark, None, context, seen_windows)
+
+  def _trigger_fire(

Review comment:
       Maybe `_fire_eligible_windows`?

##########
File path: sdks/python/apache_beam/runners/portability/fn_api_runner/trigger_manager.py
##########
@@ -0,0 +1,459 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import collections
+import logging
+import typing
+from collections import defaultdict
+
+from apache_beam import typehints
+from apache_beam.coders import PickleCoder
+from apache_beam.coders import StrUtf8Coder
+from apache_beam.coders import TupleCoder
+from apache_beam.coders import VarIntCoder
+from apache_beam.coders.coders import IntervalWindowCoder
+from apache_beam.transforms import DoFn
+from apache_beam.transforms.core import Windowing
+from apache_beam.transforms.timeutil import TimeDomain
+from apache_beam.transforms.trigger import AccumulationMode
+from apache_beam.transforms.trigger import TriggerContext
+from apache_beam.transforms.trigger import _CombiningValueStateTag
+from apache_beam.transforms.trigger import _StateTag
+from apache_beam.transforms.userstate import AccumulatingRuntimeState
+from apache_beam.transforms.userstate import BagRuntimeState
+from apache_beam.transforms.userstate import BagStateSpec
+from apache_beam.transforms.userstate import CombiningValueStateSpec
+from apache_beam.transforms.userstate import RuntimeTimer
+from apache_beam.transforms.userstate import SetRuntimeState
+from apache_beam.transforms.userstate import SetStateSpec
+from apache_beam.transforms.userstate import TimerSpec
+from apache_beam.transforms.userstate import on_timer
+from apache_beam.transforms.window import BoundedWindow
+from apache_beam.transforms.window import GlobalWindow
+from apache_beam.transforms.window import Sessions
+from apache_beam.transforms.window import TimestampedValue
+from apache_beam.transforms.window import WindowFn
+from apache_beam.typehints import TypeCheckError
+from apache_beam.utils import windowed_value
+from apache_beam.utils.timestamp import MIN_TIMESTAMP
+from apache_beam.utils.timestamp import Timestamp
+from apache_beam.utils.windowed_value import WindowedValue
+
+_LOGGER = logging.getLogger(__name__)
+_LOGGER.setLevel(logging.DEBUG)
+
+K = typing.TypeVar('K')
+V = typing.TypeVar('V')
+
+
+class ReifyWindows(DoFn):
+  """Receives KV pairs, and wraps the values into WindowedValues."""
+  def process(
+      self, element, window=DoFn.WindowParam, timestamp=DoFn.TimestampParam):
+    try:
+      k, v = element
+    except TypeError:
+      raise TypeCheckError(
+          'Input to GroupByKey must be a PCollection with '
+          'elements compatible with KV[A, B]')
+
+    yield (k, windowed_value.WindowedValue(v, timestamp, [window]))
+
+
+class _GroupBundlesByKey(DoFn):
+  def start_bundle(self):
+    self.keys = defaultdict(list)
+
+  def process(self, element):
+    key, windowed_value = element
+    self.keys[key].append(windowed_value)
+
+  def finish_bundle(self):
+    for k, vals in self.keys.items():
+      yield windowed_value.WindowedValue((k, vals),
+                                         MIN_TIMESTAMP, [GlobalWindow()])
+
+
+def read_watermark(watermark_state):
+  try:
+    return watermark_state.read()
+  except ValueError:
+    watermark_state.add(MIN_TIMESTAMP)
+    return watermark_state.read()
+
+
+class TriggerMergeContext(WindowFn.MergeContext):
+  def __init__(
+      self, all_windows, context: 'FnRunnerStatefulTriggerContext', windowing):

Review comment:
       nit: It seems we only need the trigger fn from the windowing; consider accepting just the trigger fn instead of the whole `Windowing`, since that is what's most relevant here.
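
       Just to sketch it (assuming callers pass `self.windowing.triggerfn`
       directly; not a full implementation):

       ```python
       class TriggerMergeContext(WindowFn.MergeContext):
         def __init__(
             self, all_windows, context: 'FnRunnerStatefulTriggerContext',
             trigger_fn):
           super(TriggerMergeContext, self).__init__(all_windows)
           self.trigger_context = context
           # Keep only the trigger fn; the rest of Windowing is not needed here.
           self.trigger_fn = trigger_fn
           self.merged_away: typing.Dict[BoundedWindow, BoundedWindow] = {}

         def merge(self, to_be_merged, merge_result):
           self.trigger_context.merge_state(to_be_merged, merge_result)
           for window in to_be_merged:
             if window != merge_result:
               self.merged_away[window] = merge_result
               # PaneInfo state is not preserved across merges.
               self.trigger_context.for_window(window).clear_state(None)
           self.trigger_fn.on_merge(
               to_be_merged,
               merge_result,
               self.trigger_context.for_window(merge_result))
       ```

       and the call site becomes
       `TriggerMergeContext(all_windows, context, self.windowing.triggerfn)`.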

##########
File path: sdks/python/apache_beam/runners/portability/fn_api_runner/trigger_manager.py
##########
@@ -0,0 +1,459 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import collections
+import logging
+import typing
+from collections import defaultdict
+
+from apache_beam import typehints
+from apache_beam.coders import PickleCoder
+from apache_beam.coders import StrUtf8Coder
+from apache_beam.coders import TupleCoder
+from apache_beam.coders import VarIntCoder
+from apache_beam.coders.coders import IntervalWindowCoder
+from apache_beam.transforms import DoFn
+from apache_beam.transforms.core import Windowing
+from apache_beam.transforms.timeutil import TimeDomain
+from apache_beam.transforms.trigger import AccumulationMode
+from apache_beam.transforms.trigger import TriggerContext
+from apache_beam.transforms.trigger import _CombiningValueStateTag
+from apache_beam.transforms.trigger import _StateTag
+from apache_beam.transforms.userstate import AccumulatingRuntimeState
+from apache_beam.transforms.userstate import BagRuntimeState
+from apache_beam.transforms.userstate import BagStateSpec
+from apache_beam.transforms.userstate import CombiningValueStateSpec
+from apache_beam.transforms.userstate import RuntimeTimer
+from apache_beam.transforms.userstate import SetRuntimeState
+from apache_beam.transforms.userstate import SetStateSpec
+from apache_beam.transforms.userstate import TimerSpec
+from apache_beam.transforms.userstate import on_timer
+from apache_beam.transforms.window import BoundedWindow
+from apache_beam.transforms.window import GlobalWindow
+from apache_beam.transforms.window import Sessions
+from apache_beam.transforms.window import TimestampedValue
+from apache_beam.transforms.window import WindowFn
+from apache_beam.typehints import TypeCheckError
+from apache_beam.utils import windowed_value
+from apache_beam.utils.timestamp import MIN_TIMESTAMP
+from apache_beam.utils.timestamp import Timestamp
+from apache_beam.utils.windowed_value import WindowedValue
+
+_LOGGER = logging.getLogger(__name__)
+_LOGGER.setLevel(logging.DEBUG)
+
+K = typing.TypeVar('K')
+V = typing.TypeVar('V')
+
+
+class ReifyWindows(DoFn):
+  """Receives KV pairs, and wraps the values into WindowedValues."""
+  def process(
+      self, element, window=DoFn.WindowParam, timestamp=DoFn.TimestampParam):
+    try:
+      k, v = element
+    except TypeError:
+      raise TypeCheckError(
+          'Input to GroupByKey must be a PCollection with '
+          'elements compatible with KV[A, B]')
+
+    yield (k, windowed_value.WindowedValue(v, timestamp, [window]))
+
+
+class _GroupBundlesByKey(DoFn):
+  def start_bundle(self):
+    self.keys = defaultdict(list)
+
+  def process(self, element):
+    key, windowed_value = element
+    self.keys[key].append(windowed_value)
+
+  def finish_bundle(self):
+    for k, vals in self.keys.items():
+      yield windowed_value.WindowedValue((k, vals),
+                                         MIN_TIMESTAMP, [GlobalWindow()])
+
+
+def read_watermark(watermark_state):
+  try:
+    return watermark_state.read()
+  except ValueError:
+    watermark_state.add(MIN_TIMESTAMP)
+    return watermark_state.read()
+
+
+class TriggerMergeContext(WindowFn.MergeContext):
+  def __init__(
+      self, all_windows, context: 'FnRunnerStatefulTriggerContext', windowing):
+    super(TriggerMergeContext, self).__init__(all_windows)
+    self.trigger_context = context
+    self.windowing = windowing
+    self.merged_away: typing.Dict[BoundedWindow, BoundedWindow] = {}
+
+  def merge(self, to_be_merged, merge_result):
+    _LOGGER.debug("Merging %s into %s", to_be_merged, merge_result)
+    self.trigger_context.merge_state(to_be_merged, merge_result)
+    for window in to_be_merged:
+      if window != merge_result:
+        self.merged_away[window] = merge_result
+        # Clear state associated with PaneInfo since it is
+        # not preserved across merges.
+        self.trigger_context.for_window(window).clear_state(None)
+    self.windowing.triggerfn.on_merge(
+        to_be_merged,
+        merge_result,
+        self.trigger_context.for_window(merge_result))
+
+
+@typehints.with_input_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+@typehints.with_output_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+class GeneralTriggerManagerDoFn(DoFn):
+  """A trigger manager that supports all windowing / triggering cases.
+
+  This implements a DoFn that manages triggering in a per-key basis. All
+  elements for a single key are processed together. Per-key state holds data
+  related to all windows.
+  """
+
+  KNOWN_WINDOWS = SetStateSpec('known_windows', IntervalWindowCoder())
+  FINISHED_WINDOWS = SetStateSpec('finished_windows', IntervalWindowCoder())
+  LAST_KNOWN_TIME = CombiningValueStateSpec('last_known_time', combine_fn=max)
+  LAST_KNOWN_WATERMARK = CombiningValueStateSpec(
+      'last_known_watermark', combine_fn=max)
+
+  # TODO(pabloem) What's the coder for the elements/keys here?
+  WINDOW_ELEMENT_PAIRS = BagStateSpec(
+      'element_bag', TupleCoder([IntervalWindowCoder(), PickleCoder()]))
+  WINDOW_TAG_VALUES = BagStateSpec(
+      'per_window_per_tag_value_state',
+      TupleCoder([IntervalWindowCoder(), StrUtf8Coder(), VarIntCoder()]))
+
+  PROCESSING_TIME_TIMER = TimerSpec(
+      'processing_time_timer', TimeDomain.REAL_TIME)
+  WATERMARK_TIMER = TimerSpec('watermark_timer', TimeDomain.WATERMARK)
+
+  def __init__(self, windowing: Windowing):
+    self.windowing = windowing
+    # Only session windows are merging. Other windows are non-merging.
+    self.merging_windows = isinstance(windowing.windowfn, Sessions)

Review comment:
       I think this should be `windowing.windowfn.is_merging()`, since there can be user-defined merging window types beyond Sessions.
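
       i.e. roughly:

       ```python
       def __init__(self, windowing: Windowing):
         self.windowing = windowing
         # is_merging() covers Sessions as well as custom merging WindowFns.
         self.merging_windows = windowing.windowfn.is_merging()
       ```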

##########
File path: sdks/python/apache_beam/runners/portability/fn_api_runner/trigger_manager.py
##########
@@ -0,0 +1,459 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import collections
+import logging
+import typing
+from collections import defaultdict
+
+from apache_beam import typehints
+from apache_beam.coders import PickleCoder
+from apache_beam.coders import StrUtf8Coder
+from apache_beam.coders import TupleCoder
+from apache_beam.coders import VarIntCoder
+from apache_beam.coders.coders import IntervalWindowCoder
+from apache_beam.transforms import DoFn
+from apache_beam.transforms.core import Windowing
+from apache_beam.transforms.timeutil import TimeDomain
+from apache_beam.transforms.trigger import AccumulationMode
+from apache_beam.transforms.trigger import TriggerContext
+from apache_beam.transforms.trigger import _CombiningValueStateTag
+from apache_beam.transforms.trigger import _StateTag
+from apache_beam.transforms.userstate import AccumulatingRuntimeState
+from apache_beam.transforms.userstate import BagRuntimeState
+from apache_beam.transforms.userstate import BagStateSpec
+from apache_beam.transforms.userstate import CombiningValueStateSpec
+from apache_beam.transforms.userstate import RuntimeTimer
+from apache_beam.transforms.userstate import SetRuntimeState
+from apache_beam.transforms.userstate import SetStateSpec
+from apache_beam.transforms.userstate import TimerSpec
+from apache_beam.transforms.userstate import on_timer
+from apache_beam.transforms.window import BoundedWindow
+from apache_beam.transforms.window import GlobalWindow
+from apache_beam.transforms.window import Sessions
+from apache_beam.transforms.window import TimestampedValue
+from apache_beam.transforms.window import WindowFn
+from apache_beam.typehints import TypeCheckError
+from apache_beam.utils import windowed_value
+from apache_beam.utils.timestamp import MIN_TIMESTAMP
+from apache_beam.utils.timestamp import Timestamp
+from apache_beam.utils.windowed_value import WindowedValue
+
+_LOGGER = logging.getLogger(__name__)
+_LOGGER.setLevel(logging.DEBUG)
+
+K = typing.TypeVar('K')
+V = typing.TypeVar('V')
+
+
+class ReifyWindows(DoFn):
+  """Receives KV pairs, and wraps the values into WindowedValues."""
+  def process(
+      self, element, window=DoFn.WindowParam, timestamp=DoFn.TimestampParam):
+    try:
+      k, v = element
+    except TypeError:
+      raise TypeCheckError(
+          'Input to GroupByKey must be a PCollection with '
+          'elements compatible with KV[A, B]')
+
+    yield (k, windowed_value.WindowedValue(v, timestamp, [window]))
+
+
+class _GroupBundlesByKey(DoFn):
+  def start_bundle(self):
+    self.keys = defaultdict(list)
+
+  def process(self, element):
+    key, windowed_value = element
+    self.keys[key].append(windowed_value)
+
+  def finish_bundle(self):
+    for k, vals in self.keys.items():
+      yield windowed_value.WindowedValue((k, vals),
+                                         MIN_TIMESTAMP, [GlobalWindow()])
+
+
+def read_watermark(watermark_state):
+  try:
+    return watermark_state.read()
+  except ValueError:
+    watermark_state.add(MIN_TIMESTAMP)
+    return watermark_state.read()
+
+
+class TriggerMergeContext(WindowFn.MergeContext):
+  def __init__(
+      self, all_windows, context: 'FnRunnerStatefulTriggerContext', windowing):
+    super(TriggerMergeContext, self).__init__(all_windows)
+    self.trigger_context = context
+    self.windowing = windowing
+    self.merged_away: typing.Dict[BoundedWindow, BoundedWindow] = {}
+
+  def merge(self, to_be_merged, merge_result):
+    _LOGGER.debug("Merging %s into %s", to_be_merged, merge_result)
+    self.trigger_context.merge_state(to_be_merged, merge_result)
+    for window in to_be_merged:
+      if window != merge_result:
+        self.merged_away[window] = merge_result
+        # Clear state associated with PaneInfo since it is
+        # not preserved across merges.
+        self.trigger_context.for_window(window).clear_state(None)
+    self.windowing.triggerfn.on_merge(
+        to_be_merged,
+        merge_result,
+        self.trigger_context.for_window(merge_result))
+
+
+@typehints.with_input_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+@typehints.with_output_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+class GeneralTriggerManagerDoFn(DoFn):
+  """A trigger manager that supports all windowing / triggering cases.
+
+  This implements a DoFn that manages triggering in a per-key basis. All
+  elements for a single key are processed together. Per-key state holds data
+  related to all windows.
+  """
+
+  KNOWN_WINDOWS = SetStateSpec('known_windows', IntervalWindowCoder())
+  FINISHED_WINDOWS = SetStateSpec('finished_windows', IntervalWindowCoder())
+  LAST_KNOWN_TIME = CombiningValueStateSpec('last_known_time', combine_fn=max)
+  LAST_KNOWN_WATERMARK = CombiningValueStateSpec(
+      'last_known_watermark', combine_fn=max)
+
+  # TODO(pabloem) What's the coder for the elements/keys here?
+  WINDOW_ELEMENT_PAIRS = BagStateSpec(
+      'element_bag', TupleCoder([IntervalWindowCoder(), PickleCoder()]))
+  WINDOW_TAG_VALUES = BagStateSpec(
+      'per_window_per_tag_value_state',
+      TupleCoder([IntervalWindowCoder(), StrUtf8Coder(), VarIntCoder()]))
+
+  PROCESSING_TIME_TIMER = TimerSpec(
+      'processing_time_timer', TimeDomain.REAL_TIME)
+  WATERMARK_TIMER = TimerSpec('watermark_timer', TimeDomain.WATERMARK)
+
+  def __init__(self, windowing: Windowing):
+    self.windowing = windowing
+    # Only session windows are merging. Other windows are non-merging.
+    self.merging_windows = isinstance(windowing.windowfn, Sessions)
+
+  def process(
+      self,
+      element: typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]],
+      element_bag: BagRuntimeState = DoFn.StateParam(WINDOW_ELEMENT_PAIRS),  # type: ignore
+      time_state: AccumulatingRuntimeState = DoFn.StateParam(LAST_KNOWN_TIME),  # type: ignore
+      watermark_state: AccumulatingRuntimeState = DoFn.StateParam(  # type: ignore
+          LAST_KNOWN_WATERMARK),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      windows_state: SetRuntimeState = DoFn.StateParam(KNOWN_WINDOWS),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER),
+      *args, **kwargs):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        current_time_state=time_state,
+        watermark_state=watermark_state,
+        elements_bag_state=element_bag,
+        window_tag_values_bag_state=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    key, windowed_values = element
+    watermark = read_watermark(watermark_state)
+
+    windows_to_elements = collections.defaultdict(list)
+    for wv in windowed_values:
+      for window in wv.windows:
+        # ignore expired windows
+        if watermark > window.end + self.windowing.allowed_lateness:
+          continue
+        if window in finished_windows_state.read():
+          continue
+        windows_to_elements[window].append(
+            TimestampedValue(wv.value, wv.timestamp))
+
+    # Processing merging of windows
+    if self.merging_windows:
+      old_windows = set(windows_state.read())
+      all_windows = old_windows.union(list(windows_to_elements))
+      if all_windows != old_windows:
+        merge_context = TriggerMergeContext(
+            all_windows, context, self.windowing)
+        self.windowing.windowfn.merge(merge_context)
+
+        merged_windows_to_elements = collections.defaultdict(list)
+        for window, values in windows_to_elements.items():
+          while window in merge_context.merged_away:
+            window = merge_context.merged_away[window]
+          merged_windows_to_elements[window].extend(values)
+        windows_to_elements = merged_windows_to_elements
+
+      if old_windows != all_windows:
+        for w in windows_to_elements:
+          windows_state.add(w)
+    # Done processing merging of windows
+
+    seen_windows = set()
+    for w in windows_to_elements:
+      window_context = context.for_window(w)
+      seen_windows.add(w)
+      for value_w_timestamp in windows_to_elements[w]:
+        _LOGGER.debug(value_w_timestamp)
+        element_bag.add((w, value_w_timestamp))
+        self.windowing.triggerfn.on_element(windowed_values, w, window_context)
+
+    return self._trigger_fire(
+        key, TimeDomain.WATERMARK, watermark, None, context, seen_windows)
+
+  def _trigger_fire(
+      self,
+      key: K,
+      time_domain,
+      timestamp: Timestamp,
+      timer_tag: typing.Optional[str],
+      context: 'FnRunnerStatefulTriggerContext',
+      windows_of_interest: typing.Optional[typing.Set[BoundedWindow]] = None):
+    windows_to_elements = context.windows_to_elements_map()
+    context.elements_bag_state.clear()
+
+    fired_windows = set()
+    finished_windows = set()
+    _LOGGER.debug(
+        '%s - tag %s - timestamp %s', time_domain, timer_tag, timestamp)
+    for w, elems in windows_to_elements.items():
+      if windows_of_interest is not None and w not in windows_of_interest:
+        # windows_of_interest=None means that we care about all windows.
+        # If we care only about some windows, and this window is not one of
+        # them, then we do not intend to fire this window.
+        continue
+      window_context = context.for_window(w)
+      if self.windowing.triggerfn.should_fire(time_domain,
+                                              timestamp,
+                                              w,
+                                              window_context):
+        finished = self.windowing.triggerfn.on_fire(
+            timestamp, w, window_context)
+        _LOGGER.debug('Firing on window %s. Finished: %s', w, finished)
+        fired_windows.add(w)
+        if finished:
+          context.finished_windows_state.add(w)
+          finished_windows.add(w)

Review comment:
       Does it make sense to just use `context.finished_windows_state` here instead of keeping a separate local `finished_windows` set?
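
       Roughly (just a sketch, assuming an extra read of the set state is
       acceptable here):

       ```python
       # In the re-add loop at the end of _trigger_fire: read the finished
       # windows from state instead of keeping a separate local set.
       finished = set(context.finished_windows_state.read())
       for w, elems in windows_to_elements.items():
         for e in elems:
           if (w in finished or
               (w in fired_windows and
                self.windowing.accumulation_mode == AccumulationMode.DISCARDING)):
             continue
           context.elements_bag_state.add((w, e))
       ```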

##########
File path: sdks/python/apache_beam/runners/portability/fn_api_runner/trigger_manager.py
##########
@@ -0,0 +1,459 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import collections
+import logging
+import typing
+from collections import defaultdict
+
+from apache_beam import typehints
+from apache_beam.coders import PickleCoder
+from apache_beam.coders import StrUtf8Coder
+from apache_beam.coders import TupleCoder
+from apache_beam.coders import VarIntCoder
+from apache_beam.coders.coders import IntervalWindowCoder
+from apache_beam.transforms import DoFn
+from apache_beam.transforms.core import Windowing
+from apache_beam.transforms.timeutil import TimeDomain
+from apache_beam.transforms.trigger import AccumulationMode
+from apache_beam.transforms.trigger import TriggerContext
+from apache_beam.transforms.trigger import _CombiningValueStateTag
+from apache_beam.transforms.trigger import _StateTag
+from apache_beam.transforms.userstate import AccumulatingRuntimeState
+from apache_beam.transforms.userstate import BagRuntimeState
+from apache_beam.transforms.userstate import BagStateSpec
+from apache_beam.transforms.userstate import CombiningValueStateSpec
+from apache_beam.transforms.userstate import RuntimeTimer
+from apache_beam.transforms.userstate import SetRuntimeState
+from apache_beam.transforms.userstate import SetStateSpec
+from apache_beam.transforms.userstate import TimerSpec
+from apache_beam.transforms.userstate import on_timer
+from apache_beam.transforms.window import BoundedWindow
+from apache_beam.transforms.window import GlobalWindow
+from apache_beam.transforms.window import Sessions
+from apache_beam.transforms.window import TimestampedValue
+from apache_beam.transforms.window import WindowFn
+from apache_beam.typehints import TypeCheckError
+from apache_beam.utils import windowed_value
+from apache_beam.utils.timestamp import MIN_TIMESTAMP
+from apache_beam.utils.timestamp import Timestamp
+from apache_beam.utils.windowed_value import WindowedValue
+
+_LOGGER = logging.getLogger(__name__)
+_LOGGER.setLevel(logging.DEBUG)
+
+K = typing.TypeVar('K')
+V = typing.TypeVar('V')
+
+
+class ReifyWindows(DoFn):
+  """Receives KV pairs, and wraps the values into WindowedValues."""
+  def process(
+      self, element, window=DoFn.WindowParam, timestamp=DoFn.TimestampParam):
+    try:
+      k, v = element
+    except TypeError:
+      raise TypeCheckError(
+          'Input to GroupByKey must be a PCollection with '
+          'elements compatible with KV[A, B]')
+
+    yield (k, windowed_value.WindowedValue(v, timestamp, [window]))
+
+
+class _GroupBundlesByKey(DoFn):
+  def start_bundle(self):
+    self.keys = defaultdict(list)
+
+  def process(self, element):
+    key, windowed_value = element
+    self.keys[key].append(windowed_value)
+
+  def finish_bundle(self):
+    for k, vals in self.keys.items():
+      yield windowed_value.WindowedValue((k, vals),
+                                         MIN_TIMESTAMP, [GlobalWindow()])
+
+
+def read_watermark(watermark_state):
+  try:
+    return watermark_state.read()
+  except ValueError:
+    watermark_state.add(MIN_TIMESTAMP)
+    return watermark_state.read()
+
+
+class TriggerMergeContext(WindowFn.MergeContext):
+  def __init__(
+      self, all_windows, context: 'FnRunnerStatefulTriggerContext', windowing):
+    super(TriggerMergeContext, self).__init__(all_windows)
+    self.trigger_context = context
+    self.windowing = windowing
+    self.merged_away: typing.Dict[BoundedWindow, BoundedWindow] = {}
+
+  def merge(self, to_be_merged, merge_result):
+    _LOGGER.debug("Merging %s into %s", to_be_merged, merge_result)
+    self.trigger_context.merge_state(to_be_merged, merge_result)
+    for window in to_be_merged:
+      if window != merge_result:
+        self.merged_away[window] = merge_result
+        # Clear state associated with PaneInfo since it is
+        # not preserved across merges.
+        self.trigger_context.for_window(window).clear_state(None)
+    self.windowing.triggerfn.on_merge(
+        to_be_merged,
+        merge_result,
+        self.trigger_context.for_window(merge_result))
+
+
+@typehints.with_input_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+@typehints.with_output_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+class GeneralTriggerManagerDoFn(DoFn):
+  """A trigger manager that supports all windowing / triggering cases.
+
+  This implements a DoFn that manages triggering in a per-key basis. All
+  elements for a single key are processed together. Per-key state holds data
+  related to all windows.
+  """
+
+  KNOWN_WINDOWS = SetStateSpec('known_windows', IntervalWindowCoder())
+  FINISHED_WINDOWS = SetStateSpec('finished_windows', IntervalWindowCoder())
+  LAST_KNOWN_TIME = CombiningValueStateSpec('last_known_time', combine_fn=max)
+  LAST_KNOWN_WATERMARK = CombiningValueStateSpec(
+      'last_known_watermark', combine_fn=max)
+
+  # TODO(pabloem) What's the coder for the elements/keys here?
+  WINDOW_ELEMENT_PAIRS = BagStateSpec(
+      'element_bag', TupleCoder([IntervalWindowCoder(), PickleCoder()]))
+  WINDOW_TAG_VALUES = BagStateSpec(
+      'per_window_per_tag_value_state',
+      TupleCoder([IntervalWindowCoder(), StrUtf8Coder(), VarIntCoder()]))
+
+  PROCESSING_TIME_TIMER = TimerSpec(
+      'processing_time_timer', TimeDomain.REAL_TIME)
+  WATERMARK_TIMER = TimerSpec('watermark_timer', TimeDomain.WATERMARK)
+
+  def __init__(self, windowing: Windowing):
+    self.windowing = windowing
+    # Only session windows are merging. Other windows are non-merging.
+    self.merging_windows = isinstance(windowing.windowfn, Sessions)
+
+  def process(
+      self,
+      element: typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]],
+      element_bag: BagRuntimeState = DoFn.StateParam(WINDOW_ELEMENT_PAIRS),  # type: ignore
+      time_state: AccumulatingRuntimeState = DoFn.StateParam(LAST_KNOWN_TIME),  # type: ignore
+      watermark_state: AccumulatingRuntimeState = DoFn.StateParam(  # type: ignore
+          LAST_KNOWN_WATERMARK),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      windows_state: SetRuntimeState = DoFn.StateParam(KNOWN_WINDOWS),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER),
+      *args, **kwargs):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        current_time_state=time_state,
+        watermark_state=watermark_state,
+        elements_bag_state=element_bag,
+        window_tag_values_bag_state=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    key, windowed_values = element
+    watermark = read_watermark(watermark_state)
+
+    windows_to_elements = collections.defaultdict(list)
+    for wv in windowed_values:
+      for window in wv.windows:
+        # ignore expired windows
+        if watermark > window.end + self.windowing.allowed_lateness:
+          continue
+        if window in finished_windows_state.read():
+          continue
+        windows_to_elements[window].append(
+            TimestampedValue(wv.value, wv.timestamp))
+
+    # Processing merging of windows
+    if self.merging_windows:
+      old_windows = set(windows_state.read())
+      all_windows = old_windows.union(list(windows_to_elements))
+      if all_windows != old_windows:
+        merge_context = TriggerMergeContext(
+            all_windows, context, self.windowing)
+        self.windowing.windowfn.merge(merge_context)
+
+        merged_windows_to_elements = collections.defaultdict(list)
+        for window, values in windows_to_elements.items():
+          while window in merge_context.merged_away:
+            window = merge_context.merged_away[window]
+          merged_windows_to_elements[window].extend(values)
+        windows_to_elements = merged_windows_to_elements
+
+      if old_windows != all_windows:

Review comment:
       This check (`old_windows != all_windows`) seems to duplicate the comparison made just above?
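
   For illustration only (not the author's code), a minimal sketch of how the `windows_state.add(w)` loop could be folded into the branch above, since `old_windows != all_windows` is already known to hold there:

   ```python
   # Sketch only: the same merging logic as in the diff above, with the
   # duplicate comparison removed. All names refer to objects in the quoted
   # code.
   if self.merging_windows:
     old_windows = set(windows_state.read())
     all_windows = old_windows.union(list(windows_to_elements))
     if all_windows != old_windows:
       merge_context = TriggerMergeContext(
           all_windows, context, self.windowing)
       self.windowing.windowfn.merge(merge_context)

       merged_windows_to_elements = collections.defaultdict(list)
       for window, values in windows_to_elements.items():
         # Follow the chain of merges until a surviving window is reached.
         while window in merge_context.merged_away:
           window = merge_context.merged_away[window]
         merged_windows_to_elements[window].extend(values)
       windows_to_elements = merged_windows_to_elements

       # Record the surviving windows here; old_windows != all_windows is
       # already known to hold inside this branch.
       for w in windows_to_elements:
         windows_state.add(w)
   ```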

##########
File path: sdks/python/apache_beam/runners/portability/fn_api_runner/trigger_manager.py
##########
@@ -0,0 +1,459 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import collections
+import logging
+import typing
+from collections import defaultdict
+
+from apache_beam import typehints
+from apache_beam.coders import PickleCoder
+from apache_beam.coders import StrUtf8Coder
+from apache_beam.coders import TupleCoder
+from apache_beam.coders import VarIntCoder
+from apache_beam.coders.coders import IntervalWindowCoder
+from apache_beam.transforms import DoFn
+from apache_beam.transforms.core import Windowing
+from apache_beam.transforms.timeutil import TimeDomain
+from apache_beam.transforms.trigger import AccumulationMode
+from apache_beam.transforms.trigger import TriggerContext
+from apache_beam.transforms.trigger import _CombiningValueStateTag
+from apache_beam.transforms.trigger import _StateTag
+from apache_beam.transforms.userstate import AccumulatingRuntimeState
+from apache_beam.transforms.userstate import BagRuntimeState
+from apache_beam.transforms.userstate import BagStateSpec
+from apache_beam.transforms.userstate import CombiningValueStateSpec
+from apache_beam.transforms.userstate import RuntimeTimer
+from apache_beam.transforms.userstate import SetRuntimeState
+from apache_beam.transforms.userstate import SetStateSpec
+from apache_beam.transforms.userstate import TimerSpec
+from apache_beam.transforms.userstate import on_timer
+from apache_beam.transforms.window import BoundedWindow
+from apache_beam.transforms.window import GlobalWindow
+from apache_beam.transforms.window import Sessions
+from apache_beam.transforms.window import TimestampedValue
+from apache_beam.transforms.window import WindowFn
+from apache_beam.typehints import TypeCheckError
+from apache_beam.utils import windowed_value
+from apache_beam.utils.timestamp import MIN_TIMESTAMP
+from apache_beam.utils.timestamp import Timestamp
+from apache_beam.utils.windowed_value import WindowedValue
+
+_LOGGER = logging.getLogger(__name__)
+_LOGGER.setLevel(logging.DEBUG)
+
+K = typing.TypeVar('K')
+V = typing.TypeVar('V')
+
+
+class ReifyWindows(DoFn):
+  """Receives KV pairs, and wraps the values into WindowedValues."""
+  def process(
+      self, element, window=DoFn.WindowParam, timestamp=DoFn.TimestampParam):
+    try:
+      k, v = element
+    except TypeError:
+      raise TypeCheckError(
+          'Input to GroupByKey must be a PCollection with '
+          'elements compatible with KV[A, B]')
+
+    yield (k, windowed_value.WindowedValue(v, timestamp, [window]))
+
+
+class _GroupBundlesByKey(DoFn):
+  def start_bundle(self):
+    self.keys = defaultdict(list)
+
+  def process(self, element):
+    key, windowed_value = element
+    self.keys[key].append(windowed_value)
+
+  def finish_bundle(self):
+    for k, vals in self.keys.items():
+      yield windowed_value.WindowedValue((k, vals),
+                                         MIN_TIMESTAMP, [GlobalWindow()])
+
+
+def read_watermark(watermark_state):
+  try:
+    return watermark_state.read()
+  except ValueError:
+    watermark_state.add(MIN_TIMESTAMP)
+    return watermark_state.read()
+
+
+class TriggerMergeContext(WindowFn.MergeContext):
+  def __init__(
+      self, all_windows, context: 'FnRunnerStatefulTriggerContext', windowing):
+    super(TriggerMergeContext, self).__init__(all_windows)
+    self.trigger_context = context
+    self.windowing = windowing
+    self.merged_away: typing.Dict[BoundedWindow, BoundedWindow] = {}
+
+  def merge(self, to_be_merged, merge_result):
+    _LOGGER.debug("Merging %s into %s", to_be_merged, merge_result)
+    self.trigger_context.merge_state(to_be_merged, merge_result)
+    for window in to_be_merged:
+      if window != merge_result:
+        self.merged_away[window] = merge_result
+        # Clear state associated with PaneInfo since it is
+        # not preserved across merges.
+        self.trigger_context.for_window(window).clear_state(None)
+    self.windowing.triggerfn.on_merge(
+        to_be_merged,
+        merge_result,
+        self.trigger_context.for_window(merge_result))
+
+
+@typehints.with_input_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+@typehints.with_output_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+class GeneralTriggerManagerDoFn(DoFn):
+  """A trigger manager that supports all windowing / triggering cases.
+
+  This implements a DoFn that manages triggering on a per-key basis. All
+  elements for a single key are processed together. Per-key state holds data
+  related to all windows.
+  """
+
+  KNOWN_WINDOWS = SetStateSpec('known_windows', IntervalWindowCoder())
+  FINISHED_WINDOWS = SetStateSpec('finished_windows', IntervalWindowCoder())
+  LAST_KNOWN_TIME = CombiningValueStateSpec('last_known_time', combine_fn=max)
+  LAST_KNOWN_WATERMARK = CombiningValueStateSpec(
+      'last_known_watermark', combine_fn=max)
+
+  # TODO(pabloem) What's the coder for the elements/keys here?
+  WINDOW_ELEMENT_PAIRS = BagStateSpec(
+      'element_bag', TupleCoder([IntervalWindowCoder(), PickleCoder()]))
+  WINDOW_TAG_VALUES = BagStateSpec(
+      'per_window_per_tag_value_state',
+      TupleCoder([IntervalWindowCoder(), StrUtf8Coder(), VarIntCoder()]))
+
+  PROCESSING_TIME_TIMER = TimerSpec(
+      'processing_time_timer', TimeDomain.REAL_TIME)
+  WATERMARK_TIMER = TimerSpec('watermark_timer', TimeDomain.WATERMARK)
+
+  def __init__(self, windowing: Windowing):
+    self.windowing = windowing
+    # Only session windows are merging. Other windows are non-merging.
+    self.merging_windows = isinstance(windowing.windowfn, Sessions)
+
+  def process(
+      self,
+      element: typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]],
+      element_bag: BagRuntimeState = DoFn.StateParam(WINDOW_ELEMENT_PAIRS),  # type: ignore
+      time_state: AccumulatingRuntimeState = DoFn.StateParam(LAST_KNOWN_TIME),  # type: ignore
+      watermark_state: AccumulatingRuntimeState = DoFn.StateParam(  # type: ignore
+          LAST_KNOWN_WATERMARK),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      windows_state: SetRuntimeState = DoFn.StateParam(KNOWN_WINDOWS),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER),
+      *args, **kwargs):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        current_time_state=time_state,
+        watermark_state=watermark_state,
+        elements_bag_state=element_bag,
+        window_tag_values_bag_state=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    key, windowed_values = element
+    watermark = read_watermark(watermark_state)
+
+    windows_to_elements = collections.defaultdict(list)
+    for wv in windowed_values:
+      for window in wv.windows:
+        # ignore expired windows
+        if watermark > window.end + self.windowing.allowed_lateness:
+          continue
+        if window in finished_windows_state.read():
+          continue
+        windows_to_elements[window].append(
+            TimestampedValue(wv.value, wv.timestamp))
+
+    # Processing merging of windows
+    if self.merging_windows:
+      old_windows = set(windows_state.read())
+      all_windows = old_windows.union(list(windows_to_elements))
+      if all_windows != old_windows:
+        merge_context = TriggerMergeContext(
+            all_windows, context, self.windowing)
+        self.windowing.windowfn.merge(merge_context)
+
+        merged_windows_to_elements = collections.defaultdict(list)
+        for window, values in windows_to_elements.items():
+          while window in merge_context.merged_away:
+            window = merge_context.merged_away[window]
+          merged_windows_to_elements[window].extend(values)
+        windows_to_elements = merged_windows_to_elements
+
+      if old_windows != all_windows:
+        for w in windows_to_elements:
+          windows_state.add(w)
+    # Done processing merging of windows
+
+    seen_windows = set()
+    for w in windows_to_elements:
+      window_context = context.for_window(w)
+      seen_windows.add(w)
+      for value_w_timestamp in windows_to_elements[w]:
+        _LOGGER.debug(value_w_timestamp)
+        element_bag.add((w, value_w_timestamp))
+        self.windowing.triggerfn.on_element(windowed_values, w, window_context)
+
+    return self._trigger_fire(
+        key, TimeDomain.WATERMARK, watermark, None, context, seen_windows)
+
+  def _trigger_fire(
+      self,
+      key: K,
+      time_domain,
+      timestamp: Timestamp,
+      timer_tag: typing.Optional[str],
+      context: 'FnRunnerStatefulTriggerContext',
+      windows_of_interest: typing.Optional[typing.Set[BoundedWindow]] = None):
+    windows_to_elements = context.windows_to_elements_map()
+    context.elements_bag_state.clear()
+
+    fired_windows = set()
+    finished_windows = set()
+    _LOGGER.debug(
+        '%s - tag %s - timestamp %s', time_domain, timer_tag, timestamp)
+    for w, elems in windows_to_elements.items():
+      if windows_of_interest is not None and w not in windows_of_interest:
+        # windows_of_interest=None means that we care about all windows.
+        # If we care only about some windows, and this window is not one of
+        # them, then we do not intend to fire this window.
+        continue
+      window_context = context.for_window(w)
+      if self.windowing.triggerfn.should_fire(time_domain,
+                                              timestamp,
+                                              w,
+                                              window_context):
+        finished = self.windowing.triggerfn.on_fire(
+            timestamp, w, window_context)
+        _LOGGER.debug('Firing on window %s. Finished: %s', w, finished)
+        fired_windows.add(w)
+        if finished:
+          context.finished_windows_state.add(w)
+          finished_windows.add(w)
+        # TODO(pabloem): Format the output: e.g. pane info
+        elems = [WindowedValue(e.value, e.timestamp, (w, )) for e in elems]
+        yield (key, elems)
+
+    # Add elements that were not fired back into state.
+    for w, elems in windows_to_elements.items():
+      for e in elems:
+        if (w in finished_windows or
+            (w in fired_windows and
+             self.windowing.accumulation_mode == AccumulationMode.DISCARDING)):
+          continue
+        context.elements_bag_state.add((w, e))
+
+  @on_timer(PROCESSING_TIME_TIMER)
+  def processing_time_trigger(
+      self,
+      key=DoFn.KeyParam,
+      timer_tag=DoFn.DynamicTimerTagParam,
+      timestamp=DoFn.TimestampParam,
+      processing_time_state=DoFn.StateParam(LAST_KNOWN_TIME),
+      element_bag=DoFn.StateParam(WINDOW_ELEMENT_PAIRS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER)):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        current_time_state=processing_time_state,
+        watermark_state=None,
+        elements_bag_state=element_bag,
+        window_tag_values_bag_state=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    result = self._trigger_fire(
+        key, TimeDomain.REAL_TIME, timestamp, timer_tag, context)
+    processing_time_state.add(timestamp)
+    return result
+
+  @on_timer(WATERMARK_TIMER)
+  def watermark_trigger(
+      self,
+      key=DoFn.KeyParam,
+      timer_tag=DoFn.DynamicTimerTagParam,
+      timestamp=DoFn.TimestampParam,
+      watermark_state=DoFn.StateParam(LAST_KNOWN_WATERMARK),
+      element_bag=DoFn.StateParam(WINDOW_ELEMENT_PAIRS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER)):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        current_time_state=None,
+        watermark_state=watermark_state,
+        elements_bag_state=element_bag,
+        window_tag_values_bag_state=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    result = self._trigger_fire(
+        key, TimeDomain.WATERMARK, timestamp, timer_tag, context)
+    watermark_state.add(timestamp)
+    return result
+
+
+class FnRunnerStatefulTriggerContext(TriggerContext):
+  def __init__(
+      self,
+      processing_time_timer: RuntimeTimer,
+      watermark_timer: RuntimeTimer,
+      current_time_state: typing.Optional[AccumulatingRuntimeState],
+      watermark_state: typing.Optional[AccumulatingRuntimeState],
+      elements_bag_state: BagRuntimeState,
+      window_tag_values_bag_state: BagRuntimeState,
+      finished_windows_state: SetRuntimeState):
+    self.timers = {
+        TimeDomain.REAL_TIME: processing_time_timer,
+        TimeDomain.WATERMARK: watermark_timer
+    }
+    self.current_times = {
+        TimeDomain.REAL_TIME: current_time_state,
+        TimeDomain.WATERMARK: watermark_state
+    }
+    self.elements_bag_state = elements_bag_state
+    self.window_tag_values_bag_state = window_tag_values_bag_state
+    self.finished_windows_state = finished_windows_state
+
+  def windows_to_elements_map(
+      self
+  ) -> typing.Dict[BoundedWindow, typing.List[windowed_value.WindowedValue]]:
+    window_element_pairs: typing.Iterable[typing.Tuple[
+        BoundedWindow,
+        windowed_value.WindowedValue]] = self.elements_bag_state.read()
+    result: typing.Dict[BoundedWindow,
+                        typing.List[windowed_value.WindowedValue]] = {}
+    for w, e in window_element_pairs:
+      if w not in result:
+        result[w] = []
+      result[w].append(e)
+    return result
+
+  def for_window(self, window):
+    return PerWindowTriggerContext(window, self)
+
+  def get_current_time(self):
+    return self.current_times[TimeDomain.REAL_TIME].read()
+
+  def set_timer(self, name, time_domain, timestamp):
+    _LOGGER.debug('Setting timer (%s, %s) at %s', time_domain, name, timestamp)
+    self.timers[time_domain].set(timestamp, dynamic_timer_tag=name)
+
+  def clear_timer(self, name, time_domain):
+    _LOGGER.debug('Clearing timer (%s, %s)', time_domain, name)
+    self.timers[time_domain].clear(dynamic_timer_tag=name)
+
+  def merge_state(self, to_be_merged, merge_result):
+    all_triplets = self.window_tag_values_bag_state.read()

Review comment:
       I'm a little confused about the content of all_triplets. Is this supposed to store window state for pane_info? How does it get updated when merging? I'm assuming the pane_info state should be updated when the merged window fires, rather than when the merge happens.
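
   For context, each entry in that bag (per the WINDOW_TAG_VALUES spec above) is a (window, tag, value) triple. Below is a minimal sketch, with hypothetical windows and tag names, of the re-labelling that merge_state performs; the tail of merge_state is cut off in this quoted hunk, so the final join shown here follows the comment in the diff rather than the exact code:

   ```python
   # Hypothetical data: w1 and w2 are merged away into w3; the tag name and
   # counts are made up for illustration only.
   to_be_merged = {'w1', 'w2'}
   merge_result = 'w3'
   all_triplets = [
       ('w1', 'count', 2),
       ('w2', 'count', 3),
       ('w3', 'count', 1),
   ]

   # Triples belonging to merged-away windows are re-labelled with the
   # merge result...
   merging_away_triplets = [(merge_result, tag, value)
                            for window, tag, value in all_triplets
                            if window in to_be_merged]
   # ...and kept alongside the triples that were not affected by the merge.
   resulting_triplets = [
       t for t in all_triplets if t[0] not in to_be_merged
   ] + merging_away_triplets

   print(resulting_triplets)
   # [('w3', 'count', 1), ('w3', 'count', 2), ('w3', 'count', 3)]
   ```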

##########
File path: sdks/python/apache_beam/runners/portability/fn_api_runner/trigger_manager.py
##########
@@ -0,0 +1,459 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import collections
+import logging
+import typing
+from collections import defaultdict
+
+from apache_beam import typehints
+from apache_beam.coders import PickleCoder
+from apache_beam.coders import StrUtf8Coder
+from apache_beam.coders import TupleCoder
+from apache_beam.coders import VarIntCoder
+from apache_beam.coders.coders import IntervalWindowCoder
+from apache_beam.transforms import DoFn
+from apache_beam.transforms.core import Windowing
+from apache_beam.transforms.timeutil import TimeDomain
+from apache_beam.transforms.trigger import AccumulationMode
+from apache_beam.transforms.trigger import TriggerContext
+from apache_beam.transforms.trigger import _CombiningValueStateTag
+from apache_beam.transforms.trigger import _StateTag
+from apache_beam.transforms.userstate import AccumulatingRuntimeState
+from apache_beam.transforms.userstate import BagRuntimeState
+from apache_beam.transforms.userstate import BagStateSpec
+from apache_beam.transforms.userstate import CombiningValueStateSpec
+from apache_beam.transforms.userstate import RuntimeTimer
+from apache_beam.transforms.userstate import SetRuntimeState
+from apache_beam.transforms.userstate import SetStateSpec
+from apache_beam.transforms.userstate import TimerSpec
+from apache_beam.transforms.userstate import on_timer
+from apache_beam.transforms.window import BoundedWindow
+from apache_beam.transforms.window import GlobalWindow
+from apache_beam.transforms.window import Sessions
+from apache_beam.transforms.window import TimestampedValue
+from apache_beam.transforms.window import WindowFn
+from apache_beam.typehints import TypeCheckError
+from apache_beam.utils import windowed_value
+from apache_beam.utils.timestamp import MIN_TIMESTAMP
+from apache_beam.utils.timestamp import Timestamp
+from apache_beam.utils.windowed_value import WindowedValue
+
+_LOGGER = logging.getLogger(__name__)
+_LOGGER.setLevel(logging.DEBUG)
+
+K = typing.TypeVar('K')
+V = typing.TypeVar('V')
+
+
+class ReifyWindows(DoFn):
+  """Receives KV pairs, and wraps the values into WindowedValues."""
+  def process(
+      self, element, window=DoFn.WindowParam, timestamp=DoFn.TimestampParam):
+    try:
+      k, v = element
+    except TypeError:
+      raise TypeCheckError(
+          'Input to GroupByKey must be a PCollection with '
+          'elements compatible with KV[A, B]')
+
+    yield (k, windowed_value.WindowedValue(v, timestamp, [window]))
+
+
+class _GroupBundlesByKey(DoFn):
+  def start_bundle(self):
+    self.keys = defaultdict(list)
+
+  def process(self, element):
+    key, windowed_value = element
+    self.keys[key].append(windowed_value)
+
+  def finish_bundle(self):
+    for k, vals in self.keys.items():
+      yield windowed_value.WindowedValue((k, vals),
+                                         MIN_TIMESTAMP, [GlobalWindow()])
+
+
+def read_watermark(watermark_state):
+  try:
+    return watermark_state.read()
+  except ValueError:
+    watermark_state.add(MIN_TIMESTAMP)
+    return watermark_state.read()
+
+
+class TriggerMergeContext(WindowFn.MergeContext):
+  def __init__(
+      self, all_windows, context: 'FnRunnerStatefulTriggerContext', windowing):
+    super(TriggerMergeContext, self).__init__(all_windows)
+    self.trigger_context = context
+    self.windowing = windowing
+    self.merged_away: typing.Dict[BoundedWindow, BoundedWindow] = {}
+
+  def merge(self, to_be_merged, merge_result):
+    _LOGGER.debug("Merging %s into %s", to_be_merged, merge_result)
+    self.trigger_context.merge_state(to_be_merged, merge_result)
+    for window in to_be_merged:
+      if window != merge_result:
+        self.merged_away[window] = merge_result
+        # Clear state associated with PaneInfo since it is
+        # not preserved across merges.
+        self.trigger_context.for_window(window).clear_state(None)
+    self.windowing.triggerfn.on_merge(
+        to_be_merged,
+        merge_result,
+        self.trigger_context.for_window(merge_result))
+
+
+@typehints.with_input_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+@typehints.with_output_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+class GeneralTriggerManagerDoFn(DoFn):
+  """A trigger manager that supports all windowing / triggering cases.
+
+  This implements a DoFn that manages triggering on a per-key basis. All
+  elements for a single key are processed together. Per-key state holds data
+  related to all windows.
+  """
+
+  KNOWN_WINDOWS = SetStateSpec('known_windows', IntervalWindowCoder())
+  FINISHED_WINDOWS = SetStateSpec('finished_windows', IntervalWindowCoder())
+  LAST_KNOWN_TIME = CombiningValueStateSpec('last_known_time', combine_fn=max)
+  LAST_KNOWN_WATERMARK = CombiningValueStateSpec(
+      'last_known_watermark', combine_fn=max)
+
+  # TODO(pabloem) What's the coder for the elements/keys here?
+  WINDOW_ELEMENT_PAIRS = BagStateSpec(
+      'element_bag', TupleCoder([IntervalWindowCoder(), PickleCoder()]))
+  WINDOW_TAG_VALUES = BagStateSpec(
+      'per_window_per_tag_value_state',
+      TupleCoder([IntervalWindowCoder(), StrUtf8Coder(), VarIntCoder()]))
+
+  PROCESSING_TIME_TIMER = TimerSpec(
+      'processing_time_timer', TimeDomain.REAL_TIME)
+  WATERMARK_TIMER = TimerSpec('watermark_timer', TimeDomain.WATERMARK)
+
+  def __init__(self, windowing: Windowing):
+    self.windowing = windowing
+    # Only session windows are merging. Other windows are non-merging.
+    self.merging_windows = isinstance(windowing.windowfn, Sessions)
+
+  def process(
+      self,
+      element: typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]],
+      element_bag: BagRuntimeState = DoFn.StateParam(WINDOW_ELEMENT_PAIRS),  # type: ignore
+      time_state: AccumulatingRuntimeState = DoFn.StateParam(LAST_KNOWN_TIME),  # type: ignore
+      watermark_state: AccumulatingRuntimeState = DoFn.StateParam(  # type: ignore
+          LAST_KNOWN_WATERMARK),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      windows_state: SetRuntimeState = DoFn.StateParam(KNOWN_WINDOWS),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER),
+      *args, **kwargs):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        current_time_state=time_state,
+        watermark_state=watermark_state,
+        elements_bag_state=element_bag,
+        window_tag_values_bag_state=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    key, windowed_values = element
+    watermark = read_watermark(watermark_state)
+
+    windows_to_elements = collections.defaultdict(list)
+    for wv in windowed_values:
+      for window in wv.windows:
+        # ignore expired windows
+        if watermark > window.end + self.windowing.allowed_lateness:
+          continue
+        if window in finished_windows_state.read():
+          continue
+        windows_to_elements[window].append(
+            TimestampedValue(wv.value, wv.timestamp))
+
+    # Processing merging of windows
+    if self.merging_windows:
+      old_windows = set(windows_state.read())
+      all_windows = old_windows.union(list(windows_to_elements))
+      if all_windows != old_windows:
+        merge_context = TriggerMergeContext(
+            all_windows, context, self.windowing)
+        self.windowing.windowfn.merge(merge_context)
+
+        merged_windows_to_elements = collections.defaultdict(list)
+        for window, values in windows_to_elements.items():
+          while window in merge_context.merged_away:
+            window = merge_context.merged_away[window]
+          merged_windows_to_elements[window].extend(values)
+        windows_to_elements = merged_windows_to_elements
+
+      if old_windows != all_windows:
+        for w in windows_to_elements:
+          windows_state.add(w)
+    # Done processing merging of windows
+
+    seen_windows = set()
+    for w in windows_to_elements:
+      window_context = context.for_window(w)
+      seen_windows.add(w)
+      for value_w_timestamp in windows_to_elements[w]:
+        _LOGGER.debug(value_w_timestamp)
+        element_bag.add((w, value_w_timestamp))
+        self.windowing.triggerfn.on_element(windowed_values, w, window_context)
+
+    return self._trigger_fire(
+        key, TimeDomain.WATERMARK, watermark, None, context, seen_windows)
+
+  def _trigger_fire(
+      self,
+      key: K,
+      time_domain,
+      timestamp: Timestamp,
+      timer_tag: typing.Optional[str],
+      context: 'FnRunnerStatefulTriggerContext',
+      windows_of_interest: typing.Optional[typing.Set[BoundedWindow]] = None):
+    windows_to_elements = context.windows_to_elements_map()
+    context.elements_bag_state.clear()
+
+    fired_windows = set()
+    finished_windows = set()
+    _LOGGER.debug(
+        '%s - tag %s - timestamp %s', time_domain, timer_tag, timestamp)
+    for w, elems in windows_to_elements.items():
+      if windows_of_interest is not None and w not in windows_of_interest:
+        # windows_of_interest=None means that we care about all windows.
+        # If we care only about some windows, and this window is not one of
+        # them, then we do not intend to fire this window.
+        continue
+      window_context = context.for_window(w)
+      if self.windowing.triggerfn.should_fire(time_domain,
+                                              timestamp,
+                                              w,
+                                              window_context):
+        finished = self.windowing.triggerfn.on_fire(
+            timestamp, w, window_context)
+        _LOGGER.debug('Firing on window %s. Finished: %s', w, finished)
+        fired_windows.add(w)
+        if finished:
+          context.finished_windows_state.add(w)
+          finished_windows.add(w)
+        # TODO(pabloem): Format the output: e.g. pane info
+        elems = [WindowedValue(e.value, e.timestamp, (w, )) for e in elems]
+        yield (key, elems)
+
+    # Add elements that were not fired back into state.
+    for w, elems in windows_to_elements.items():
+      for e in elems:
+        if (w in finished_windows or
+            (w in fired_windows and
+             self.windowing.accumulation_mode == AccumulationMode.DISCARDING)):
+          continue
+        context.elements_bag_state.add((w, e))
+
+  @on_timer(PROCESSING_TIME_TIMER)
+  def processing_time_trigger(
+      self,
+      key=DoFn.KeyParam,
+      timer_tag=DoFn.DynamicTimerTagParam,
+      timestamp=DoFn.TimestampParam,
+      processing_time_state=DoFn.StateParam(LAST_KNOWN_TIME),
+      element_bag=DoFn.StateParam(WINDOW_ELEMENT_PAIRS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER)):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        current_time_state=processing_time_state,
+        watermark_state=None,
+        elements_bag_state=element_bag,
+        window_tag_values_bag_state=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    result = self._trigger_fire(
+        key, TimeDomain.REAL_TIME, timestamp, timer_tag, context)
+    processing_time_state.add(timestamp)
+    return result
+
+  @on_timer(WATERMARK_TIMER)
+  def watermark_trigger(
+      self,
+      key=DoFn.KeyParam,
+      timer_tag=DoFn.DynamicTimerTagParam,
+      timestamp=DoFn.TimestampParam,
+      watermark_state=DoFn.StateParam(LAST_KNOWN_WATERMARK),
+      element_bag=DoFn.StateParam(WINDOW_ELEMENT_PAIRS),
+      processing_time_timer=DoFn.TimerParam(PROCESSING_TIME_TIMER),
+      window_tag_values: BagRuntimeState = DoFn.StateParam(WINDOW_TAG_VALUES),  # type: ignore
+      finished_windows_state: SetRuntimeState = DoFn.StateParam(  # type: ignore
+          FINISHED_WINDOWS),
+      watermark_timer=DoFn.TimerParam(WATERMARK_TIMER)):
+    context = FnRunnerStatefulTriggerContext(
+        processing_time_timer=processing_time_timer,
+        watermark_timer=watermark_timer,
+        current_time_state=None,
+        watermark_state=watermark_state,
+        elements_bag_state=element_bag,
+        window_tag_values_bag_state=window_tag_values,
+        finished_windows_state=finished_windows_state)
+    result = self._trigger_fire(
+        key, TimeDomain.WATERMARK, timestamp, timer_tag, context)
+    watermark_state.add(timestamp)
+    return result
+
+
+class FnRunnerStatefulTriggerContext(TriggerContext):
+  def __init__(
+      self,
+      processing_time_timer: RuntimeTimer,
+      watermark_timer: RuntimeTimer,
+      current_time_state: typing.Optional[AccumulatingRuntimeState],
+      watermark_state: typing.Optional[AccumulatingRuntimeState],
+      elements_bag_state: BagRuntimeState,
+      window_tag_values_bag_state: BagRuntimeState,
+      finished_windows_state: SetRuntimeState):
+    self.timers = {
+        TimeDomain.REAL_TIME: processing_time_timer,
+        TimeDomain.WATERMARK: watermark_timer
+    }
+    self.current_times = {
+        TimeDomain.REAL_TIME: current_time_state,
+        TimeDomain.WATERMARK: watermark_state
+    }
+    self.elements_bag_state = elements_bag_state
+    self.window_tag_values_bag_state = window_tag_values_bag_state
+    self.finished_windows_state = finished_windows_state
+
+  def windows_to_elements_map(
+      self
+  ) -> typing.Dict[BoundedWindow, typing.List[windowed_value.WindowedValue]]:
+    window_element_pairs: typing.Iterable[typing.Tuple[
+        BoundedWindow,
+        windowed_value.WindowedValue]] = self.elements_bag_state.read()
+    result: typing.Dict[BoundedWindow,
+                        typing.List[windowed_value.WindowedValue]] = {}
+    for w, e in window_element_pairs:
+      if w not in result:
+        result[w] = []
+      result[w].append(e)
+    return result
+
+  def for_window(self, window):
+    return PerWindowTriggerContext(window, self)
+
+  def get_current_time(self):
+    return self.current_times[TimeDomain.REAL_TIME].read()
+
+  def set_timer(self, name, time_domain, timestamp):
+    _LOGGER.debug('Setting timer (%s, %s) at %s', time_domain, name, timestamp)
+    self.timers[time_domain].set(timestamp, dynamic_timer_tag=name)
+
+  def clear_timer(self, name, time_domain):
+    _LOGGER.debug('Clearing timer (%s, %s)', time_domain, name)
+    self.timers[time_domain].clear(dynamic_timer_tag=name)
+
+  def merge_state(self, to_be_merged, merge_result):
+    all_triplets = self.window_tag_values_bag_state.read()
+    # Collect all the triplets for the windows being merged away, and tag them
+    # with the new window (merge_result).
+    merging_away_triplets = [(merge_result, t[1], t[2]) for t in all_triplets
+                             if t[0] in to_be_merged]
+
+    # Collect all of the other triplets, and join them with the newly-tagged
+    # set of triplets.
+    resulting_triplets = [
+        t for t in all_triplets if t[0] not in to_be_merged

Review comment:
       Just to confirm: is all_triplets re-iterable?
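
   If it is not, a defensive sketch (an assumption for illustration, not a change the PR makes) would materialize the bag read once before the two passes over it:

   ```python
   # Sketch only: materialize the bag read so it can safely be traversed
   # twice, in case the iterable returned by read() is single-pass. Names
   # refer to the objects in the quoted merge_state method.
   all_triplets = list(self.window_tag_values_bag_state.read())
   merging_away_triplets = [(merge_result, t[1], t[2]) for t in all_triplets
                            if t[0] in to_be_merged]
   resulting_triplets = [t for t in all_triplets
                         if t[0] not in to_be_merged] + merging_away_triplets
   ```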




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (4ec1a8f) into [master](https://codecov.io/gh/apache/beam/commit/4c81134dd9206fd6cb9060a9b973fc111b8fbd64?el=desc) (4c81134) will **increase** coverage by `0.19%`.
   > The diff coverage is `92.37%`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   82.77%   82.96%   +0.19%     
   ==========================================
     Files         466      470       +4     
     Lines       57562    58455     +893     
   ==========================================
   + Hits        47647    48499     +852     
   - Misses       9915     9956      +41     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [sdks/python/apache\_beam/transforms/trigger.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdHJhbnNmb3Jtcy90cmlnZ2VyLnB5) | `88.34% <75.00%> (-0.02%)` | :arrow_down: |
   | [...nners/portability/fn\_api\_runner/trigger\_manager.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9mbl9hcGlfcnVubmVyL3RyaWdnZXJfbWFuYWdlci5weQ==) | `92.92% <92.92%> (ø)` | |
   | [...dks/python/apache\_beam/options/pipeline\_options.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vb3B0aW9ucy9waXBlbGluZV9vcHRpb25zLnB5) | `94.60% <100.00%> (ø)` | |
   | [...\_beam/runners/portability/sdk\_container\_builder.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9zZGtfY29udGFpbmVyX2J1aWxkZXIucHk=) | `40.00% <0.00%> (-3.28%)` | :arrow_down: |
   | [sdks/python/apache\_beam/internal/metrics/metric.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW50ZXJuYWwvbWV0cmljcy9tZXRyaWMucHk=) | `86.45% <0.00%> (-1.05%)` | :arrow_down: |
   | [sdks/python/apache\_beam/dataframe/frames.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZGF0YWZyYW1lL2ZyYW1lcy5weQ==) | `91.07% <0.00%> (-0.90%)` | :arrow_down: |
   | [...ache\_beam/runners/portability/local\_job\_service.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9sb2NhbF9qb2Jfc2VydmljZS5weQ==) | `80.53% <0.00%> (-0.64%)` | :arrow_down: |
   | [...ks/python/apache\_beam/runners/worker/sdk\_worker.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy93b3JrZXIvc2RrX3dvcmtlci5weQ==) | `89.54% <0.00%> (-0.32%)` | :arrow_down: |
   | [sdks/python/apache\_beam/dataframe/expressions.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZGF0YWZyYW1lL2V4cHJlc3Npb25zLnB5) | `91.02% <0.00%> (-0.14%)` | :arrow_down: |
   | [sdks/python/apache\_beam/dataframe/schemas.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZGF0YWZyYW1lL3NjaGVtYXMucHk=) | `96.87% <0.00%> (-0.12%)` | :arrow_down: |
   | ... and [33 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [7c4a21a...4ec1a8f](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (cc6607c) into [master](https://codecov.io/gh/apache/beam/commit/4c81134dd9206fd6cb9060a9b973fc111b8fbd64?el=desc) (4c81134) will **increase** coverage by `0.26%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   82.77%   83.03%   +0.26%     
   ==========================================
     Files         466      470       +4     
     Lines       57562    58565    +1003     
   ==========================================
   + Hits        47647    48631     +984     
   - Misses       9915     9934      +19     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [sdks/python/apache\_beam/internal/pickler.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW50ZXJuYWwvcGlja2xlci5weQ==) | | |
   | [...\_beam/io/gcp/datastore/v1new/adaptive\_throttler.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZ2NwL2RhdGFzdG9yZS92MW5ldy9hZGFwdGl2ZV90aHJvdHRsZXIucHk=) | | |
   | [...examples/snippets/transforms/aggregation/sample.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvc25pcHBldHMvdHJhbnNmb3Jtcy9hZ2dyZWdhdGlvbi9zYW1wbGUucHk=) | | |
   | [...\_beam/testing/benchmarks/nexmark/queries/query8.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9iZW5jaG1hcmtzL25leG1hcmsvcXVlcmllcy9xdWVyeTgucHk=) | | |
   | [sdks/python/apache\_beam/pipeline.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcGlwZWxpbmUucHk=) | | |
   | [sdks/python/apache\_beam/testing/util.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy91dGlsLnB5) | | |
   | [...beam/testing/benchmarks/nexmark/queries/query12.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9iZW5jaG1hcmtzL25leG1hcmsvcXVlcmllcy9xdWVyeTEyLnB5) | | |
   | [...am/testing/benchmarks/chicago\_taxi/trainer/taxi.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9iZW5jaG1hcmtzL2NoaWNhZ29fdGF4aS90cmFpbmVyL3RheGkucHk=) | | |
   | [sdks/python/apache\_beam/portability/common\_urns.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcG9ydGFiaWxpdHkvY29tbW9uX3VybnMucHk=) | | |
   | [.../runners/portability/fn\_api\_runner/translations.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9mbl9hcGlfcnVubmVyL3RyYW5zbGF0aW9ucy5weQ==) | | |
   | ... and [923 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [7c4a21a...cc6607c](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (12920ec) into [master](https://codecov.io/gh/apache/beam/commit/67dc5fe49bd0dbc33ccccf7cde5749dc17adec08?el=desc) (67dc5fe) will **increase** coverage by `0.05%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   83.39%   83.45%   +0.05%     
   ==========================================
     Files         469      470       +1     
     Lines       58721    58952     +231     
   ==========================================
   + Hits        48972    49196     +224     
   - Misses       9749     9756       +7     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [...d/srcs/sdks/python/apache\_beam/internal/pickler.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW50ZXJuYWwvcGlja2xlci5weQ==) | | |
   | [...beam/testing/benchmarks/nexmark/models/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9iZW5jaG1hcmtzL25leG1hcmsvbW9kZWxzL19faW5pdF9fLnB5) | | |
   | [...\_beam/testing/benchmarks/nexmark/queries/query1.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9iZW5jaG1hcmtzL25leG1hcmsvcXVlcmllcy9xdWVyeTEucHk=) | | |
   | [...d/srcs/sdks/python/apache\_beam/testing/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9fX2luaXRfXy5weQ==) | | |
   | [...\_beam/testing/benchmarks/nexmark/queries/query3.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9iZW5jaG1hcmtzL25leG1hcmsvcXVlcmllcy9xdWVyeTMucHk=) | | |
   | [...on/apache\_beam/io/gcp/bigquery\_io\_read\_pipeline.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZ2NwL2JpZ3F1ZXJ5X2lvX3JlYWRfcGlwZWxpbmUucHk=) | | |
   | [...d/srcs/sdks/python/apache\_beam/transforms/stats.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdHJhbnNmb3Jtcy9zdGF0cy5weQ==) | | |
   | [.../py38/build/srcs/sdks/python/apache\_beam/pvalue.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcHZhbHVlLnB5) | | |
   | [.../python/apache\_beam/io/gcp/datastore/v1new/util.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZ2NwL2RhdGFzdG9yZS92MW5ldy91dGlsLnB5) | | |
   | [...beam/portability/api/beam\_artifact\_api\_pb2\_urns.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcG9ydGFiaWxpdHkvYXBpL2JlYW1fYXJ0aWZhY3RfYXBpX3BiMl91cm5zLnB5) | | |
   | ... and [929 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [67dc5fe...12920ec](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (964d13a) into [master](https://codecov.io/gh/apache/beam/commit/4c81134dd9206fd6cb9060a9b973fc111b8fbd64?el=desc) (4c81134) will **increase** coverage by `0.26%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   82.77%   83.03%   +0.26%     
   ==========================================
     Files         466      470       +4     
     Lines       57562    58535     +973     
   ==========================================
   + Hits        47647    48607     +960     
   - Misses       9915     9928      +13     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [...e\_beam/portability/api/beam\_interactive\_api\_pb2.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcG9ydGFiaWxpdHkvYXBpL2JlYW1faW50ZXJhY3RpdmVfYXBpX3BiMi5weQ==) | | |
   | [sdks/python/apache\_beam/dataframe/partitionings.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZGF0YWZyYW1lL3BhcnRpdGlvbmluZ3MucHk=) | | |
   | [.../apache\_beam/options/pipeline\_options\_validator.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vb3B0aW9ucy9waXBlbGluZV9vcHRpb25zX3ZhbGlkYXRvci5weQ==) | | |
   | [...hon/apache\_beam/runners/direct/test\_stream\_impl.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9kaXJlY3QvdGVzdF9zdHJlYW1faW1wbC5weQ==) | | |
   | [...che\_beam/runners/interactive/interactive\_runner.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9pbnRlcmFjdGl2ZS9pbnRlcmFjdGl2ZV9ydW5uZXIucHk=) | | |
   | [...n/apache\_beam/runners/direct/test\_direct\_runner.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9kaXJlY3QvdGVzdF9kaXJlY3RfcnVubmVyLnB5) | | |
   | [.../python/apache\_beam/transforms/periodicsequence.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdHJhbnNmb3Jtcy9wZXJpb2RpY3NlcXVlbmNlLnB5) | | |
   | [...he\_beam/examples/cookbook/multiple\_output\_pardo.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvY29va2Jvb2svbXVsdGlwbGVfb3V0cHV0X3BhcmRvLnB5) | | |
   | [...thon/apache\_beam/runners/worker/operation\_specs.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy93b3JrZXIvb3BlcmF0aW9uX3NwZWNzLnB5) | | |
   | [...beam/runners/interactive/background\_caching\_job.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9pbnRlcmFjdGl2ZS9iYWNrZ3JvdW5kX2NhY2hpbmdfam9iLnB5) | | |
   | ... and [923 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [7c4a21a...964d13a](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] pabloem commented on a change in pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
pabloem commented on a change in pull request #13985:
URL: https://github.com/apache/beam/pull/13985#discussion_r599148488



##########
File path: sdks/python/apache_beam/runners/portability/fn_api_runner/trigger_manager.py
##########
@@ -0,0 +1,459 @@
+#
+# Licensed to the Apache Software Foundation (ASF) under one or more
+# contributor license agreements.  See the NOTICE file distributed with
+# this work for additional information regarding copyright ownership.
+# The ASF licenses this file to You under the Apache License, Version 2.0
+# (the "License"); you may not use this file except in compliance with
+# the License.  You may obtain a copy of the License at
+#
+#    http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+#
+import collections
+import logging
+import typing
+from collections import defaultdict
+
+from apache_beam import typehints
+from apache_beam.coders import PickleCoder
+from apache_beam.coders import StrUtf8Coder
+from apache_beam.coders import TupleCoder
+from apache_beam.coders import VarIntCoder
+from apache_beam.coders.coders import IntervalWindowCoder
+from apache_beam.transforms import DoFn
+from apache_beam.transforms.core import Windowing
+from apache_beam.transforms.timeutil import TimeDomain
+from apache_beam.transforms.trigger import AccumulationMode
+from apache_beam.transforms.trigger import TriggerContext
+from apache_beam.transforms.trigger import _CombiningValueStateTag
+from apache_beam.transforms.trigger import _StateTag
+from apache_beam.transforms.userstate import AccumulatingRuntimeState
+from apache_beam.transforms.userstate import BagRuntimeState
+from apache_beam.transforms.userstate import BagStateSpec
+from apache_beam.transforms.userstate import CombiningValueStateSpec
+from apache_beam.transforms.userstate import RuntimeTimer
+from apache_beam.transforms.userstate import SetRuntimeState
+from apache_beam.transforms.userstate import SetStateSpec
+from apache_beam.transforms.userstate import TimerSpec
+from apache_beam.transforms.userstate import on_timer
+from apache_beam.transforms.window import BoundedWindow
+from apache_beam.transforms.window import GlobalWindow
+from apache_beam.transforms.window import Sessions
+from apache_beam.transforms.window import TimestampedValue
+from apache_beam.transforms.window import WindowFn
+from apache_beam.typehints import TypeCheckError
+from apache_beam.utils import windowed_value
+from apache_beam.utils.timestamp import MIN_TIMESTAMP
+from apache_beam.utils.timestamp import Timestamp
+from apache_beam.utils.windowed_value import WindowedValue
+
+_LOGGER = logging.getLogger(__name__)
+_LOGGER.setLevel(logging.DEBUG)
+
+K = typing.TypeVar('K')
+V = typing.TypeVar('V')
+
+
+class ReifyWindows(DoFn):
+  """Receives KV pairs, and wraps the values into WindowedValues."""
+  def process(
+      self, element, window=DoFn.WindowParam, timestamp=DoFn.TimestampParam):
+    try:
+      k, v = element
+    except TypeError:
+      raise TypeCheckError(
+          'Input to GroupByKey must be a PCollection with '
+          'elements compatible with KV[A, B]')
+
+    yield (k, windowed_value.WindowedValue(v, timestamp, [window]))
+
+
+class _GroupBundlesByKey(DoFn):
+  def start_bundle(self):
+    self.keys = defaultdict(list)
+
+  def process(self, element):
+    key, windowed_value = element
+    self.keys[key].append(windowed_value)
+
+  def finish_bundle(self):
+    for k, vals in self.keys.items():
+      yield windowed_value.WindowedValue((k, vals),
+                                         MIN_TIMESTAMP, [GlobalWindow()])
+
+
+def read_watermark(watermark_state):
+  try:
+    return watermark_state.read()
+  except ValueError:
+    watermark_state.add(MIN_TIMESTAMP)
+    return watermark_state.read()
+
+
+class TriggerMergeContext(WindowFn.MergeContext):
+  def __init__(
+      self, all_windows, context: 'FnRunnerStatefulTriggerContext', windowing):
+    super(TriggerMergeContext, self).__init__(all_windows)
+    self.trigger_context = context
+    self.windowing = windowing
+    self.merged_away: typing.Dict[BoundedWindow, BoundedWindow] = {}
+
+  def merge(self, to_be_merged, merge_result):
+    _LOGGER.debug("Merging %s into %s", to_be_merged, merge_result)
+    self.trigger_context.merge_state(to_be_merged, merge_result)
+    for window in to_be_merged:
+      if window != merge_result:
+        self.merged_away[window] = merge_result
+        # Clear state associated with PaneInfo since it is
+        # not preserved across merges.
+        self.trigger_context.for_window(window).clear_state(None)
+    self.windowing.triggerfn.on_merge(
+        to_be_merged,
+        merge_result,
+        self.trigger_context.for_window(merge_result))
+
+
+@typehints.with_input_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+@typehints.with_output_types(
+    typing.Tuple[K, typing.Iterable[windowed_value.WindowedValue]])
+class GeneralTriggerManagerDoFn(DoFn):
+  """A trigger manager that supports all windowing / triggering cases.
+
+  This implements a DoFn that manages triggering on a per-key basis. All
+  elements for a single key are processed together. Per-key state holds data
+  related to all windows.
+  """
+
+  KNOWN_WINDOWS = SetStateSpec('known_windows', IntervalWindowCoder())
+  FINISHED_WINDOWS = SetStateSpec('finished_windows', IntervalWindowCoder())
+  LAST_KNOWN_TIME = CombiningValueStateSpec('last_known_time', combine_fn=max)
+  LAST_KNOWN_WATERMARK = CombiningValueStateSpec(
+      'last_known_watermark', combine_fn=max)
+
+  # TODO(pabloem) What's the coder for the elements/keys here?
+  WINDOW_ELEMENT_PAIRS = BagStateSpec(
+      'element_bag', TupleCoder([IntervalWindowCoder(), PickleCoder()]))
+  WINDOW_TAG_VALUES = BagStateSpec(
+      'per_window_per_tag_value_state',
+      TupleCoder([IntervalWindowCoder(), StrUtf8Coder(), VarIntCoder()]))
+
+  PROCESSING_TIME_TIMER = TimerSpec(
+      'processing_time_timer', TimeDomain.REAL_TIME)
+  WATERMARK_TIMER = TimerSpec('watermark_timer', TimeDomain.WATERMARK)
+
+  def __init__(self, windowing: Windowing):
+    self.windowing = windowing
+    # Only session windows are merging. Other windows are non-merging.
+    self.merging_windows = isinstance(windowing.windowfn, Sessions)

Review comment:
       done.
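
For readers unfamiliar with the stateful-DoFn primitives that `GeneralTriggerManagerDoFn` declares above (`BagStateSpec`, `SetStateSpec`, `CombiningValueStateSpec`, `TimerSpec`, `@on_timer`), the following is a minimal, self-contained sketch of the same pattern: per-key state plus a watermark timer. `BufferPerKeyDoFn`, its input data, and its flush behavior are hypothetical illustrations of the userstate API, not code from this PR.

```python
# Hypothetical sketch (not part of PR #13985): per-key buffering with a
# BagStateSpec and a watermark TimerSpec, the building blocks the quoted
# trigger manager declares. Names and data are invented for illustration.
import apache_beam as beam
from apache_beam.coders import VarIntCoder
from apache_beam.transforms.timeutil import TimeDomain
from apache_beam.transforms.userstate import BagStateSpec
from apache_beam.transforms.userstate import TimerSpec
from apache_beam.transforms.userstate import on_timer


class BufferPerKeyDoFn(beam.DoFn):
  """Buffers integer values per key and emits them when the watermark advances."""
  BUFFER = BagStateSpec('buffer', VarIntCoder())
  FLUSH_TIMER = TimerSpec('flush', TimeDomain.WATERMARK)

  def process(
      self,
      element,
      timestamp=beam.DoFn.TimestampParam,
      buffer=beam.DoFn.StateParam(BUFFER),
      flush_timer=beam.DoFn.TimerParam(FLUSH_TIMER)):
    key, value = element
    # State and timers are scoped to the current key (and window).
    buffer.add(value)
    # Request a callback once the watermark reaches this element's timestamp.
    flush_timer.set(timestamp)

  @on_timer(FLUSH_TIMER)
  def flush(self, buffer=beam.DoFn.StateParam(BUFFER)):
    # Emit everything buffered for this key, then clear the state.
    yield sorted(buffer.read())
    buffer.clear()


if __name__ == '__main__':
  # The Python DirectRunner typically delegates to the FnApiRunner.
  with beam.Pipeline() as p:
    _ = (
        p
        | beam.Create([('k', 3), ('k', 1), ('k', 2)])
        | beam.ParDo(BufferPerKeyDoFn())
        | beam.Map(print))  # Expected to print: [1, 2, 3]
```

On a bounded input, once the `Create` elements are consumed the watermark advances past the end of time, so the flush timer fires once per key and the buffered values are emitted together. Presumably the `WATERMARK_TIMER` declared in the snippet above is used in the same way: to fire accumulated per-window data once the watermark passes the relevant point.
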




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (b874eed) into [master](https://codecov.io/gh/apache/beam/commit/4c81134dd9206fd6cb9060a9b973fc111b8fbd64?el=desc) (4c81134) will **increase** coverage by `0.22%`.
   > The diff coverage is `90.28%`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   82.77%   83.00%   +0.22%     
   ==========================================
     Files         466      470       +4     
     Lines       57562    58463     +901     
   ==========================================
   + Hits        47647    48526     +879     
   - Misses       9915     9937      +22     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [sdks/python/apache\_beam/transforms/trigger.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdHJhbnNmb3Jtcy90cmlnZ2VyLnB5) | `88.34% <75.00%> (-0.02%)` | :arrow_down: |
   | [...nners/portability/fn\_api\_runner/trigger\_manager.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9mbl9hcGlfcnVubmVyL3RyaWdnZXJfbWFuYWdlci5weQ==) | `90.58% <90.58%> (ø)` | |
   | [...dks/python/apache\_beam/options/pipeline\_options.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vb3B0aW9ucy9waXBlbGluZV9vcHRpb25zLnB5) | `94.60% <100.00%> (ø)` | |
   | [sdks/python/apache\_beam/utils/interactive\_utils.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdXRpbHMvaW50ZXJhY3RpdmVfdXRpbHMucHk=) | `88.09% <0.00%> (-7.15%)` | :arrow_down: |
   | [...\_beam/runners/portability/sdk\_container\_builder.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9zZGtfY29udGFpbmVyX2J1aWxkZXIucHk=) | `40.00% <0.00%> (-3.28%)` | :arrow_down: |
   | [sdks/python/apache\_beam/dataframe/frames.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZGF0YWZyYW1lL2ZyYW1lcy5weQ==) | `90.80% <0.00%> (-1.18%)` | :arrow_down: |
   | [sdks/python/apache\_beam/internal/metrics/metric.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW50ZXJuYWwvbWV0cmljcy9tZXRyaWMucHk=) | `86.45% <0.00%> (-1.05%)` | :arrow_down: |
   | [...ache\_beam/runners/portability/local\_job\_service.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9sb2NhbF9qb2Jfc2VydmljZS5weQ==) | `80.53% <0.00%> (-0.64%)` | :arrow_down: |
   | [...nners/portability/fn\_api\_runner/worker\_handlers.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9mbl9hcGlfcnVubmVyL3dvcmtlcl9oYW5kbGVycy5weQ==) | `79.61% <0.00%> (-0.39%)` | :arrow_down: |
   | [...ks/python/apache\_beam/runners/worker/sdk\_worker.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy93b3JrZXIvc2RrX3dvcmtlci5weQ==) | `89.69% <0.00%> (-0.16%)` | :arrow_down: |
   | ... and [37 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [7c4a21a...b874eed](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (964d13a) into [master](https://codecov.io/gh/apache/beam/commit/4c81134dd9206fd6cb9060a9b973fc111b8fbd64?el=desc) (4c81134) will **increase** coverage by `0.26%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   82.77%   83.03%   +0.26%     
   ==========================================
     Files         466      470       +4     
     Lines       57562    58535     +973     
   ==========================================
   + Hits        47647    48607     +960     
   - Misses       9915     9928      +13     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [sdks/python/apache\_beam/io/snowflake.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vc25vd2ZsYWtlLnB5) | | |
   | [...s/python/apache\_beam/testing/datatype\_inference.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9kYXRhdHlwZV9pbmZlcmVuY2UucHk=) | | |
   | [...s/python/apache\_beam/testing/synthetic\_pipeline.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9zeW50aGV0aWNfcGlwZWxpbmUucHk=) | | |
   | [...\_beam/io/gcp/datastore/v1new/adaptive\_throttler.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZ2NwL2RhdGFzdG9yZS92MW5ldy9hZGFwdGl2ZV90aHJvdHRsZXIucHk=) | | |
   | [...\_beam/testing/benchmarks/nexmark/queries/query5.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9iZW5jaG1hcmtzL25leG1hcmsvcXVlcmllcy9xdWVyeTUucHk=) | | |
   | [sdks/python/apache\_beam/runners/worker/logger.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy93b3JrZXIvbG9nZ2VyLnB5) | | |
   | [...python/apache\_beam/examples/wordcount\_debugging.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvd29yZGNvdW50X2RlYnVnZ2luZy5weQ==) | | |
   | [...cp/internal/clients/storage/storage\_v1\_messages.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZ2NwL2ludGVybmFsL2NsaWVudHMvc3RvcmFnZS9zdG9yYWdlX3YxX21lc3NhZ2VzLnB5) | | |
   | [.../python/apache\_beam/io/gcp/bigquery\_io\_metadata.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZ2NwL2JpZ3F1ZXJ5X2lvX21ldGFkYXRhLnB5) | | |
   | [...on/apache\_beam/io/gcp/bigquery\_io\_read\_pipeline.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZ2NwL2JpZ3F1ZXJ5X2lvX3JlYWRfcGlwZWxpbmUucHk=) | | |
   | ... and [923 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [7c4a21a...a57fb3d](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] codecov[bot] commented on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] commented on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (4ec1a8f) into [master](https://codecov.io/gh/apache/beam/commit/4c81134dd9206fd6cb9060a9b973fc111b8fbd64?el=desc) (4c81134) will **increase** coverage by `0.19%`.
   > The diff coverage is `92.37%`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   82.77%   82.96%   +0.19%     
   ==========================================
     Files         466      470       +4     
     Lines       57562    58455     +893     
   ==========================================
   + Hits        47647    48499     +852     
   - Misses       9915     9956      +41     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [sdks/python/apache\_beam/transforms/trigger.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdHJhbnNmb3Jtcy90cmlnZ2VyLnB5) | `88.34% <75.00%> (-0.02%)` | :arrow_down: |
   | [...nners/portability/fn\_api\_runner/trigger\_manager.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9mbl9hcGlfcnVubmVyL3RyaWdnZXJfbWFuYWdlci5weQ==) | `92.92% <92.92%> (ø)` | |
   | [...dks/python/apache\_beam/options/pipeline\_options.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vb3B0aW9ucy9waXBlbGluZV9vcHRpb25zLnB5) | `94.60% <100.00%> (ø)` | |
   | [...\_beam/runners/portability/sdk\_container\_builder.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9zZGtfY29udGFpbmVyX2J1aWxkZXIucHk=) | `40.00% <0.00%> (-3.28%)` | :arrow_down: |
   | [sdks/python/apache\_beam/internal/metrics/metric.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW50ZXJuYWwvbWV0cmljcy9tZXRyaWMucHk=) | `86.45% <0.00%> (-1.05%)` | :arrow_down: |
   | [sdks/python/apache\_beam/dataframe/frames.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZGF0YWZyYW1lL2ZyYW1lcy5weQ==) | `91.07% <0.00%> (-0.90%)` | :arrow_down: |
   | [...ache\_beam/runners/portability/local\_job\_service.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9sb2NhbF9qb2Jfc2VydmljZS5weQ==) | `80.53% <0.00%> (-0.64%)` | :arrow_down: |
   | [...ks/python/apache\_beam/runners/worker/sdk\_worker.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy93b3JrZXIvc2RrX3dvcmtlci5weQ==) | `89.54% <0.00%> (-0.32%)` | :arrow_down: |
   | [sdks/python/apache\_beam/dataframe/expressions.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZGF0YWZyYW1lL2V4cHJlc3Npb25zLnB5) | `91.02% <0.00%> (-0.14%)` | :arrow_down: |
   | [sdks/python/apache\_beam/dataframe/schemas.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZGF0YWZyYW1lL3NjaGVtYXMucHk=) | `96.87% <0.00%> (-0.12%)` | :arrow_down: |
   | ... and [33 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [7c4a21a...4ec1a8f](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (cc6607c) into [master](https://codecov.io/gh/apache/beam/commit/4c81134dd9206fd6cb9060a9b973fc111b8fbd64?el=desc) (4c81134) will **increase** coverage by `0.26%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   82.77%   83.03%   +0.26%     
   ==========================================
     Files         466      470       +4     
     Lines       57562    58565    +1003     
   ==========================================
   + Hits        47647    48631     +984     
   - Misses       9915     9934      +19     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [...ache\_beam/portability/api/beam\_job\_api\_pb2\_grpc.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcG9ydGFiaWxpdHkvYXBpL2JlYW1fam9iX2FwaV9wYjJfZ3JwYy5weQ==) | | |
   | [sdks/python/apache\_beam/runners/common.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9jb21tb24ucHk=) | | |
   | [...ers/dataflow/internal/clients/dataflow/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9kYXRhZmxvdy9pbnRlcm5hbC9jbGllbnRzL2RhdGFmbG93L19faW5pdF9fLnB5) | | |
   | [...che\_beam/examples/streaming\_wordcount\_debugging.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvc3RyZWFtaW5nX3dvcmRjb3VudF9kZWJ1Z2dpbmcucHk=) | | |
   | [...m/testing/benchmarks/chicago\_taxi/trainer/model.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9iZW5jaG1hcmtzL2NoaWNhZ29fdGF4aS90cmFpbmVyL21vZGVsLnB5) | | |
   | [sdks/python/apache\_beam/examples/complete/tfidf.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvY29tcGxldGUvdGZpZGYucHk=) | | |
   | [sdks/python/apache\_beam/io/jdbc.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vamRiYy5weQ==) | | |
   | [...python/apache\_beam/runners/portability/\_\_init\_\_.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9fX2luaXRfXy5weQ==) | | |
   | [sdks/python/apache\_beam/metrics/metric.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vbWV0cmljcy9tZXRyaWMucHk=) | | |
   | [...ers/dataflow/dataflow\_exercise\_metrics\_pipeline.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9kYXRhZmxvdy9kYXRhZmxvd19leGVyY2lzZV9tZXRyaWNzX3BpcGVsaW5lLnB5) | | |
   | ... and [923 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [7c4a21a...cc6607c](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (12920ec) into [master](https://codecov.io/gh/apache/beam/commit/67dc5fe49bd0dbc33ccccf7cde5749dc17adec08?el=desc) (67dc5fe) will **increase** coverage by `0.05%`.
   > The diff coverage is `n/a`.
   
   > :exclamation: Current head 12920ec differs from pull request most recent head 177f098. Consider uploading reports for the commit 177f098 to get more accurate results
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   83.39%   83.45%   +0.05%     
   ==========================================
     Files         469      470       +1     
     Lines       58721    58952     +231     
   ==========================================
   + Hits        48972    49196     +224     
   - Misses       9749     9756       +7     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [.../python/apache\_beam/io/gcp/resource\_identifiers.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZ2NwL3Jlc291cmNlX2lkZW50aWZpZXJzLnB5) | | |
   | [...on/apache\_beam/runners/direct/sdf\_direct\_runner.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9kaXJlY3Qvc2RmX2RpcmVjdF9ydW5uZXIucHk=) | | |
   | [...cs/sdks/python/apache\_beam/dataframe/transforms.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZGF0YWZyYW1lL3RyYW5zZm9ybXMucHk=) | | |
   | [...s/sdks/python/apache\_beam/dataframe/expressions.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZGF0YWZyYW1lL2V4cHJlc3Npb25zLnB5) | | |
   | [...ild/srcs/sdks/python/apache\_beam/utils/sentinel.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdXRpbHMvc2VudGluZWwucHk=) | | |
   | [...thon/apache\_beam/runners/worker/operation\_specs.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy93b3JrZXIvb3BlcmF0aW9uX3NwZWNzLnB5) | | |
   | [...python/apache\_beam/typehints/typehints\_test\_py3.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdHlwZWhpbnRzL3R5cGVoaW50c190ZXN0X3B5My5weQ==) | | |
   | [...ild/srcs/sdks/python/apache\_beam/runners/common.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9jb21tb24ucHk=) | | |
   | [...py38/build/srcs/sdks/python/apache\_beam/version.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdmVyc2lvbi5weQ==) | | |
   | [...python/apache\_beam/examples/wordcount\_debugging.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-YmVhbV9QcmVDb21taXRfUHl0aG9uX0Nyb24vc3JjL3Nka3MvcHl0aG9uL3Rlc3Qtc3VpdGVzL3RveC9weTM4L2J1aWxkL3NyY3Mvc2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvd29yZGNvdW50X2RlYnVnZ2luZy5weQ==) | | |
   | ... and [929 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [67dc5fe...177f098](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (2452188) into [master](https://codecov.io/gh/apache/beam/commit/4c81134dd9206fd6cb9060a9b973fc111b8fbd64?el=desc) (4c81134) will **increase** coverage by `0.28%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   82.77%   83.06%   +0.28%     
   ==========================================
     Files         466      470       +4     
     Lines       57562    58590    +1028     
   ==========================================
   + Hits        47647    48666    +1019     
   - Misses       9915     9924       +9     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [sdks/python/apache\_beam/coders/row\_coder.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vY29kZXJzL3Jvd19jb2Rlci5weQ==) | | |
   | [sdks/python/apache\_beam/io/gcp/dicomio.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZ2NwL2RpY29taW8ucHk=) | | |
   | [sdks/python/apache\_beam/io/filesystemio.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW8vZmlsZXN5c3RlbWlvLnB5) | | |
   | [...python/apache\_beam/runners/worker/worker\_status.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy93b3JrZXIvd29ya2VyX3N0YXR1cy5weQ==) | | |
   | [...thon/apache\_beam/runners/worker/operation\_specs.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy93b3JrZXIvb3BlcmF0aW9uX3NwZWNzLnB5) | | |
   | [sdks/python/apache\_beam/metrics/cells.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vbWV0cmljcy9jZWxscy5weQ==) | | |
   | [...examples/snippets/transforms/elementwise/values.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvc25pcHBldHMvdHJhbnNmb3Jtcy9lbGVtZW50d2lzZS92YWx1ZXMucHk=) | | |
   | [sdks/python/apache\_beam/internal/gcp/auth.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vaW50ZXJuYWwvZ2NwL2F1dGgucHk=) | | |
   | [...beam/testing/benchmarks/chicago\_taxi/preprocess.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9iZW5jaG1hcmtzL2NoaWNhZ29fdGF4aS9wcmVwcm9jZXNzLnB5) | | |
   | [...dks/python/apache\_beam/testing/extra\_assertions.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9leHRyYV9hc3NlcnRpb25zLnB5) | | |
   | ... and [923 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [7c4a21a...2452188](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] pabloem commented on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
pabloem commented on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-785427608


   r: @robertwb 


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [beam] codecov[bot] edited a comment on pull request #13985: [BEAM-11810] A trigger manager for FnApiRunner

Posted by GitBox <gi...@apache.org>.
codecov[bot] edited a comment on pull request #13985:
URL: https://github.com/apache/beam/pull/13985#issuecomment-778537129


   # [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=h1) Report
   > Merging [#13985](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=desc) (a57fb3d) into [master](https://codecov.io/gh/apache/beam/commit/4c81134dd9206fd6cb9060a9b973fc111b8fbd64?el=desc) (4c81134) will **increase** coverage by `0.25%`.
   > The diff coverage is `n/a`.
   
   [![Impacted file tree graph](https://codecov.io/gh/apache/beam/pull/13985/graphs/tree.svg?width=650&height=150&src=pr&token=qcbbAh8Fj1)](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree)
   
   ```diff
   @@            Coverage Diff             @@
   ##           master   #13985      +/-   ##
   ==========================================
   + Coverage   82.77%   83.03%   +0.25%     
   ==========================================
     Files         466      470       +4     
     Lines       57562    58535     +973     
   ==========================================
   + Hits        47647    48604     +957     
   - Misses       9915     9931      +16     
   ```
   
   
   | [Impacted Files](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=tree) | Coverage Δ | |
   |---|---|---|
   | [.../runners/interactive/testing/test\_cache\_manager.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9pbnRlcmFjdGl2ZS90ZXN0aW5nL3Rlc3RfY2FjaGVfbWFuYWdlci5weQ==) | | |
   | [sdks/python/apache\_beam/testing/test\_utils.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy90ZXN0X3V0aWxzLnB5) | | |
   | [sdks/python/apache\_beam/metrics/execution.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vbWV0cmljcy9leGVjdXRpb24ucHk=) | | |
   | [...mples/snippets/transforms/elementwise/partition.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vZXhhbXBsZXMvc25pcHBldHMvdHJhbnNmb3Jtcy9lbGVtZW50d2lzZS9wYXJ0aXRpb24ucHk=) | | |
   | [...runners/interactive/display/pcoll\_visualization.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9pbnRlcmFjdGl2ZS9kaXNwbGF5L3Bjb2xsX3Zpc3VhbGl6YXRpb24ucHk=) | | |
   | [...hon/apache\_beam/runners/direct/test\_stream\_impl.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9kaXJlY3QvdGVzdF9zdHJlYW1faW1wbC5weQ==) | | |
   | [...eam/runners/portability/fn\_api\_runner/fn\_runner.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy9wb3J0YWJpbGl0eS9mbl9hcGlfcnVubmVyL2ZuX3J1bm5lci5weQ==) | | |
   | [...he\_beam/testing/benchmarks/nexmark/nexmark\_perf.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9iZW5jaG1hcmtzL25leG1hcmsvbmV4bWFya19wZXJmLnB5) | | |
   | [...hon/apache\_beam/runners/worker/worker\_pool\_main.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vcnVubmVycy93b3JrZXIvd29ya2VyX3Bvb2xfbWFpbi5weQ==) | | |
   | [...\_beam/testing/benchmarks/nexmark/queries/query8.py](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree#diff-c2Rrcy9weXRob24vYXBhY2hlX2JlYW0vdGVzdGluZy9iZW5jaG1hcmtzL25leG1hcmsvcXVlcmllcy9xdWVyeTgucHk=) | | |
   | ... and [923 more](https://codecov.io/gh/apache/beam/pull/13985/diff?src=pr&el=tree-more) | |
   
   ------
   
   [Continue to review full report at Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=continue).
   > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)
   > `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`
   > Powered by [Codecov](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=footer). Last update [7c4a21a...a57fb3d](https://codecov.io/gh/apache/beam/pull/13985?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org