Posted to issues@flink.apache.org by GitBox <gi...@apache.org> on 2022/03/02 04:41:45 UTC

[GitHub] [flink] cun8cun8 opened a new pull request #18957: Window allocator supporting pyflink datastream API

cun8cun8 opened a new pull request #18957:
URL: https://github.com/apache/flink/pull/18957


   <!--
   *Thank you very much for contributing to Apache Flink - we are happy that you want to help us improve Flink. To help the community review your contribution in the best possible way, please go through the checklist below, which will get the contribution into a shape in which it can be best reviewed.*
   
   *Please understand that we do not do this to make contributions to Flink a hassle. In order to uphold a high standard of quality for code contributions, while at the same time managing a large number of contributions, we need contributors to prepare the contributions well, and give reviewers enough contextual information for the review. Please also understand that contributions that do not follow this guide will take longer to review and thus typically be picked up with lower priority by the community.*
   
   ## Contribution Checklist
   
     - Make sure that the pull request corresponds to a [JIRA issue](https://issues.apache.org/jira/projects/FLINK/issues). Exceptions are made for typos in JavaDoc or documentation files, which need no JIRA issue.
     
     - Name the pull request in the form "[FLINK-XXXX] [component] Title of the pull request", where *FLINK-XXXX* should be replaced by the actual issue number. Skip *component* if you are unsure about which is the best component.
      Typo fixes that have no associated JIRA issue should be named following this pattern: `[hotfix] [docs] Fix typo in event time introduction` or `[hotfix] [javadocs] Expand JavaDoc for PunctuatedWatermarkGenerator`.
   
     - Fill out the template below to describe the changes contributed by the pull request. That will give reviewers the context they need to do the review.
     
     - Make sure that the change passes the automated tests, i.e., `mvn clean verify` passes. You can set up Azure Pipelines CI to do that following [this guide](https://cwiki.apache.org/confluence/display/FLINK/Azure+Pipelines#AzurePipelines-Tutorial:SettingupAzurePipelinesforaforkoftheFlinkrepository).
   
     - Each pull request should address only one issue, not mix up code from multiple issues.
     
  - Each commit in the pull request has a meaningful commit message (including the JIRA id).
   
     - Once all items of the checklist are addressed, remove the above text and this checklist, leaving only the filled out template below.
   
   
   **(The sections below can be removed for hotfixes of typos)**
   -->
   
   ## What is the purpose of the change
   
   *(For example: This pull request makes task deployment go through the blob server, rather than through RPC. That way we avoid re-transferring them on each deployment (during recovery).)*
   
   
   ## Brief change log
   
   *(for example:)*
  - *The TaskInfo is stored in the blob store at job creation time as a persistent artifact*
     - *Deployments RPC transmits only the blob storage reference*
     - *TaskManagers retrieve the TaskInfo from the blob cache*
   
   
   ## Verifying this change
   
Please make sure both new and modified tests in this PR follow the conventions defined in our code quality guide: https://flink.apache.org/contributing/code-style-and-quality-common.html#testing
   
   *(Please pick either of the following options)*
   
   This change is a trivial rework / code cleanup without any test coverage.
   
   *(or)*
   
   This change is already covered by existing tests, such as *(please describe tests)*.
   
   *(or)*
   
   This change added tests and can be verified as follows:
   
   *(example:)*
     - *Added integration tests for end-to-end deployment with large payloads (100MB)*
     - *Extended integration test for recovery after master (JobManager) failure*
     - *Added test that validates that TaskInfo is transferred only once across recoveries*
     - *Manually verified the change by running a 4 node cluster with 2 JobManagers and 4 TaskManagers, a stateful streaming program, and killing one JobManager and two TaskManagers during the execution, verifying that recovery happens correctly.*
   
   ## Does this pull request potentially affect one of the following parts:
   
     - Dependencies (does it add or upgrade a dependency): (yes / no)
     - The public API, i.e., is any changed class annotated with `@Public(Evolving)`: (yes / no)
     - The serializers: (yes / no / don't know)
     - The runtime per-record code paths (performance sensitive): (yes / no / don't know)
     - Anything that affects deployment or recovery: JobManager (and its components), Checkpointing, Kubernetes/Yarn, ZooKeeper: (yes / no / don't know)
     - The S3 file system connector: (yes / no / don't know)
   
   ## Documentation
   
     - Does this pull request introduce a new feature? (yes / no)
     - If yes, how is the feature documented? (not applicable / docs / JavaDocs / not documented)
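
   For readers unfamiliar with the feature being contributed: a window assigner maps each element's event timestamp to one or more windows. The tumbling (fixed-size, non-overlapping) case can be sketched in plain Python as below; the function name and signature are hypothetical for illustration and are not taken from the PR's actual PyFlink code.

   ```python
   # Illustrative sketch only: how a tumbling window assigner maps an
   # event timestamp (in milliseconds) to the window that contains it.
   # Names here are hypothetical, not from the PR.

   def assign_tumbling_window(timestamp_ms: int, size_ms: int) -> tuple:
       """Return (start, end) of the tumbling window containing timestamp_ms."""
       start = timestamp_ms - (timestamp_ms % size_ms)
       return (start, start + size_ms)

   # Events at 3 s, 7 s, and 12 s with 5-second windows:
   print(assign_tumbling_window(3_000, 5_000))   # (0, 5000)
   print(assign_tumbling_window(7_000, 5_000))   # (5000, 10000)
   print(assign_tumbling_window(12_000, 5_000))  # (10000, 15000)
   ```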
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 610f23f625a4559bef6f000916a2de37a5cb3b38 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * a4a5fc419dbf82c079a24327846d4a7baa500959 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531) 
   * 8e46affb05226eeed6b8eb20df971445139654a8 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot commented on pull request #18957: Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot commented on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * 8fea20f61b27f7b803c12798ad85dbe97e05dcdf UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * d178dd422b702f51ee85b41e43ec4b306c60aa8c Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32914) 
   * ff0bcbd841960c3e524a62c3c45f426707fb5306 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32946) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 84846361d6db21a5ac721494ebb5badf23b274a1 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32947) 
   * 195bf4369bf1eb0cdc1727847e63ce0d6fd31f66 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32952) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 195bf4369bf1eb0cdc1727847e63ce0d6fd31f66 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32952) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 894eb800e4045264cec298908892659466e78b7b Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32964) 
   * 770b70d23b1f989d2de869f0d92c37615743513e Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33018) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 195bf4369bf1eb0cdc1727847e63ce0d6fd31f66 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32952) 
   * 894eb800e4045264cec298908892659466e78b7b UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * a36182858952a61ef43ccbdc184440354bb86110 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721) 
   * d9ffed1defa7514c27d3ba9a9e3008de8b84bc68 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] dianfu commented on a change in pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
dianfu commented on a change in pull request #18957:
URL: https://github.com/apache/flink/pull/18957#discussion_r821588832



##########
File path: flink-python/pyflink/datastream/data_stream.py
##########
@@ -905,24 +887,6 @@ def set_parallelism(self, parallelism: int) -> 'DataStreamSink':
         self._j_data_stream_sink.setParallelism(parallelism)
         return self
 
-    def set_description(self, description: str) -> 'DataStreamSink':

Review comment:
       ditto

##########
File path: flink-python/pyflink/datastream/data_stream.py
##########
@@ -1216,6 +1180,18 @@ def window(self, window_assigner: WindowAssigner) -> 'WindowedStream':
         """
         return WindowedStream(self, window_assigner)
 
+    def count_window(self, window_size: int, window_slide: int = 0):
+        """
+        Windows this KeyedStream into tumbling count windows.
+        Params:

Review comment:
       ```suggestion
           
   ```
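
To make the proposed `count_window` semantics concrete, here is a plain-Python sketch (not the PyFlink API itself) of how tumbling count windows group elements per key: each window fires once it has collected exactly `window_size` elements, and a partial trailing window never fires.

```python
def tumbling_count_windows(elements, window_size):
    """Group elements into tumbling count windows of exactly
    `window_size` elements; an incomplete trailing window is dropped,
    mirroring a count trigger that only fires on a full window."""
    windows = []
    # Stop before the incomplete remainder at the end of the list.
    for i in range(0, len(elements) - len(elements) % window_size, window_size):
        windows.append(elements[i:i + window_size])
    return windows

print(tumbling_count_windows([1, 2, 3, 4, 5, 6, 7], 3))  # [[1, 2, 3], [4, 5, 6]]
```

In the real API the grouping is per key of the `KeyedStream`; this sketch shows the per-key behavior for a single key only.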

##########
File path: flink-python/pyflink/datastream/data_stream.py
##########
@@ -231,24 +231,6 @@ def slot_sharing_group(self, slot_sharing_group: Union[str, SlotSharingGroup]) -
             self._j_data_stream.slotSharingGroup(slot_sharing_group)
         return self
 
-    def set_description(self, description: str) -> 'DataStream':

Review comment:
       What's the purpose of this change?

##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -65,6 +66,23 @@ def max_timestamp(self) -> int:
         pass
 
 
+class SessionWindowTimeGapExtractor(ABC):
+    """
+    A SessionWindowTimeGapExtractor extracts the session time gap from each element of the
+    stream, allowing session windows whose gap is determined dynamically per element.
+    """
+
+    @abstractmethod
+    def extract(self, element: Any) -> int:
+        """
+        Extracts the session time gap.
+        Params:

Review comment:
       ```suggestion
   
   ```
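
A concrete subclass of the new abstract base class might look like the following plain-Python sketch. The `LoadAwareGapExtractor` name and the tuple element shape are illustrative assumptions, not part of the PR.

```python
from abc import ABC, abstractmethod
from typing import Any


class SessionWindowTimeGapExtractor(ABC):
    """Mirror of the abstract base class introduced in this PR."""

    @abstractmethod
    def extract(self, element: Any) -> int:
        """Return the session gap, in milliseconds, for this element."""


class LoadAwareGapExtractor(SessionWindowTimeGapExtractor):
    """Hypothetical extractor: use a longer gap for 'slow' elements.
    Assumes each element is a (key, kind) tuple; names are illustrative."""

    def extract(self, element) -> int:
        _, kind = element
        return 10_000 if kind == "slow" else 2_000  # gap in milliseconds

print(LoadAwareGapExtractor().extract(("a", "slow")))  # 10000
```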

##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -517,3 +538,721 @@ def __init__(self,
         self.window_state_descriptor = window_state_descriptor
         self.internal_window_function = internal_window_function
         self.window_serializer = window_serializer
+
+
+class EventTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to
+    which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,

Review comment:
       ```suggestion
                      window: TimeWindow,
   ```
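
The `on_element` logic of `EventTimeTrigger` from the diff above can be restated in a few lines of plain Python, which makes the two branches easy to see: fire immediately if the watermark has already passed the end of the window, otherwise register an event-time timer for the window end and continue.

```python
from enum import Enum


class TriggerResult(Enum):
    CONTINUE = 0
    FIRE = 1


def event_time_on_element(window_max_timestamp, current_watermark, register_timer):
    """Plain-Python restatement of EventTimeTrigger.on_element:
    fire if the watermark already passed the window end, otherwise
    schedule an event-time timer for the window end."""
    if window_max_timestamp <= current_watermark:
        return TriggerResult.FIRE
    register_timer(window_max_timestamp)
    # No action is taken on the window yet.
    return TriggerResult.CONTINUE

timers = []
print(event_time_on_element(100, 99, timers.append))  # TriggerResult.CONTINUE
print(timers)  # [100]
```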

##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -36,8 +39,6 @@
            'TimeWindowSerializer',
            'CountWindowSerializer']

Review comment:
       Declare the introduced classes which are planned to be exposed to users here.

##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -517,3 +538,721 @@ def __init__(self,
         self.window_state_descriptor = window_state_descriptor
         self.internal_window_function = internal_window_function
         self.window_serializer = window_serializer
+
+
+class EventTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to
+    which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: W,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: W,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[T, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+
+    def __init__(self, window_size: int):
+        self._window_size = window_size
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.LONG())
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        state.add(1)
+        if state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted. The window is not purged
+            #               though, all elements are retained.
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
+
+class TumblingWindowAssigner(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the current system time of the

Review comment:
       The comment is misleading for users, as this assigner supports both event time and processing time.

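       For reference, the window-start computation both modes share — a minimal pure-Python sketch of `TimeWindow.get_window_start_with_offset` (the name matches the PR; the body is my reconstruction of the usual Flink alignment formula, not the PR's exact code):

       ```python
       def get_window_start_with_offset(timestamp: int, offset: int, window_size: int) -> int:
           # Align the timestamp down to the start of its tumbling window,
           # shifted by `offset`. Python's % is non-negative for a positive
           # modulus, so negative offsets need no extra correction here.
           return timestamp - (timestamp - offset) % window_size


       # Processing time and event time differ only in which timestamp is
       # passed in; the start computation itself is identical.
       print(get_window_start_with_offset(63000, 0, 60000))      # 60000
       print(get_window_start_with_offset(63000, 10000, 60000))  # 10000
       ```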
##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -517,3 +538,721 @@ def __init__(self,
         self.window_state_descriptor = window_state_descriptor
         self.internal_window_function = internal_window_function
         self.window_serializer = window_serializer
+
+
+class EventTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to
+    which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: W,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: W,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[T, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+
+    def __init__(self, window_size: int):
+        self._window_size = window_size
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.LONG())
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        state.add(1)
+        if state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted. The window is not purged
+            #               though, all elements are retained.
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
+
+class TumblingWindowAssigner(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the current system time of the
+    machine the operation is running on. Windows cannot overlap.
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(Time.minutes(1), Time.seconds(10), False))
+
+    A WindowAssigner that windows elements into windows based on the timestamp of the elements.
+    Windows cannot overlap.
+    For example, in order to window into windows of 1 minute:
+     ::
+            >>> data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            >>> .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(60000, 0, True))
+    """
+
+    def __init__(self, size: int, offset: int, is_event_time: bool):
+        if abs(offset) >= size:
+            raise Exception("TumblingWindowAssigner parameters must satisfy abs(offset) < size")
+
+        self._size = size
+        self._offset = offset
+        self._is_event_time = is_event_time
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        if self._is_event_time is False:
+            current_processing_time = context.get_current_processing_time()
+            start = TimeWindow.get_window_start_with_offset(current_processing_time, self._offset,
+                                                            self._size)
+            return [TimeWindow(start, start + self._size)]
+        else:
+            if timestamp > MIN_LONG_VALUE:
+                start = TimeWindow.get_window_start_with_offset(timestamp, self._offset, self._size)
+                return [TimeWindow(start, start + self._size)]
+            else:
+                raise Exception("Record has MIN_LONG_VALUE timestamp (= no timestamp marker). "
+                                + "Is the time characteristic set to 'ProcessingTime', or did you forget to call "
+                                + "'data_stream.assign_timestamps_and_watermarks(...)'?")
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        if self._is_event_time is True:
+            return EventTimeTrigger()
+        else:
+            return ProcessingTimeTrigger()
+
+    def get_window_serializer(self) -> TypeSerializer[W]:

Review comment:
       ```suggestion
       def get_window_serializer(self) -> TypeSerializer[TimeWindow]:
       ```

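       On these annotation fixes generally: since the trigger classes bind the window TypeVar to a concrete class via `Trigger[T, TimeWindow]`, the method signatures should name `TimeWindow` directly — a bare `W` there is an unbound TypeVar. A minimal sketch with simplified stand-in classes (not the actual PyFlink ones):

       ```python
       from typing import Generic, TypeVar

       T = TypeVar('T')
       W = TypeVar('W')


       class Trigger(Generic[T, W]):
           """Simplified stand-in for the PyFlink Trigger base class."""


       class TimeWindow:
           """Simplified stand-in for the PyFlink TimeWindow."""


       class ProcessingTimeTrigger(Trigger[T, TimeWindow]):
           # W is already bound to TimeWindow by the base-class subscription,
           # so the concrete type belongs in the method signature.
           def clear(self, window: TimeWindow) -> None:
               pass
       ```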
##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -517,3 +538,721 @@ def __init__(self,
         self.window_state_descriptor = window_state_descriptor
         self.internal_window_function = internal_window_function
         self.window_serializer = window_serializer
+
+
+class EventTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to
+    which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: W,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: W,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[T, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+
+    def __init__(self, window_size: int):
+        self._window_size = window_size
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.LONG())
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        state.add(1)
+        if state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted. The window is not purged
+            #               though, all elements are retained.
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
+
+class TumblingWindowAssigner(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the current system time of the
+    machine the operation is running on. Windows cannot overlap.
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(Time.minutes(1), Time.seconds(10), False))
+
+    A WindowAssigner that windows elements into windows based on the timestamp of the elements.
+    Windows cannot overlap.
+    For example, in order to window into windows of 1 minute:
+     ::
+            >>> data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            >>> .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(60000, 0, True))
+    """
+
+    def __init__(self, size: int, offset: int, is_event_time: bool):
+        if abs(offset) >= size:
+            raise Exception("TumblingWindowAssigner parameters must satisfy abs(offset) < size")
+
+        self._size = size
+        self._offset = offset
+        self._is_event_time = is_event_time
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        if self._is_event_time is False:
+            current_processing_time = context.get_current_processing_time()
+            start = TimeWindow.get_window_start_with_offset(current_processing_time, self._offset,
+                                                            self._size)
+            return [TimeWindow(start, start + self._size)]
+        else:
+            if timestamp > MIN_LONG_VALUE:
+                start = TimeWindow.get_window_start_with_offset(timestamp, self._offset, self._size)
+                return [TimeWindow(start, start + self._size)]
+            else:
+                raise Exception("Record has MIN_LONG_VALUE timestamp (= no timestamp marker). "
+                                + "Is the time characteristic set to 'ProcessingTime', or did you forget to call "
+                                + "'data_stream.assign_timestamps_and_watermarks(...)'?")
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:

Review comment:
       ```suggestion
       def get_default_trigger(self, env) -> Trigger[T, TimeWindow]:
       ```

##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -517,3 +538,721 @@ def __init__(self,
         self.window_state_descriptor = window_state_descriptor
         self.internal_window_function = internal_window_function
         self.window_serializer = window_serializer
+
+
+class EventTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to
+    which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: W,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: W,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[T, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+
+    def __init__(self, window_size: int):
+        self._window_size = window_size
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.LONG())
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        state.add(1)
+        if state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted. The window is not purged
+            #               though, all elements are retained.
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
+
+class TumblingWindowAssigner(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the current system time of the
+    machine the operation is running on. Windows cannot overlap.
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(Time.minutes(1), Time.seconds(10), False))
+
+    A WindowAssigner that windows elements into windows based on the timestamp of the elements.
+    Windows cannot overlap.
+    For example, in order to window into windows of 1 minute:
+     ::
+            >>> data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            >>> .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(60000, 0, True))
+    """
+
+    def __init__(self, size: int, offset: int, is_event_time: bool):
+        if abs(offset) >= size:
+            raise Exception("TumblingWindowAssigner parameters must satisfy abs(offset) < size")
+
+        self._size = size
+        self._offset = offset
+        self._is_event_time = is_event_time
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        if self._is_event_time is False:
+            current_processing_time = context.get_current_processing_time()
+            start = TimeWindow.get_window_start_with_offset(current_processing_time, self._offset,
+                                                            self._size)
+            return [TimeWindow(start, start + self._size)]
+        else:
+            if timestamp > MIN_LONG_VALUE:
+                start = TimeWindow.get_window_start_with_offset(timestamp, self._offset, self._size)
+                return [TimeWindow(start, start + self._size)]
+            else:
+                raise Exception("Record has MIN_LONG_VALUE timestamp (= no timestamp marker). "
+                                + "Is the time characteristic set to 'ProcessingTime', or did you forget to call "
+                                + "'data_stream.assign_timestamps_and_watermarks(...)'?")
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        if self._is_event_time is True:
+            return EventTimeTrigger()
+        else:
+            return ProcessingTimeTrigger()
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return self._is_event_time
+
+    def __repr__(self):
+        return "TumblingWindowAssigner(%s,%s,%s)" % (self._size, self._offset, self.is_event_time)
+
+
+class CountTumblingWindowAssigner(WindowAssigner[T, CountWindow]):
+    """
+    A WindowAssigner that windows elements into fixed-size windows based on the count number
+    of the elements. Windows cannot overlap.
+    """
+
+    def __init__(self, window_size: int):
+        """
+        Windows this KeyedStream into tumbling count windows.
+        :param window_size: The size of the windows in number of elements.
+        """
+        self._window_size = window_size
+        self._counter_state_descriptor = ReducingStateDescriptor(
+            "assigner_counter", lambda a, b: a + b, Types.LONG())
+
+    @staticmethod
+    def of(window_size: int):
+        return CountTumblingWindowAssigner(window_size)
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[W]:
+        counter = context.get_runtime_context().get_reducing_state(
+            self._counter_state_descriptor)
+        if counter.get() is None:
+            counter.add(0)
+        result = [CountWindow(counter.get() // self._window_size)]
+        counter.add(1)
+        return result
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        return CountTrigger(self._window_size)
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return CountWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self) -> str:
+        return "CountTumblingWindowAssigner(%s)" % (self._window_size)
+
+
+class SlidingWindowAssigner(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into sliding windows based on the current system time
+    of the machine the operation is running on. Windows can possibly overlap.
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(SlidingWindowAssigner(60000, 10000, 0, False))
+
+    A WindowAssigner that windows elements into sliding windows based on the timestamp of the
+    elements. Windows can possibly overlap.
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+            >>> data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            >>> .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(SlidingWindowAssigner(60000, 10000, 0, True))
+    """
+
+    def __init__(self, size: int, slide: int, offset: int, is_event_time: bool):
+        if abs(offset) >= slide or size <= 0:
+            raise Exception("SlidingWindowAssigner parameters must satisfy "
+                            + "abs(offset) < slide and size > 0")
+
+        self._size = size
+        self._slide = slide
+        self._offset = offset
+        self._is_event_time = is_event_time
+        self._pane_size = math.gcd(size, slide)
+        self._num_panes_per_window = size // self._pane_size
+
+    def assign_windows(
+        self,
+        element: T,
+        timestamp: int,
+        context: 'WindowAssigner.WindowAssignerContext') -> Collection[W]:

Review comment:
       ```suggestion
           context: 'WindowAssigner.WindowAssignerContext') -> Collection[TimeWindow]:
       ```

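       For context on this hunk: the pane bookkeeping (`math.gcd(size, slide)`) supports the usual sliding-window assignment, where an element falls into `size // slide` overlapping windows when the slide divides the size. A minimal sketch of that assignment logic (my reconstruction of the standard algorithm, not the PR's exact code):

       ```python
       import math


       def assign_sliding_windows(timestamp: int, size: int, slide: int, offset: int):
           # Latest window start that still covers `timestamp`.
           last_start = timestamp - (timestamp - offset) % slide
           windows = []
           start = last_start
           # Walk backwards one slide at a time while the window still
           # contains the timestamp, i.e. while start > timestamp - size.
           while start > timestamp - size:
               windows.append((start, start + size))
               start -= slide
           return windows


       # size=60s, slide=10s: each element falls into 60000 // 10000 = 6 windows.
       wins = assign_sliding_windows(65000, 60000, 10000, 0)
       print(wins[0])   # (60000, 120000)
       print(len(wins)) # 6
       ```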
##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -517,3 +538,721 @@ def __init__(self,
         self.window_state_descriptor = window_state_descriptor
         self.internal_window_function = internal_window_function
         self.window_serializer = window_serializer
+
+
+class EventTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
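The firing condition of `EventTimeTrigger` above can be sketched without the pyflink runtime. This is a hedged illustration: `TimeWindow` and the trigger context are reduced to plain millisecond integers, and `on_element_decision` is a hypothetical helper name, not part of the API.

```python
# Sketch of the EventTimeTrigger.on_element decision above, with the pyflink
# TimeWindow and TriggerContext reduced to plain millisecond integers.

def on_element_decision(window_max_timestamp: int, current_watermark: int) -> str:
    """FIRE if the watermark already passed the window end; otherwise the real
    trigger registers an event-time timer for the window end and CONTINUEs."""
    if window_max_timestamp <= current_watermark:
        return "FIRE"
    return "CONTINUE"

# A [0, 60000) window has max_timestamp() == 59999.
print(on_element_decision(59999, 60000))  # FIRE
print(on_element_decision(59999, 30000))  # CONTINUE
```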
+
+class ProcessingTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to
+    which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: W,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: W,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[T, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+
+    def __init__(self, window_size: int):
+        self._window_size = window_size
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.LONG())
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        state.add(1)
+        if state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted. The
+            # window is not purged, though: all elements are retained.
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
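The per-window counting in `CountTrigger` above can be mirrored in a self-contained sketch. Assumptions are labeled in the code: the `ReducingState` kept via `ctx.get_partitioned_state` is replaced by a plain dict, and `CountTriggerSketch` is a hypothetical name.

```python
# Sketch of CountTrigger semantics: count elements per window and FIRE once
# the count reaches the configured size. Note that, as in the code above,
# FIRE does not reset the counter (the window is not purged).

class CountTriggerSketch:
    def __init__(self, window_size: int):
        self.window_size = window_size
        self.counts = {}  # stands in for ctx.get_partitioned_state(...)

    def on_element(self, window) -> str:
        self.counts[window] = self.counts.get(window, 0) + 1
        return "FIRE" if self.counts[window] >= self.window_size else "CONTINUE"

trigger = CountTriggerSketch(3)
print([trigger.on_element("w0") for _ in range(4)])
# ['CONTINUE', 'CONTINUE', 'FIRE', 'FIRE']
```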
+
+class TumblingWindowAssigner(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the current system time of the
+    machine the operation is running on. Windows cannot overlap.
+    For example, in order to window into windows of 1 minute:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(60000, 0, False))
+
+    A WindowAssigner that windows elements into windows based on the timestamp of the elements.
+    Windows cannot overlap.
+    For example, in order to window into windows of 1 minute:
+    ::
+            >>> data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            >>> .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(60000, 0, True))
+    """
+
+    def __init__(self, size: int, offset: int, is_event_time: bool):
+        if abs(offset) >= size:
+            raise Exception("TumblingWindowAssigner parameters must satisfy abs(offset) < size")
+
+        self._size = size
+        self._offset = offset
+        self._is_event_time = is_event_time
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        if self._is_event_time is False:
+            current_processing_time = context.get_current_processing_time()
+            start = TimeWindow.get_window_start_with_offset(current_processing_time, self._offset,
+                                                            self._size)
+            return [TimeWindow(start, start + self._size)]
+        else:
+            if timestamp > MIN_LONG_VALUE:
+                start = TimeWindow.get_window_start_with_offset(timestamp, self._offset, self._size)
+                return [TimeWindow(start, start + self._size)]
+            else:
+                raise Exception("Record has MIN_LONG_VALUE timestamp (= no timestamp marker). "
+                                "Is the time characteristic set to 'ProcessingTime', or did "
+                                "you forget to call "
+                                "'data_stream.assign_timestamps_and_watermarks(...)'?")
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        if self._is_event_time is True:
+            return EventTimeTrigger()
+        else:
+            return ProcessingTimeTrigger()
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return self._is_event_time
+
+    def __repr__(self):
+        return "TumblingWindowAssigner(%s, %s, %s)" % (
+            self._size, self._offset, self._is_event_time)
+
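The window bucketing used by `TumblingWindowAssigner.assign_windows` above hinges on `TimeWindow.get_window_start_with_offset`. As a reference point, the usual Flink formula can be sketched as follows (`window_start` is a hypothetical stand-in, not the pyflink API, and the formula is reproduced here as an assumption):

```python
# Sketch of the window-start computation assumed by the assigner above:
# align the timestamp down to the nearest multiple of `size`, shifted by
# `offset`. In Python, `%` with a positive modulus is always non-negative.

def window_start(timestamp: int, offset: int, size: int) -> int:
    return timestamp - (timestamp - offset) % size

# 1-minute tumbling windows (size = 60000 ms, no offset):
print(window_start(123456, 0, 60000))   # 120000
print(window_start(120000, 0, 60000))   # 120000
# An element at time t lands in the single window [start, start + size).
```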
+
+class CountTumblingWindowAssigner(WindowAssigner[T, CountWindow]):
+    """
+    A WindowAssigner that windows elements into fixed-size windows based on the number of
+    elements. Windows cannot overlap.
+    """
+
+    def __init__(self, window_size: int):
+        """
+        Windows this KeyedStream into tumbling count windows.
+        :param window_size: The size of the windows in number of elements.
+        """
+        self._window_size = window_size
+        self._counter_state_descriptor = ReducingStateDescriptor(
+            "assigner_counter", lambda a, b: a + b, Types.LONG())
+
+    @staticmethod
+    def of(window_size: int):
+        return CountTumblingWindowAssigner(window_size)
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[W]:
+        counter = context.get_runtime_context().get_reducing_state(
+            self._counter_state_descriptor)
+        if counter.get() is None:
+            counter.add(0)
+        result = [CountWindow(counter.get() // self._window_size)]
+        counter.add(1)
+        return result
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        return CountTrigger(self._window_size)
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return CountWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self) -> str:
+        return "CountTumblingWindowAssigner(%s)" % (self._window_size)
+
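The counter logic in `CountTumblingWindowAssigner.assign_windows` above amounts to integer division: the i-th element of a key (0-based) lands in count window `i // window_size`. A minimal sketch (the helper name is hypothetical):

```python
# Sketch of CountTumblingWindowAssigner: each key's i-th element (0-based)
# is assigned to exactly one count window, identified by i // window_size.

def count_tumbling_window_id(element_index: int, window_size: int) -> int:
    return element_index // window_size

print([count_tumbling_window_id(i, 3) for i in range(7)])
# [0, 0, 0, 1, 1, 1, 2]
```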
+
+class SlidingWindowAssigner(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into sliding windows based on the current system time
+    of the machine the operation is running on. Windows can possibly overlap.
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(SlidingWindowAssigner(60000, 10000, 0, False))
+
+    A WindowAssigner that windows elements into sliding windows based on the timestamp of the
+    elements. Windows can possibly overlap.
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+            >>> data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            >>> .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(SlidingWindowAssigner(60000, 10000, 0, True))
+    """
+
+    def __init__(self, size: int, slide: int, offset: int, is_event_time: bool):
+        if abs(offset) >= slide or size <= 0:
+            raise Exception("SlidingWindowAssigner parameters must satisfy "
+                            + "abs(offset) < slide and size > 0")
+
+        self._size = size
+        self._slide = slide
+        self._offset = offset
+        self._is_event_time = is_event_time
+        self._pane_size = math.gcd(size, slide)
+        self._num_panes_per_window = size // self._pane_size
+
+    def assign_windows(
+        self,
+        element: T,
+        timestamp: int,
+        context: 'WindowAssigner.WindowAssignerContext') -> Collection[W]:
+        if self._is_event_time is False:
+            current_processing_time = context.get_current_processing_time()
+            last_start = TimeWindow.get_window_start_with_offset(
+                current_processing_time, self._offset, self._slide)
+            windows = [TimeWindow(start, start + self._size)
+                       for start in range(last_start,
+                                          current_processing_time - self._size, -self._slide)]
+            return windows
+        else:
+            if timestamp > MIN_LONG_VALUE:
+                last_start = TimeWindow.get_window_start_with_offset(timestamp,
+                                                                     self._offset, self._slide)
+                windows = [TimeWindow(start, start + self._size)
+                           for start in range(last_start, timestamp - self._size, -self._slide)]
+                return windows
+            else:
+                raise Exception("Record has MIN_LONG_VALUE timestamp (= no timestamp marker). "
+                                "Is the time characteristic set to 'ProcessingTime', "
+                                "or did you forget to call "
+                                "'data_stream.assign_timestamps_and_watermarks(...)'?")
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        if self._is_event_time is True:
+            return EventTimeTrigger()
+        else:
+            return ProcessingTimeTrigger()
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return self._is_event_time
+
+    def __repr__(self) -> str:
+        return "SlidingWindowAssigner(%s, %s, %s, %s)" % (
+            self._size, self._slide, self._offset, self._is_event_time)
+
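The backwards enumeration in `SlidingWindowAssigner.assign_windows` above can be sketched standalone: starting from the latest window start covering the timestamp, step back by the slide until the window no longer contains the element. `assign_sliding` is a hypothetical helper, and the start formula is assumed from the tumbling case.

```python
# Sketch of sliding-window assignment: an element is covered by size // slide
# windows (when slide divides size), enumerated from the latest start backwards.

def assign_sliding(timestamp: int, size: int, slide: int, offset: int = 0):
    last_start = timestamp - (timestamp - offset) % slide  # assumed formula
    return [(start, start + size)
            for start in range(last_start, timestamp - size, -slide)]

# 1-minute windows sliding every 10 seconds: 6 overlapping windows per element.
print(assign_sliding(65000, 60000, 10000))
```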
+
+class CountSlidingWindowAssigner(WindowAssigner[T, CountWindow]):
+    """
+    A WindowAssigner that windows elements into sliding windows based on the number of
+    elements. Windows can possibly overlap.
+    """
+
+    def __init__(self, window_size: int, window_slide: int):
+        """
+        Windows this KeyedStream into sliding count windows.
+        :param window_size: The size of the windows in number of elements.
+        :param window_slide: The slide interval in number of elements.
+        """
+        self._window_size = window_size
+        self._window_slide = window_slide
+        self._count = None  # type: ValueState
+        self._counter_state_descriptor = ReducingStateDescriptor(
+            "slide-count-assigner", lambda a, b: a + b, Types.LONG())
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[W]:
+        count_descriptor = ValueStateDescriptor('slide-count-assigner', Types.LONG())
+        self._count = context.get_runtime_context().get_state(count_descriptor)
+        count_value = self._count.value()
+        if count_value is None:
+            current_count = 0
+        else:
+            current_count = count_value
+        self._count.update(current_count + 1)
+        last_id = current_count // self._window_slide
+        last_start = last_id * self._window_slide
+        last_end = last_start + self._window_size - 1
+        windows = []
+        while last_id >= 0 and last_start <= current_count <= last_end:
+            windows.append(CountWindow(last_id))
+            last_id -= 1
+            last_start -= self._window_slide
+            last_end -= self._window_slide
+        return windows
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        return CountTrigger(self._window_size)
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return CountWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self):
+        return "CountSlidingWindowAssigner(%s, %s)" % (self._window_size, self._window_slide)
+
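The while-loop in `CountSlidingWindowAssigner.assign_windows` above can be condensed into a small sketch: the n-th element (0-based) belongs to every count window whose `[start, start + size)` range covers n, walking window ids backwards by the slide. The helper name is hypothetical.

```python
# Sketch of count-sliding assignment, mirroring the loop above: window id k
# covers elements [k * slide, k * slide + size - 1].

def assign_count_sliding(n: int, window_size: int, window_slide: int):
    last_id = n // window_slide
    windows = []
    while last_id >= 0 and \
            last_id * window_slide <= n <= last_id * window_slide + window_size - 1:
        windows.append(last_id)
        last_id -= 1
    return windows

print(assign_count_sliding(5, 4, 2))  # element 5 is in count windows 2 and 1
```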
+
+class SessionWindowAssigner(MergingWindowAssigner[T, TimeWindow]):
+    """
+        WindowAssigner that windows elements into sessions based on the timestamp. Windows cannot

Review comment:
       correct the indentation.

##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -517,3 +538,721 @@ def __init__(self,
         self.window_state_descriptor = window_state_descriptor
         self.internal_window_function = internal_window_function
         self.window_serializer = window_serializer
+
+
+class EventTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to
+    which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: W,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: W,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[T, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+
+    def __init__(self, window_size: int):
+        self._window_size = window_size
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.LONG())
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        state.add(1)
+        if state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted. The
+            # window is not purged, though: all elements are retained.
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
+
+class TumblingWindowAssigner(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the current system time of the
+    machine the operation is running on. Windows cannot overlap.
+    For example, in order to window into windows of 1 minute:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(60000, 0, False))
+
+    A WindowAssigner that windows elements into windows based on the timestamp of the elements.
+    Windows cannot overlap.
+    For example, in order to window into windows of 1 minute:
+    ::
+            >>> data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            >>> .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(60000, 0, True))
+    """
+
+    def __init__(self, size: int, offset: int, is_event_time: bool):
+        if abs(offset) >= size:
+            raise Exception("TumblingWindowAssigner parameters must satisfy abs(offset) < size")
+
+        self._size = size
+        self._offset = offset
+        self._is_event_time = is_event_time
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        if self._is_event_time is False:
+            current_processing_time = context.get_current_processing_time()
+            start = TimeWindow.get_window_start_with_offset(current_processing_time, self._offset,
+                                                            self._size)
+            return [TimeWindow(start, start + self._size)]
+        else:
+            if timestamp > MIN_LONG_VALUE:
+                start = TimeWindow.get_window_start_with_offset(timestamp, self._offset, self._size)
+                return [TimeWindow(start, start + self._size)]
+            else:
+                raise Exception("Record has MIN_LONG_VALUE timestamp (= no timestamp marker). "
+                                "Is the time characteristic set to 'ProcessingTime', or did "
+                                "you forget to call "
+                                "'data_stream.assign_timestamps_and_watermarks(...)'?")
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        if self._is_event_time is True:
+            return EventTimeTrigger()
+        else:
+            return ProcessingTimeTrigger()
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return self._is_event_time
+
+    def __repr__(self):
+        return "TumblingWindowAssigner(%s, %s, %s)" % (
+            self._size, self._offset, self._is_event_time)
+
+
+class CountTumblingWindowAssigner(WindowAssigner[T, CountWindow]):
+    """
+    A WindowAssigner that windows elements into fixed-size windows based on the number of
+    elements. Windows cannot overlap.
+    """
+
+    def __init__(self, window_size: int):
+        """
+        Windows this KeyedStream into tumbling count windows.
+        :param window_size: The size of the windows in number of elements.
+        """
+        self._window_size = window_size
+        self._counter_state_descriptor = ReducingStateDescriptor(
+            "assigner_counter", lambda a, b: a + b, Types.LONG())
+
+    @staticmethod
+    def of(window_size: int):
+        return CountTumblingWindowAssigner(window_size)
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[W]:
+        counter = context.get_runtime_context().get_reducing_state(
+            self._counter_state_descriptor)
+        if counter.get() is None:
+            counter.add(0)
+        result = [CountWindow(counter.get() // self._window_size)]
+        counter.add(1)
+        return result
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        return CountTrigger(self._window_size)
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return CountWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self) -> str:
+        return "CountTumblingWindowAssigner(%s)" % (self._window_size)
+
+
+class SlidingWindowAssigner(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into sliding windows based on the current system time
+    of the machine the operation is running on. Windows can possibly overlap.
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(SlidingWindowAssigner(60000, 10000, 0, False))
+
+    A WindowAssigner that windows elements into sliding windows based on the timestamp of the
+    elements. Windows can possibly overlap.
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+            >>> data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            >>> .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(SlidingWindowAssigner(60000, 10000, 0, True))
+    """
+
+    def __init__(self, size: int, slide: int, offset: int, is_event_time: bool):
+        if abs(offset) >= slide or size <= 0:
+            raise Exception("SlidingWindowAssigner parameters must satisfy "
+                            + "abs(offset) < slide and size > 0")
+
+        self._size = size
+        self._slide = slide
+        self._offset = offset
+        self._is_event_time = is_event_time
+        self._pane_size = math.gcd(size, slide)
+        self._num_panes_per_window = size // self._pane_size
+
+    def assign_windows(
+        self,
+        element: T,
+        timestamp: int,
+        context: 'WindowAssigner.WindowAssignerContext') -> Collection[W]:
+        if self._is_event_time is False:
+            current_processing_time = context.get_current_processing_time()
+            last_start = TimeWindow.get_window_start_with_offset(
+                current_processing_time, self._offset, self._slide)
+            windows = [TimeWindow(start, start + self._size)
+                       for start in range(last_start,
+                                          current_processing_time - self._size, -self._slide)]
+            return windows
+        else:
+            if timestamp > MIN_LONG_VALUE:
+                last_start = TimeWindow.get_window_start_with_offset(timestamp,
+                                                                     self._offset, self._slide)
+                windows = [TimeWindow(start, start + self._size)
+                           for start in range(last_start, timestamp - self._size, -self._slide)]
+                return windows
+            else:
+                raise Exception("Record has MIN_LONG_VALUE timestamp (= no timestamp marker). "
+                                "Is the time characteristic set to 'ProcessingTime', "
+                                "or did you forget to call "
+                                "'data_stream.assign_timestamps_and_watermarks(...)'?")
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        if self._is_event_time is True:
+            return EventTimeTrigger()
+        else:
+            return ProcessingTimeTrigger()
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return self._is_event_time
+
+    def __repr__(self) -> str:
+        return "SlidingWindowAssigner(%s, %s, %s, %s)" % (
+            self._size, self._slide, self._offset, self._is_event_time)
+
+
+class CountSlidingWindowAssigner(WindowAssigner[T, CountWindow]):
+    """
+    A WindowAssigner that windows elements into sliding windows based on the number of
+    elements. Windows can possibly overlap.
+    """
+
+    def __init__(self, window_size: int, window_slide: int):
+        """
+        Windows this KeyedStream into sliding count windows.
+        :param window_size: The size of the windows in number of elements.
+        :param window_slide: The slide interval in number of elements.
+        """
+        self._window_size = window_size
+        self._window_slide = window_slide
+        self._count = None  # type: ValueState
+        self._counter_state_descriptor = ReducingStateDescriptor(
+            "slide-count-assigner", lambda a, b: a + b, Types.LONG())
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[W]:
+        count_descriptor = ValueStateDescriptor('slide-count-assigner', Types.LONG())
+        self._count = context.get_runtime_context().get_state(count_descriptor)
+        count_value = self._count.value()
+        if count_value is None:
+            current_count = 0
+        else:
+            current_count = count_value
+        self._count.update(current_count + 1)
+        last_id = current_count // self._window_slide
+        last_start = last_id * self._window_slide
+        last_end = last_start + self._window_size - 1
+        windows = []
+        while last_id >= 0 and last_start <= current_count <= last_end:
+            windows.append(CountWindow(last_id))
+            last_id -= 1
+            last_start -= self._window_slide
+            last_end -= self._window_slide
+        return windows
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        return CountTrigger(self._window_size)
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return CountWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self):
+        return "CountSlidingWindowAssigner(%s, %s)" % (self._window_size, self._window_slide)
+
+
+class SessionWindowAssigner(MergingWindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into sessions based on the timestamp of the
+    elements. Windows cannot overlap.
+    """
+
+    def __init__(self, session_gap: int, is_event_time: bool):
+        if session_gap <= 0:
+            raise Exception("SessionWindowAssigner parameters must satisfy session_gap > 0")
+
+        self._session_gap = session_gap
+        self._is_event_time = is_event_time
+
+    def merge_windows(self,
+                      windows: Iterable[W],
+                      callback: 'MergingWindowAssigner.MergeCallback[W]') -> None:
+        window_list = [w for w in windows]
+        window_list.sort()
+        for i in range(1, len(window_list)):
+            if window_list[i - 1].end > window_list[i].start:
+                callback.merge([window_list[i - 1], window_list[i]],
+                               TimeWindow(window_list[i - 1].start, window_list[i].end))
+
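The pairwise `merge_windows` callback above is, in effect, an interval merge over session windows sorted by start. A simplified, pyflink-free sketch (`merge_sessions` is a hypothetical helper; unlike the single-pass pairwise callback, it merges whole chains in one call and guards against contained windows with `max`):

```python
# Sketch of session merging: sort candidate windows by start and fuse any
# window whose start falls before the previous window's end (i.e. within
# the session gap), since each raw window spans [t, t + gap).

def merge_sessions(windows):
    merged = []
    for start, end in sorted(windows):
        if merged and merged[-1][1] > start:       # gap closed -> same session
            merged[-1] = (merged[-1][0], max(merged[-1][1], end))
        else:
            merged.append((start, end))
    return merged

print(merge_sessions([(0, 5), (3, 9), (20, 25)]))  # [(0, 9), (20, 25)]
```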
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[W]:
+        if self._is_event_time is False:
+            timestamp = context.get_current_processing_time()
+
+        return [TimeWindow(timestamp, timestamp + self._session_gap)]
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:

Review comment:
       ```suggestion
       def get_default_trigger(self, env) -> Trigger[T, TimeWindow]:
       ```

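For context on the suggestion above: `SessionWindowAssigner` already specializes its base class to `TimeWindow`, so annotating the return type with the free type variable `W` documents nothing. A minimal sketch of the distinction, using stand-in class names rather than the real PyFlink ones:

```python
from typing import Generic, TypeVar, get_type_hints

T = TypeVar('T')
W = TypeVar('W')


class TimeWindow:
    pass


class WindowAssigner(Generic[T, W]):
    # In the generic base, W is the right annotation: it is bound per subclass.
    def get_default_trigger(self) -> W:
        raise NotImplementedError


class SessionAssigner(WindowAssigner[T, TimeWindow]):
    # The subclass is already specialized to TimeWindow, so the concrete
    # annotation names the actual return type instead of a free TypeVar.
    def get_default_trigger(self) -> TimeWindow:
        return TimeWindow()


# The inherited annotation resolves to a bare TypeVar...
print(get_type_hints(WindowAssigner.get_default_trigger)['return'])   # ~W
# ...while the overridden one names the real window type.
print(get_type_hints(SessionAssigner.get_default_trigger)['return'])  # TimeWindow
```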
##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -517,3 +538,721 @@ def __init__(self,
         self.window_state_descriptor = window_state_descriptor
         self.internal_window_function = internal_window_function
         self.window_serializer = window_serializer
+
+
+class EventTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to
+    which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: W,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: W,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[T, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+
+    def __init__(self, window_size: int):
+        self._window_size = window_size
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.LONG())
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        state.add(1)
+        if state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted. The window is not
+            # purged, though; all elements are retained.
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
+
+class TumblingWindowAssigner(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the current system time of the
+    machine the operation is running on. Windows cannot overlap.
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(Time.minutes(1), Time.seconds(10), False))
+
+    A WindowAssigner that windows elements into windows based on the timestamp of the elements.
+    Windows cannot overlap.
+    For example, in order to window into windows of 1 minute:
+     ::
+            >>> data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            >>> .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(60000, 0, True))
+    """
+
+    def __init__(self, size: int, offset: int, is_event_time: bool):
+        if abs(offset) >= size:
+            raise Exception("TumblingWindowAssigner parameters must satisfy abs(offset) < size")
+
+        self._size = size
+        self._offset = offset
+        self._is_event_time = is_event_time
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        if self._is_event_time is False:
+            current_processing_time = context.get_current_processing_time()
+            start = TimeWindow.get_window_start_with_offset(current_processing_time, self._offset,
+                                                            self._size)
+            return [TimeWindow(start, start + self._size)]
+        else:
+            if timestamp > MIN_LONG_VALUE:
+                start = TimeWindow.get_window_start_with_offset(timestamp, self._offset, self._size)
+                return [TimeWindow(start, start + self._size)]
+            else:
+                raise Exception("Record has MIN_LONG_VALUE timestamp (= no timestamp marker). "
+                                + "Is the time characteristic set to 'ProcessingTime', or did you forget to call "
+                                + "'data_stream.assign_timestamps_and_watermarks(...)'?")
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        if self._is_event_time is True:
+            return EventTimeTrigger()
+        else:
+            return ProcessingTimeTrigger()
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return self._is_event_time
+
+    def __repr__(self):
+        return "TumblingWindowAssigner(%s,%s,%s)" % (self._size, self._offset, self.is_event_time)
+
+
+class CountTumblingWindowAssigner(WindowAssigner[T, CountWindow]):
+    """
+    A WindowAssigner that windows elements into fixed-size windows based on the count number
+    of the elements. Windows cannot overlap.
+    """
+
+    def __init__(self, window_size: int):
+        """
+        Windows this KeyedStream into tumbling count windows.
+        :param window_size: The size of the windows in number of elements.
+        """
+        self._window_size = window_size
+        self._counter_state_descriptor = ReducingStateDescriptor(
+            "assigner_counter", lambda a, b: a + b, Types.LONG())
+
+    @staticmethod
+    def of(window_size: int):
+        return CountTumblingWindowAssigner(window_size)
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[W]:
+        counter = context.get_runtime_context().get_reducing_state(
+            self._counter_state_descriptor)
+        if counter.get() is None:
+            counter.add(0)
+        result = [CountWindow(counter.get() // self._window_size)]
+        counter.add(1)
+        return result
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        return CountTrigger(self._window_size)
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return CountWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self) -> str:
+        return "CountTumblingWindowAssigner(%s)" % (self._window_size)
+
+
+class SlidingWindowAssigner(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into sliding windows based on the current system time
+    of the machine the operation is running on. Windows can possibly overlap.
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(SlidingWindowAssigner(60000, 10000, 0, False))
+
+    A WindowAssigner that windows elements into sliding windows based on the timestamp of the
+    elements. Windows can possibly overlap.
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+            >>> data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            >>> .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(SlidingWindowAssigner(60000, 10000, 0, True))
+    """
+
+    def __init__(self, size: int, slide: int, offset: int, is_event_time: bool):
+        if abs(offset) >= slide or size <= 0:
+            raise Exception("SlidingWindowAssigner parameters must satisfy "
+                            + "abs(offset) < slide and size > 0")
+
+        self._size = size
+        self._slide = slide
+        self._offset = offset
+        self._is_event_time = is_event_time
+        self._pane_size = math.gcd(size, slide)
+        self._num_panes_per_window = size // self._pane_size
+
+    def assign_windows(
+        self,
+        element: T,
+        timestamp: int,
+        context: 'WindowAssigner.WindowAssignerContext') -> Collection[W]:
+        if self._is_event_time is False:
+            current_processing_time = context.get_current_processing_time()
+            last_start = TimeWindow.get_window_start_with_offset(
+                current_processing_time, self._offset, self._slide)
+            windows = [TimeWindow(start, start + self._size)
+                       for start in range(last_start,
+                                          current_processing_time - self._size, -self._slide)]
+            return windows
+        else:
+            if timestamp > MIN_LONG_VALUE:
+                last_start = TimeWindow.get_window_start_with_offset(timestamp,
+                                                                     self._offset, self._slide)
+                windows = [TimeWindow(start, start + self._size)
+                           for start in range(last_start, timestamp - self._size, -self._slide)]
+                return windows
+            else:
+                raise Exception("Record has MIN_LONG_VALUE timestamp (= no timestamp marker). "
+                                + "Is the time characteristic set to 'ProcessingTime', "
+                                  "or did you forget to call "
+                                + "'data_stream.assign_timestamps_and_watermarks(...)'?")
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        if self._is_event_time is True:
+            return EventTimeTrigger()
+        else:
+            return ProcessingTimeTrigger()
+
+    def get_window_serializer(self) -> TypeSerializer[W]:

Review comment:
       ```suggestion
       def get_window_serializer(self) -> TypeSerializer[TimeWindow]:
       ```

##########
File path: flink-python/pyflink/datastream/data_stream.py
##########
@@ -1216,6 +1180,18 @@ def window(self, window_assigner: WindowAssigner) -> 'WindowedStream':
         """
         return WindowedStream(self, window_assigner)
 
+    def count_window(self, window_size: int, window_slide = 0):
+        """
+        Windows this KeyedStream into tumbling count windows.

Review comment:
       ```suggestion
           Windows this KeyedStream into tumbling or sliding count windows.
       ```

##########
File path: flink-python/pyflink/datastream/data_stream.py
##########
@@ -1216,6 +1180,18 @@ def window(self, window_assigner: WindowAssigner) -> 'WindowedStream':
         """
         return WindowedStream(self, window_assigner)
 
+    def count_window(self, window_size: int, window_slide = 0):
+        """
+        Windows this KeyedStream into tumbling count windows.
+        Params:
+        :param window_size: The size of the windows in number of elements.
+        :param window_slide: The slide interval in number of elements.
+        """
+        if window_slide is 0:

Review comment:
       ```suggestion
           if window_slide == 0:
       ```
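The suggestion above matters beyond style: `is` compares object identity, while `==` compares values. `window_slide is 0` only happens to work because CPython caches the integers -5 through 256 as an implementation detail; for other values, two equal ints are usually distinct objects. A quick illustration (the small-int cache is CPython-specific):

```python
# `==` tests value equality; `is` tests object identity.
small_a, small_b = int("5"), int("5")
big_a, big_b = int("500"), int("500")

print(small_a == small_b)  # True
print(small_a is small_b)  # True, but only because CPython interns ints -5..256
print(big_a == big_b)      # True
print(big_a is big_b)      # False: two distinct int objects
```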

##########
File path: flink-python/pyflink/datastream/data_stream.py
##########
@@ -1216,6 +1180,18 @@ def window(self, window_assigner: WindowAssigner) -> 'WindowedStream':
         """
         return WindowedStream(self, window_assigner)
 
+    def count_window(self, window_size: int, window_slide = 0):

Review comment:
       ```suggestion
       def count_window(self, size: int, slide=0):
       ```
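Taken together, the count_window review comments describe a method that dispatches between a tumbling and a sliding assigner depending on whether a slide was given. A stub sketch of that dispatch, with the assigner classes stubbed out and their constructor signatures assumed from the quoted window.py diff:

```python
class CountTumblingWindowAssigner:
    """Stub standing in for pyflink's count-based tumbling assigner."""
    def __init__(self, size: int):
        self.size = size


class CountSlidingWindowAssigner:
    """Stub standing in for pyflink's count-based sliding assigner."""
    def __init__(self, size: int, slide: int):
        self.size, self.slide = size, slide


def count_window(size: int, slide: int = 0):
    """Pick a tumbling assigner when no slide is given, a sliding one otherwise."""
    if slide == 0:  # `==`, not `is`: compare by value
        return CountTumblingWindowAssigner(size)
    return CountSlidingWindowAssigner(size, slide)


print(type(count_window(100)).__name__)      # CountTumblingWindowAssigner
print(type(count_window(100, 10)).__name__)  # CountSlidingWindowAssigner
```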

##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -65,6 +66,23 @@ def max_timestamp(self) -> int:
         pass
 
 
+class SessionWindowTimeGapExtractor(ABC):
+    """
+    Window is a grouping of elements into finite buckets. Windows have a maximum timestamp

Review comment:
       The doc doesn't apply here. Could you refer to the Javadoc of the Java SessionWindowTimeGapExtractor instead?
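For reference, the Java `SessionWindowTimeGapExtractor` is a single-method interface that computes the session gap from each element, so dynamic session windows can vary the gap per record. A Python counterpart modeled on it might look roughly like this sketch (wording of the docstring is approximate, not the final PyFlink text):

```python
from abc import ABC, abstractmethod


class SessionWindowTimeGapExtractor(ABC):
    """
    A SessionWindowTimeGapExtractor extracts session time gaps for dynamic
    session window assigners, allowing the gap to vary per element.
    """

    @abstractmethod
    def extract(self, element) -> int:
        """
        Extracts the session time gap (in milliseconds) for the given element.
        """
        pass


class FixedGapExtractor(SessionWindowTimeGapExtractor):
    """Example implementation: a constant 1-second gap for every element."""

    def extract(self, element) -> int:
        return 1000


print(FixedGapExtractor().extract("any element"))  # 1000
```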

##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -517,3 +538,721 @@ def __init__(self,
         self.window_state_descriptor = window_state_descriptor
         self.internal_window_function = internal_window_function
         self.window_serializer = window_serializer
+
+
+class EventTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to
+    which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: W,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: W,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[T, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+
+    def __init__(self, window_size: int):
+        self._window_size = window_size
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.LONG())
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        state.add(1)
+        if state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted. The window is not
+            # purged, though; all elements are retained.

Review comment:
       `state.clear()` should be called here before returning FIRE, so the counter is reset for the next window.
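Java's CountTrigger clears its reducing state before returning FIRE so the next window starts counting from zero; without the reset, the condition stays satisfied and the trigger keeps firing on every element. A standalone sketch of that pattern, with a plain counter standing in for Flink's partitioned ReducingState:

```python
from enum import Enum


class TriggerResult(Enum):
    CONTINUE = 0
    FIRE = 1


class CountingTrigger:
    """Fires every `window_size` elements, resetting the counter on fire."""

    def __init__(self, window_size: int):
        self._window_size = window_size
        self._count = 0  # stand-in for ctx.get_partitioned_state(...)

    def on_element(self) -> TriggerResult:
        self._count += 1
        if self._count >= self._window_size:
            self._count = 0  # the equivalent of state.clear() before firing
            return TriggerResult.FIRE
        return TriggerResult.CONTINUE


trigger = CountingTrigger(3)
results = [trigger.on_element() for _ in range(6)]
print(results)  # fires on the 3rd and 6th elements only
```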

##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -517,3 +538,721 @@ def __init__(self,
         self.window_state_descriptor = window_state_descriptor
         self.internal_window_function = internal_window_function
         self.window_serializer = window_serializer
+
+
+class EventTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to
+    which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,

Review comment:
       ```suggestion
                         window: TimeWindow,
       ```

##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -517,3 +538,721 @@ def __init__(self,
         self.window_state_descriptor = window_state_descriptor
         self.internal_window_function = internal_window_function
         self.window_serializer = window_serializer
+
+
+class EventTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to
+    which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: W,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: W,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[T, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+
+    def __init__(self, window_size: int):
+        self._window_size = window_size
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.LONG())
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        state.add(1)
+        if state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted. The window is not
+            # purged, though; all elements are retained.
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
+
+class TumblingWindowAssigner(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the current system time of the
+    machine the operation is running on. Windows cannot overlap.
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(Time.minutes(1), Time.seconds(10), False))

Review comment:
       ```suggestion
               >>> .window(TumblingWindowAssigner(60000, 10000, False))
       ```

##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -517,3 +538,721 @@ def __init__(self,
         self.window_state_descriptor = window_state_descriptor
         self.internal_window_function = internal_window_function
         self.window_serializer = window_serializer
+
+
+class EventTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to
+    which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: W,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: W,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[T, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+
+    def __init__(self, window_size: int):
+        self._window_size = window_size
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.LONG())
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        state.add(1)
+        if state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted. The window
+            # is not purged, though; all elements are retained.
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
+
+class TumblingWindowAssigner(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the current system time of the
+    machine the operation is running on. Windows cannot overlap.
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(Time.minutes(1), Time.seconds(10), False))
+
+    A WindowAssigner that windows elements into windows based on the timestamp of the elements.
+    Windows cannot overlap.
+    For example, in order to window into windows of 1 minute:
+     ::

Review comment:
       Please correct the indentation here.

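       If it helps, a GitHub suggestion sketching the intended alignment (assuming the docstring body is indented four spaces like the earlier example; adjust as needed):

       ```suggestion
           ::
       ```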
##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -517,3 +538,721 @@ def __init__(self,
         self.window_state_descriptor = window_state_descriptor
         self.internal_window_function = internal_window_function
         self.window_serializer = window_serializer
+
+
+class EventTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to
+    which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: W,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: W,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[T, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+
+    def __init__(self, window_size: int):
+        self._window_size = window_size
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.LONG())
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        state.add(1)
+        if state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted. The window
+            # is not purged, though; all elements are retained.
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
+
+class TumblingWindowAssigner(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the current system time of the
+    machine the operation is running on. Windows cannot overlap.
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(Time.minutes(1), Time.seconds(10), False))
+
+    A WindowAssigner that windows elements into windows based on the timestamp of the elements.
+    Windows cannot overlap.
+    For example, in order to window into windows of 1 minute:
+     ::
+            >>> data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            >>> .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(60000, 0, True))
+    """
+
+    def __init__(self, size: int, offset: int, is_event_time: bool):
+        if abs(offset) >= size:

Review comment:
       Should we also check that `offset` is positive?

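A possible shape for that stricter validation, as a sketch for discussion only — the class name `TumblingWindowAssignerSketch` and the use of `ValueError` are illustrative, not the PR's actual code:

```python
# Hypothetical sketch: reject negative offsets outright instead of
# only requiring abs(offset) < size. For review discussion only.
class TumblingWindowAssignerSketch:

    def __init__(self, size: int, offset: int, is_event_time: bool):
        if size <= 0:
            raise ValueError("size must be positive, got %d" % size)
        if not (0 <= offset < size):
            raise ValueError(
                "offset must satisfy 0 <= offset < size, got offset=%d, size=%d"
                % (offset, size))
        self._size = size
        self._offset = offset
        self._is_event_time = is_event_time
```

This would also catch `size <= 0`, which the current `abs(offset) >= size` check only rejects indirectly.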
##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -517,3 +538,721 @@ def __init__(self,
         self.window_state_descriptor = window_state_descriptor
         self.internal_window_function = internal_window_function
         self.window_serializer = window_serializer
+
+
+class EventTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to
+    which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: W,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: W,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[T, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+
+    def __init__(self, window_size: int):
+        self._window_size = window_size
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.LONG())
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        state.add(1)
+        if state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted. The window
+            # is not purged, though; all elements are retained.
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
+
+class TumblingWindowAssigner(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the current system time of the
+    machine the operation is running on. Windows cannot overlap.
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(Time.minutes(1), Time.seconds(10), False))
+
+    A WindowAssigner that windows elements into windows based on the timestamp of the elements.
+    Windows cannot overlap.
+    For example, in order to window into windows of 1 minute:
+     ::
+            >>> data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            >>> .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(60000, 0, True))
+    """
+
+    def __init__(self, size: int, offset: int, is_event_time: bool):
+        if abs(offset) >= size:
+            raise Exception("TumblingWindowAssigner parameters must satisfy abs(offset) < size")
+
+        self._size = size
+        self._offset = offset
+        self._is_event_time = is_event_time
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        if self._is_event_time is False:
+            current_processing_time = context.get_current_processing_time()
+            start = TimeWindow.get_window_start_with_offset(current_processing_time, self._offset,
+                                                            self._size)
+            return [TimeWindow(start, start + self._size)]
+        else:
+            if timestamp > MIN_LONG_VALUE:
+                start = TimeWindow.get_window_start_with_offset(timestamp, self._offset, self._size)
+                return [TimeWindow(start, start + self._size)]
+            else:
+                raise Exception("Record has MIN_LONG_VALUE timestamp (= no timestamp marker). "
+                                + "Is the time characteristic set to 'ProcessingTime', or did you forget to call "
+                                + "'data_stream.assign_timestamps_and_watermarks(...)'?")
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        if self._is_event_time is True:
+            return EventTimeTrigger()
+        else:
+            return ProcessingTimeTrigger()
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return self._is_event_time
+
+    def __repr__(self):
+        return "TumblingWindowAssigner(%s,%s,%s)" % (self._size, self._offset, self._is_event_time)
+
+
+class CountTumblingWindowAssigner(WindowAssigner[T, CountWindow]):
+    """
+    A WindowAssigner that windows elements into fixed-size windows based on the number of
+    elements. Windows cannot overlap.
+    """
+
+    def __init__(self, window_size: int):
+        """
+        Windows this KeyedStream into tumbling count windows.
+        :param window_size: The size of the windows in number of elements.
+        """
+        self._window_size = window_size
+        self._counter_state_descriptor = ReducingStateDescriptor(
+            "assigner_counter", lambda a, b: a + b, Types.LONG())
+
+    @staticmethod
+    def of(window_size: int):
+        return CountTumblingWindowAssigner(window_size)
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[W]:
+        counter = context.get_runtime_context().get_reducing_state(
+            self._counter_state_descriptor)
+        if counter.get() is None:
+            counter.add(0)
+        result = [CountWindow(counter.get() // self._window_size)]
+        counter.add(1)
+        return result
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:

Review comment:
       ```suggestion
       def get_default_trigger(self, env) -> Trigger[T, CountWindow]:
       ```

##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -16,15 +16,18 @@
 # limitations under the License.
 ################################################################################
 import sys
+import math
 from abc import ABC, abstractmethod
 from enum import Enum
 from io import BytesIO
-from typing import TypeVar, Generic, Iterable, Collection
-
+from typing import TypeVar, Generic, Iterable, Collection, Any
+from pyflink.common import TypeSerializer, Time

Review comment:
       ```suggestion
       from pyflink.common import Time
       ```

##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -517,3 +538,721 @@ def __init__(self,
         self.window_state_descriptor = window_state_descriptor
         self.internal_window_function = internal_window_function
         self.window_serializer = window_serializer
+
+
+class EventTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to
+    which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: W,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: W,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[T, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+
+    def __init__(self, window_size: int):
+        self._window_size = window_size
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.LONG())
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        state.add(1)
+        if state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted. The window
+            # is not purged, though; all elements are retained.
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
+
+class TumblingWindowAssigner(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the current system time of the
+    machine the operation is running on. Windows cannot overlap.
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(Time.minutes(1), Time.seconds(10), False))
+
+    A WindowAssigner that windows elements into windows based on the timestamp of the elements.
+    Windows cannot overlap.
+    For example, in order to window into windows of 1 minute:
+     ::
+            >>> data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            >>> .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(60000, 0, True))
+    """
+
+    def __init__(self, size: int, offset: int, is_event_time: bool):
+        if abs(offset) >= size:
+            raise Exception("TumblingWindowAssigner parameters must satisfy abs(offset) < size")
+
+        self._size = size
+        self._offset = offset
+        self._is_event_time = is_event_time
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        if self._is_event_time is False:
+            current_processing_time = context.get_current_processing_time()
+            start = TimeWindow.get_window_start_with_offset(current_processing_time, self._offset,
+                                                            self._size)
+            return [TimeWindow(start, start + self._size)]
+        else:
+            if timestamp > MIN_LONG_VALUE:
+                start = TimeWindow.get_window_start_with_offset(timestamp, self._offset, self._size)
+                return [TimeWindow(start, start + self._size)]
+            else:
+                raise Exception("Record has MIN_LONG_VALUE timestamp (= no timestamp marker). "
+                                + "Is the time characteristic set to 'ProcessingTime', or did you forget to call "

Review comment:
       You can run ./dev/lint-python.sh locally to make sure that there are no checkstyle issues.

##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -517,3 +538,721 @@ def __init__(self,
         self.window_state_descriptor = window_state_descriptor
         self.internal_window_function = internal_window_function
         self.window_serializer = window_serializer
+
+
+class EventTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to
+    which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: W,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: W,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[T, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+
+    def __init__(self, window_size: int):
+        self._window_size = window_size
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.LONG())
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        state.add(1)
+        if state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted.
+            # The window is not purged, though; all elements are retained.
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
+
+class TumblingWindowAssigner(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into non-overlapping, fixed-size windows based
+    either on the current system time of the machine the operation is running on
+    (processing time) or on the timestamp of the elements (event time).
+    For example, to window into processing-time windows of 1 minute with a 10-second offset:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(Time.minutes(1), Time.seconds(10), False))
+
+    To window into event-time windows of 1 minute:
+    ::
+            >>> data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            >>> .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(60000, 0, True))
+    """
+
+    def __init__(self, size: int, offset: int, is_event_time: bool):
+        if abs(offset) >= size:
+            raise Exception("TumblingWindowAssigner parameters must satisfy abs(offset) < size")
+
+        self._size = size
+        self._offset = offset
+        self._is_event_time = is_event_time
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        if self._is_event_time is False:
+            current_processing_time = context.get_current_processing_time()
+            start = TimeWindow.get_window_start_with_offset(current_processing_time, self._offset,
+                                                            self._size)
+            return [TimeWindow(start, start + self._size)]
+        else:
+            if timestamp > MIN_LONG_VALUE:
+                start = TimeWindow.get_window_start_with_offset(timestamp, self._offset, self._size)
+                return [TimeWindow(start, start + self._size)]
+            else:
+                raise Exception("Record has MIN_LONG_VALUE timestamp (= no timestamp marker). "
+                                + "Is the time characteristic set to 'ProcessingTime', or did you forget to call "
+                                + "'data_stream.assign_timestamps_and_watermarks(...)'?")
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        if self._is_event_time is True:
+            return EventTimeTrigger()
+        else:
+            return ProcessingTimeTrigger()
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return self._is_event_time
+
+    def __repr__(self):
+        return "TumblingWindowAssigner(%s, %s, %s)" % (self._size, self._offset, self._is_event_time)
+
+
+class CountTumblingWindowAssigner(WindowAssigner[T, CountWindow]):
+    """
+    A WindowAssigner that windows elements into fixed-size windows based on the element
+    count. Windows cannot overlap.
+    """
+
+    def __init__(self, window_size: int):
+        """
+        Windows this KeyedStream into tumbling count windows.
+        :param window_size: The size of the windows in number of elements.
+        """
+        self._window_size = window_size
+        self._counter_state_descriptor = ReducingStateDescriptor(
+            "assigner_counter", lambda a, b: a + b, Types.LONG())
+
+    @staticmethod
+    def of(window_size: int):
+        return CountTumblingWindowAssigner(window_size)
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[W]:
+        counter = context.get_runtime_context().get_reducing_state(
+            self._counter_state_descriptor)
+        if counter.get() is None:
+            counter.add(0)
+        result = [CountWindow(counter.get() // self._window_size)]
+        counter.add(1)
+        return result
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        return CountTrigger(self._window_size)
+
+    def get_window_serializer(self) -> TypeSerializer[W]:

Review comment:
       ```suggestion
       def get_window_serializer(self) -> TypeSerializer[CountWindow]:
       ```
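For context, a minimal, self-contained sketch of why binding the return annotation to the concrete window type helps. The classes below are simplified stand-ins for the pyflink types (not the real implementations); with the concrete parameter, a checker such as mypy can verify that call sites treat the result as a `CountWindow` serializer specifically:

```python
from typing import Generic, TypeVar

W = TypeVar('W')


class TypeSerializer(Generic[W]):
    """Simplified stand-in for pyflink's TypeSerializer."""


class CountWindow:
    """Simplified stand-in for pyflink's CountWindow."""


class CountWindowSerializer(TypeSerializer[CountWindow]):
    """Simplified stand-in for pyflink's CountWindowSerializer."""


def get_window_serializer() -> 'TypeSerializer[CountWindow]':
    # Annotating the concrete type parameter (rather than the free
    # TypeVar W) documents and checks what this assigner actually returns.
    return CountWindowSerializer()
```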

##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -517,3 +538,721 @@ def __init__(self,
         self.window_state_descriptor = window_state_descriptor
         self.internal_window_function = internal_window_function
         self.window_serializer = window_serializer
+
+
+class EventTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to
+    which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: W,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: W,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[T, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+
+    def __init__(self, window_size: int):
+        self._window_size = window_size
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.LONG())
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        state.add(1)
+        if state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted.
+            # The window is not purged, though; all elements are retained.
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
+
+class TumblingWindowAssigner(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into non-overlapping, fixed-size windows based
+    either on the current system time of the machine the operation is running on
+    (processing time) or on the timestamp of the elements (event time).
+    For example, to window into processing-time windows of 1 minute with a 10-second offset:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(Time.minutes(1), Time.seconds(10), False))
+
+    To window into event-time windows of 1 minute:
+    ::
+            >>> data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            >>> .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(60000, 0, True))
+    """
+
+    def __init__(self, size: int, offset: int, is_event_time: bool):
+        if abs(offset) >= size:
+            raise Exception("TumblingWindowAssigner parameters must satisfy abs(offset) < size")
+
+        self._size = size
+        self._offset = offset
+        self._is_event_time = is_event_time
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        if self._is_event_time is False:
+            current_processing_time = context.get_current_processing_time()
+            start = TimeWindow.get_window_start_with_offset(current_processing_time, self._offset,
+                                                            self._size)
+            return [TimeWindow(start, start + self._size)]
+        else:
+            if timestamp > MIN_LONG_VALUE:
+                start = TimeWindow.get_window_start_with_offset(timestamp, self._offset, self._size)
+                return [TimeWindow(start, start + self._size)]
+            else:
+                raise Exception("Record has MIN_LONG_VALUE timestamp (= no timestamp marker). "

Review comment:
       Include the actual value of MIN_LONG_VALUE in the error message, so the sentinel is explicit to users.
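One hedged way to follow this suggestion (the `MIN_LONG_VALUE` definition and the `missing_timestamp_error` helper below are illustrative assumptions, with the sentinel mirroring Java's `Long.MIN_VALUE`) is to interpolate the sentinel into the message:

```python
MIN_LONG_VALUE = -(2 ** 63)  # assumed: mirrors Java's Long.MIN_VALUE


def missing_timestamp_error() -> Exception:
    # Interpolating the sentinel value makes the message self-explanatory
    # when it shows up in user logs.
    return Exception(
        "Record has MIN_LONG_VALUE(%d) timestamp (= no timestamp marker). "
        "Is the time characteristic set to 'ProcessingTime', or did you forget "
        "to call 'data_stream.assign_timestamps_and_watermarks(...)'?"
        % MIN_LONG_VALUE)
```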

##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -517,3 +538,721 @@ def __init__(self,
         self.window_state_descriptor = window_state_descriptor
         self.internal_window_function = internal_window_function
         self.window_serializer = window_serializer
+
+
+class EventTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to
+    which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: W,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: W,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[T, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+
+    def __init__(self, window_size: int):
+        self._window_size = window_size
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.LONG())
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        state.add(1)
+        if state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted.
+            # The window is not purged, though; all elements are retained.
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
+
+class TumblingWindowAssigner(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into non-overlapping, fixed-size windows based
+    either on the current system time of the machine the operation is running on
+    (processing time) or on the timestamp of the elements (event time).
+    For example, to window into processing-time windows of 1 minute with a 10-second offset:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(Time.minutes(1), Time.seconds(10), False))
+
+    To window into event-time windows of 1 minute:
+    ::
+            >>> data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            >>> .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(60000, 0, True))
+    """
+
+    def __init__(self, size: int, offset: int, is_event_time: bool):
+        if abs(offset) >= size:
+            raise Exception("TumblingWindowAssigner parameters must satisfy abs(offset) < size")
+
+        self._size = size
+        self._offset = offset
+        self._is_event_time = is_event_time
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        if self._is_event_time is False:
+            current_processing_time = context.get_current_processing_time()
+            start = TimeWindow.get_window_start_with_offset(current_processing_time, self._offset,
+                                                            self._size)
+            return [TimeWindow(start, start + self._size)]
+        else:
+            if timestamp > MIN_LONG_VALUE:
+                start = TimeWindow.get_window_start_with_offset(timestamp, self._offset, self._size)
+                return [TimeWindow(start, start + self._size)]
+            else:
+                raise Exception("Record has MIN_LONG_VALUE timestamp (= no timestamp marker). "
+                                + "Is the time characteristic set to 'ProcessingTime', or did you forget to call "
+                                + "'data_stream.assign_timestamps_and_watermarks(...)'?")
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        if self._is_event_time is True:
+            return EventTimeTrigger()
+        else:
+            return ProcessingTimeTrigger()
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return self._is_event_time
+
+    def __repr__(self):
+        return "TumblingWindowAssigner(%s, %s, %s)" % (self._size, self._offset, self._is_event_time)
+
+
+class CountTumblingWindowAssigner(WindowAssigner[T, CountWindow]):
+    """
+    A WindowAssigner that windows elements into fixed-size windows based on the element
+    count. Windows cannot overlap.
+    """
+
+    def __init__(self, window_size: int):
+        """
+        Windows this KeyedStream into tumbling count windows.
+        :param window_size: The size of the windows in number of elements.
+        """
+        self._window_size = window_size
+        self._counter_state_descriptor = ReducingStateDescriptor(
+            "assigner_counter", lambda a, b: a + b, Types.LONG())
+
+    @staticmethod
+    def of(window_size: int):
+        return CountTumblingWindowAssigner(window_size)
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[W]:
+        counter = context.get_runtime_context().get_reducing_state(
+            self._counter_state_descriptor)
+        if counter.get() is None:
+            counter.add(0)
+        result = [CountWindow(counter.get() // self._window_size)]
+        counter.add(1)
+        return result
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        return CountTrigger(self._window_size)
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return CountWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self) -> str:
+        return "CountTumblingWindowAssigner(%s)" % (self._window_size)

Review comment:
       ```suggestion
           return "CountTumblingWindowAssigner(%s)" % self._window_size
       ```
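For context, the parentheses around a single `%` operand are purely grouping and change nothing; both spellings below produce the same string (illustrative value):

```python
window_size = 100

# The parenthesized form groups a single expression; it is not a tuple.
with_parens = "CountTumblingWindowAssigner(%s)" % (window_size)
without_parens = "CountTumblingWindowAssigner(%s)" % window_size

# Parentheses only matter when substituting a tuple of several values,
# e.g. "%s,%s" % (a, b).
assert with_parens == without_parens == "CountTumblingWindowAssigner(100)"
```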

##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -517,3 +538,721 @@ def __init__(self,
         self.window_state_descriptor = window_state_descriptor
         self.internal_window_function = internal_window_function
         self.window_serializer = window_serializer
+
+
+class EventTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to
+    which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: W,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: W,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[T, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+
+    def __init__(self, window_size: int):
+        self._window_size = window_size
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.LONG())
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        state.add(1)
+        if state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted.
+            # The window is not purged, though; all elements are retained.
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
+
+class TumblingWindowAssigner(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into non-overlapping, fixed-size windows based
+    either on the current system time of the machine the operation is running on
+    (processing time) or on the timestamp of the elements (event time).
+    For example, to window into processing-time windows of 1 minute with a 10-second offset:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(Time.minutes(1), Time.seconds(10), False))
+
+    To window into event-time windows of 1 minute:
+    ::
+            >>> data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            >>> .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(60000, 0, True))
+    """
+
+    def __init__(self, size: int, offset: int, is_event_time: bool):
+        if abs(offset) >= size:
+            raise Exception("TumblingWindowAssigner parameters must satisfy abs(offset) < size")
+
+        self._size = size
+        self._offset = offset
+        self._is_event_time = is_event_time
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        if self._is_event_time is False:
+            current_processing_time = context.get_current_processing_time()
+            start = TimeWindow.get_window_start_with_offset(current_processing_time, self._offset,
+                                                            self._size)
+            return [TimeWindow(start, start + self._size)]
+        else:
+            if timestamp > MIN_LONG_VALUE:
+                start = TimeWindow.get_window_start_with_offset(timestamp, self._offset, self._size)
+                return [TimeWindow(start, start + self._size)]
+            else:
+                raise Exception("Record has MIN_LONG_VALUE timestamp (= no timestamp marker). "
+                                + "Is the time characteristic set to 'ProcessingTime', or did you forget to call "
+                                + "'data_stream.assign_timestamps_and_watermarks(...)'?")
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        if self._is_event_time is True:
+            return EventTimeTrigger()
+        else:
+            return ProcessingTimeTrigger()
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return self._is_event_time
+
+    def __repr__(self):
+        return "TumblingWindowAssigner(%s,%s,%s)" % (self._size, self._offset, self._is_event_time)
+
+
+class CountTumblingWindowAssigner(WindowAssigner[T, CountWindow]):
+    """
+    A WindowAssigner that windows elements into fixed-size windows based on the number of
+    elements. Windows cannot overlap.
+    """
+
+    def __init__(self, window_size: int):
+        """
+        Windows this KeyedStream into tumbling count windows.
+        :param window_size: The size of the windows in number of elements.
+        """
+        self._window_size = window_size
+        self._counter_state_descriptor = ReducingStateDescriptor(
+            "assigner_counter", lambda a, b: a + b, Types.LONG())
+
+    @staticmethod
+    def of(window_size: int):
+        return CountTumblingWindowAssigner(window_size)
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[W]:
+        counter = context.get_runtime_context().get_reducing_state(
+            self._counter_state_descriptor)
+        if counter.get() is None:
+            counter.add(0)
+        result = [CountWindow(counter.get() // self._window_size)]
+        counter.add(1)
+        return result
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        return CountTrigger(self._window_size)
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return CountWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self) -> str:
+        return "CountTumblingWindowAssigner(%s)" % (self._window_size)
+
+
+class SlidingWindowAssigner(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into sliding windows based on the current system time
+    of the machine the operation is running on. Windows can possibly overlap.
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(SlidingWindowAssigner(60000, 10000, 0, False))
+
+    A WindowAssigner that windows elements into sliding windows based on the timestamp of the
+    elements. Windows can possibly overlap.
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+            >>> data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            >>> .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(SlidingWindowAssigner(60000, 10000, 0, True))
+    """
+
+    def __init__(self, size: int, slide: int, offset: int, is_event_time: bool):
+        if abs(offset) >= slide or size <= 0:
+            raise Exception("SlidingWindowAssigner parameters must satisfy "
+                            + "abs(offset) < slide and size > 0")
+
+        self._size = size
+        self._slide = slide
+        self._offset = offset
+        self._is_event_time = is_event_time
+        self._pane_size = math.gcd(size, slide)
+        self._num_panes_per_window = size // self._pane_size
+
+    def assign_windows(
+        self,
+        element: T,
+        timestamp: int,
+        context: 'WindowAssigner.WindowAssignerContext') -> Collection[W]:
+        if self._is_event_time is False:
+            current_processing_time = context.get_current_processing_time()
+            last_start = TimeWindow.get_window_start_with_offset(
+                current_processing_time, self._offset, self._slide)
+            windows = [TimeWindow(start, start + self._size)
+                       for start in range(last_start,
+                                          current_processing_time - self._size, -self._slide)]
+            return windows
+        else:
+            if timestamp > MIN_LONG_VALUE:
+                last_start = TimeWindow.get_window_start_with_offset(timestamp,
+                                                                     self._offset, self._slide)
+                windows = [TimeWindow(start, start + self._size)
+                           for start in range(last_start, timestamp - self._size, -self._slide)]
+                return windows
+            else:
+                raise Exception("Record has MIN_LONG_VALUE timestamp (= no timestamp marker). "
+                                + "Is the time characteristic set to 'ProcessingTime', "
+                                  "or did you forget to call "
+                                + "'data_stream.assign_timestamps_and_watermarks(...)'?")
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:

Review comment:
    ```suggestion
        def get_default_trigger(self, env) -> Trigger[T, TimeWindow]:
    ```
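For readers following the review: the sliding-window assignment in the diff above reduces to modular arithmetic. A minimal plain-Python sketch of that logic (assuming non-negative millisecond timestamps; `window_start_with_offset` is an illustrative reimplementation, not the PyFlink API):

```python
def window_start_with_offset(timestamp: int, offset: int, size: int) -> int:
    # Align the timestamp down to the nearest multiple of `size`, shifted by `offset`.
    # Python's % floors for negative operands, so no extra correction is needed here.
    return timestamp - (timestamp - offset) % size


def assign_sliding_windows(timestamp: int, size: int, slide: int, offset: int = 0):
    # Enumerate every (start, end) window containing `timestamp`,
    # walking backwards from the most recent slide boundary.
    last_start = window_start_with_offset(timestamp, offset, slide)
    return [(start, start + size)
            for start in range(last_start, timestamp - size, -slide)]


# A 60s window sliding every 10s: a record at t=63000 falls into size/slide = 6 windows.
print(assign_sliding_windows(63000, 60000, 10000))
```

Each emitted pair satisfies start <= timestamp < start + size, matching the range-based loop in `assign_windows` above.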

##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -517,3 +538,721 @@ def __init__(self,
         self.window_state_descriptor = window_state_descriptor
         self.internal_window_function = internal_window_function
         self.window_serializer = window_serializer
+
+
+class EventTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to
+    which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: W,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: W,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[T, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+
+    def __init__(self, window_size: int):
+        self._window_size = window_size
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.LONG())
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        state.add(1)
+        if state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted. The window is not purged
+            #               though, all elements are retained.
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
+
+class TumblingWindowAssigner(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the current system time of the
+    machine the operation is running on. Windows cannot overlap.
+    For example, in order to window into tumbling windows of 1 minute:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(60000, 0, False))
+
+    A WindowAssigner that windows elements into windows based on the timestamp of the elements.
+    Windows cannot overlap.
+    For example, in order to window into windows of 1 minute:
+    ::
+            >>> data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            >>> .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(60000, 0, True))
+    """
+
+    def __init__(self, size: int, offset: int, is_event_time: bool):
+        if abs(offset) >= size:
+            raise Exception("TumblingWindowAssigner parameters must satisfy abs(offset) < size")
+
+        self._size = size
+        self._offset = offset
+        self._is_event_time = is_event_time
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        if self._is_event_time is False:
+            current_processing_time = context.get_current_processing_time()
+            start = TimeWindow.get_window_start_with_offset(current_processing_time, self._offset,
+                                                            self._size)
+            return [TimeWindow(start, start + self._size)]
+        else:
+            if timestamp > MIN_LONG_VALUE:
+                start = TimeWindow.get_window_start_with_offset(timestamp, self._offset, self._size)
+                return [TimeWindow(start, start + self._size)]
+            else:
+                raise Exception("Record has MIN_LONG_VALUE timestamp (= no timestamp marker). "
+                                + "Is the time characteristic set to 'ProcessingTime', or did you forget to call "
+                                + "'data_stream.assign_timestamps_and_watermarks(...)'?")
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        if self._is_event_time is True:
+            return EventTimeTrigger()
+        else:
+            return ProcessingTimeTrigger()
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return self._is_event_time
+
+    def __repr__(self):
+        return "TumblingWindowAssigner(%s,%s,%s)" % (self._size, self._offset, self._is_event_time)
+
+
+class CountTumblingWindowAssigner(WindowAssigner[T, CountWindow]):
+    """
+    A WindowAssigner that windows elements into fixed-size windows based on the number of
+    elements. Windows cannot overlap.
+    """
+
+    def __init__(self, window_size: int):
+        """
+        Windows this KeyedStream into tumbling count windows.
+        :param window_size: The size of the windows in number of elements.
+        """
+        self._window_size = window_size
+        self._counter_state_descriptor = ReducingStateDescriptor(

Review comment:
       Why not use ValueState? I think it has better performance over ReducingState.
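To make the reviewer's point concrete: the assigner only needs a running per-key element count, which a `ValueState[int]` can hold with one read and one write per element. The assignment rule itself is independent of the state backend. A hedged plain-Python sketch of that rule, with state replaced by an ordinary dict purely for illustration:

```python
from collections import defaultdict


class CountTumblingSketch:
    """Illustrative stand-in: the n-th element of a key lands in window id n // size."""

    def __init__(self, window_size: int):
        self._window_size = window_size
        self._counts = defaultdict(int)  # per-key element count, i.e. the "state"

    def assign(self, key) -> int:
        count = self._counts[key]           # analogous to ValueState.value()
        self._counts[key] = count + 1       # analogous to ValueState.update()
        return count // self._window_size   # the CountWindow id


assigner = CountTumblingSketch(window_size=3)
ids = [assigner.assign("a") for _ in range(7)]
print(ids)  # [0, 0, 0, 1, 1, 1, 2]
```

Whether `ValueState` actually outperforms `ReducingState` here depends on the backend; the sketch only shows that a single counter value suffices.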

##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -517,3 +538,721 @@ def __init__(self,
         self.window_state_descriptor = window_state_descriptor
         self.internal_window_function = internal_window_function
         self.window_serializer = window_serializer
+
+
+class EventTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to
+    which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: W,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: W,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[T, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+
+    def __init__(self, window_size: int):
+        self._window_size = window_size
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.LONG())
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        state.add(1)
+        if state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted. The window is not purged
+            #               though, all elements are retained.
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
+
+class TumblingWindowAssigner(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the current system time of the
+    machine the operation is running on. Windows cannot overlap.
+    For example, in order to window into tumbling windows of 1 minute:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(60000, 0, False))
+
+    A WindowAssigner that windows elements into windows based on the timestamp of the elements.
+    Windows cannot overlap.
+    For example, in order to window into windows of 1 minute:
+    ::
+            >>> data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            >>> .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(60000, 0, True))
+    """
+
+    def __init__(self, size: int, offset: int, is_event_time: bool):
+        if abs(offset) >= size:
+            raise Exception("TumblingWindowAssigner parameters must satisfy abs(offset) < size")
+
+        self._size = size
+        self._offset = offset
+        self._is_event_time = is_event_time
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        if self._is_event_time is False:
+            current_processing_time = context.get_current_processing_time()
+            start = TimeWindow.get_window_start_with_offset(current_processing_time, self._offset,
+                                                            self._size)
+            return [TimeWindow(start, start + self._size)]
+        else:
+            if timestamp > MIN_LONG_VALUE:
+                start = TimeWindow.get_window_start_with_offset(timestamp, self._offset, self._size)
+                return [TimeWindow(start, start + self._size)]
+            else:
+                raise Exception("Record has MIN_LONG_VALUE timestamp (= no timestamp marker). "
+                                + "Is the time characteristic set to 'ProcessingTime', or did you forget to call "
+                                + "'data_stream.assign_timestamps_and_watermarks(...)'?")
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        if self._is_event_time is True:
+            return EventTimeTrigger()
+        else:
+            return ProcessingTimeTrigger()
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return self._is_event_time
+
+    def __repr__(self):
+        return "TumblingWindowAssigner(%s,%s,%s)" % (self._size, self._offset, self._is_event_time)
+
+
+class CountTumblingWindowAssigner(WindowAssigner[T, CountWindow]):
+    """
+    A WindowAssigner that windows elements into fixed-size windows based on the number of
+    elements. Windows cannot overlap.
+    """
+
+    def __init__(self, window_size: int):
+        """
+        Windows this KeyedStream into tumbling count windows.
+        :param window_size: The size of the windows in number of elements.
+        """
+        self._window_size = window_size
+        self._counter_state_descriptor = ReducingStateDescriptor(
+            "assigner_counter", lambda a, b: a + b, Types.LONG())
+
+    @staticmethod
+    def of(window_size: int):
+        return CountTumblingWindowAssigner(window_size)
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[W]:
+        counter = context.get_runtime_context().get_reducing_state(
+            self._counter_state_descriptor)
+        if counter.get() is None:
+            counter.add(0)
+        result = [CountWindow(counter.get() // self._window_size)]
+        counter.add(1)
+        return result
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        return CountTrigger(self._window_size)
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return CountWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self) -> str:
+        return "CountTumblingWindowAssigner(%s)" % (self._window_size)
+
+
+class SlidingWindowAssigner(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into sliding windows based on the current system time
+    of the machine the operation is running on. Windows can possibly overlap.
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(SlidingWindowAssigner(60000, 10000, 0, False))
+
+    A WindowAssigner that windows elements into sliding windows based on the timestamp of the
+    elements. Windows can possibly overlap.
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+            >>> data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            >>> .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(SlidingWindowAssigner(60000, 10000, 0, True))
+    """
+
+    def __init__(self, size: int, slide: int, offset: int, is_event_time: bool):
+        if abs(offset) >= slide or size <= 0:
+            raise Exception("SlidingWindowAssigner parameters must satisfy "
+                            + "abs(offset) < slide and size > 0")
+
+        self._size = size
+        self._slide = slide
+        self._offset = offset
+        self._is_event_time = is_event_time
+        self._pane_size = math.gcd(size, slide)
+        self._num_panes_per_window = size // self._pane_size
+
+    def assign_windows(
+        self,
+        element: T,
+        timestamp: int,
+        context: 'WindowAssigner.WindowAssignerContext') -> Collection[W]:
+        if self._is_event_time is False:
+            current_processing_time = context.get_current_processing_time()
+            last_start = TimeWindow.get_window_start_with_offset(
+                current_processing_time, self._offset, self._slide)
+            windows = [TimeWindow(start, start + self._size)
+                       for start in range(last_start,
+                                          current_processing_time - self._size, -self._slide)]
+            return windows
+        else:
+            if timestamp > MIN_LONG_VALUE:
+                last_start = TimeWindow.get_window_start_with_offset(timestamp,
+                                                                     self._offset, self._slide)
+                windows = [TimeWindow(start, start + self._size)
+                           for start in range(last_start, timestamp - self._size, -self._slide)]
+                return windows
+            else:
+                raise Exception("Record has MIN_LONG_VALUE timestamp (= no timestamp marker). "
+                                + "Is the time characteristic set to 'ProcessingTime', "
+                                  "or did you forget to call "
+                                + "'data_stream.assign_timestamps_and_watermarks(...)'?")
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        if self._is_event_time is True:
+            return EventTimeTrigger()
+        else:
+            return ProcessingTimeTrigger()
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return self._is_event_time
+
+    def __repr__(self) -> str:
+        return "SlidingWindowAssigner(%s, %s, %s, %s)" % (
+            self._size, self._slide, self._offset, self._is_event_time)
+
+
+class CountSlidingWindowAssigner(WindowAssigner[T, CountWindow]):
+    """
+    A WindowAssigner that windows elements into sliding windows based on the number of
+    elements. Windows can possibly overlap.
+    """
+
+    def __init__(self, window_size: int, window_slide: int):
+        """
+        Windows this KeyedStream into sliding count windows.
+        :param window_size: The size of the windows in number of elements.
+        :param window_slide: The slide interval in number of elements.
+        """
+        self._window_size = window_size
+        self._window_slide = window_slide
+        self._count = None  # type: ValueState
+        self._counter_state_descriptor = ReducingStateDescriptor(
+            "slide-count-assigner", lambda a, b: a + b, Types.LONG())
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[W]:
+        count_descriptor = ValueStateDescriptor('slide-count-assigner', Types.LONG())
+        self._count = context.get_runtime_context().get_state(count_descriptor)
+        count_value = self._count.value()
+        if count_value is None:
+            current_count = 0
+        else:
+            current_count = count_value
+        self._count.update(current_count + 1)
+        last_id = current_count // self._window_slide
+        last_start = last_id * self._window_slide
+        last_end = last_start + self._window_size - 1
+        windows = []
+        while last_id >= 0 and last_start <= current_count <= last_end:
+            # The current element belongs to every overlapping window still in range.
+            windows.append(CountWindow(last_id))
+            last_id -= 1
+            last_start -= self._window_slide
+            last_end -= self._window_slide
+        return windows
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        return CountTrigger(self._window_size)
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return CountWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self):
+        return "CountSlidingWindowAssigner(%s, %s)" % (self._window_size, self._window_slide)
+
+
+class SessionWindowAssigner(MergingWindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into sessions based on the timestamp. Windows cannot
+    overlap.
+    """
+
+    def __init__(self, session_gap: int, is_event_time: bool):
+        if session_gap <= 0:
+            raise Exception("SessionWindowAssigner parameters must satisfy session_gap > 0")
+
+        self._session_gap = session_gap
+        self._is_event_time = is_event_time
+
+    def merge_windows(self,
+                      windows: Iterable[W],

Review comment:
    ```suggestion
                      windows: Iterable[TimeWindow],
    ```
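For context on what `merge_windows` has to compute: session windows are merged whenever their [start, end) ranges intersect. A minimal sketch of that merge pass, using plain (start, end) tuples instead of `TimeWindow` and assuming each end already includes the session gap (sessions that share a boundary point are treated as one session here):

```python
def merge_sessions(windows):
    # Merge any [start, end) intervals that intersect or touch,
    # scanning left to right after sorting by start.
    merged = []
    for start, end in sorted(windows):
        if merged and start <= merged[-1][1]:
            # Overlaps (or touches) the previous session: extend it.
            merged[-1] = (merged[-1][0], max(merged[-1][1], end))
        else:
            merged.append((start, end))
    return merged


# The first two sessions overlap and collapse into one; the third stays separate.
print(merge_sessions([(0, 5000), (4000, 9000), (20000, 25000)]))
```

The real assigner additionally reports which windows were merged back to the `MergeCallback`, so that window state and trigger timers can be combined.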

##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -517,3 +538,721 @@ def __init__(self,
         self.window_state_descriptor = window_state_descriptor
         self.internal_window_function = internal_window_function
         self.window_serializer = window_serializer
+
+
+class EventTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to
+    which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: W,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: W,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[T, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+
+    def __init__(self, window_size: int):
+        self._window_size = window_size
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.LONG())
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        state.add(1)
+        if state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted.
+            # The window is not purged, though; all elements are retained.
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
+
+class TumblingWindowAssigner(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the current system time of the
+    machine the operation is running on. Windows cannot overlap.
+    For example, in order to window into processing-time windows of 1 minute:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(60000, 0, False))
+
+    A WindowAssigner that windows elements into windows based on the timestamp of the elements.
+    Windows cannot overlap.
+    For example, in order to window into windows of 1 minute:
+     ::
+            >>> data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            >>> .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(60000, 0, True))
+    """
+
+    def __init__(self, size: int, offset: int, is_event_time: bool):
+        if abs(offset) >= size:
+            raise Exception("TumblingWindowAssigner parameters must satisfy abs(offset) < size")
+
+        self._size = size
+        self._offset = offset
+        self._is_event_time = is_event_time
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        if self._is_event_time is False:
+            current_processing_time = context.get_current_processing_time()
+            start = TimeWindow.get_window_start_with_offset(current_processing_time, self._offset,
+                                                            self._size)
+            return [TimeWindow(start, start + self._size)]
+        else:
+            if timestamp > MIN_LONG_VALUE:
+                start = TimeWindow.get_window_start_with_offset(timestamp, self._offset, self._size)
+                return [TimeWindow(start, start + self._size)]
+            else:
+                raise Exception("Record has MIN_LONG_VALUE timestamp (= no timestamp marker). "
+                                + "Is the time characteristic set to 'ProcessingTime', or did you forget to call "
+                                + "'data_stream.assign_timestamps_and_watermarks(...)'?")
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        if self._is_event_time is True:
+            return EventTimeTrigger()
+        else:
+            return ProcessingTimeTrigger()
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return self._is_event_time
+
+    def __repr__(self):
+        return "TumblingWindowAssigner(%s, %s, %s)" % (
+            self._size, self._offset, self._is_event_time)
+
+
+class CountTumblingWindowAssigner(WindowAssigner[T, CountWindow]):
+    """
+    A WindowAssigner that windows elements into fixed-size windows based on the count number
+    of the elements. Windows cannot overlap.
+    """
+
+    def __init__(self, window_size: int):
+        """
+        Windows this KeyedStream into tumbling count windows.
+        :param window_size: The size of the windows in number of elements.
+        """
+        self._window_size = window_size
+        self._counter_state_descriptor = ReducingStateDescriptor(
+            "assigner_counter", lambda a, b: a + b, Types.LONG())
+
+    @staticmethod
+    def of(window_size: int):
+        return CountTumblingWindowAssigner(window_size)
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[W]:
+        counter = context.get_runtime_context().get_reducing_state(
+            self._counter_state_descriptor)
+        if counter.get() is None:
+            counter.add(0)
+        result = [CountWindow(counter.get() // self._window_size)]
+        counter.add(1)
+        return result
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        return CountTrigger(self._window_size)
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return CountWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self) -> str:
+        return "CountTumblingWindowAssigner(%s)" % (self._window_size)
+
+
+class SlidingWindowAssigner(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into sliding windows based on the current system time
+    of the machine the operation is running on. Windows can possibly overlap.
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(SlidingWindowAssigner(60000, 10000, 0, False))
+
+    A WindowAssigner that windows elements into sliding windows based on the timestamp of the
+    elements. Windows can possibly overlap.
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+            >>> data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            >>> .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(SlidingWindowAssigner(60000, 10000, 0, True))
+    """
+
+    def __init__(self, size: int, slide: int, offset: int, is_event_time: bool):
+        if abs(offset) >= slide or size <= 0:
+            raise Exception("SlidingWindowAssigner parameters must satisfy "
+                            + "abs(offset) < slide and size > 0")
+
+        self._size = size
+        self._slide = slide
+        self._offset = offset
+        self._is_event_time = is_event_time
+        self._pane_size = math.gcd(size, slide)
+        self._num_panes_per_window = size // self._pane_size
+
+    def assign_windows(
+        self,
+        element: T,
+        timestamp: int,
+        context: 'WindowAssigner.WindowAssignerContext') -> Collection[W]:
+        if self._is_event_time is False:
+            current_processing_time = context.get_current_processing_time()
+            last_start = TimeWindow.get_window_start_with_offset(
+                current_processing_time, self._offset, self._slide)
+            windows = [TimeWindow(start, start + self._size)
+                       for start in range(last_start,
+                                          current_processing_time - self._size, -self._slide)]
+            return windows
+        else:
+            if timestamp > MIN_LONG_VALUE:
+                last_start = TimeWindow.get_window_start_with_offset(timestamp,
+                                                                     self._offset, self._slide)
+                windows = [TimeWindow(start, start + self._size)
+                           for start in range(last_start, timestamp - self._size, -self._slide)]
+                return windows
+            else:
+                raise Exception("Record has MIN_LONG_VALUE timestamp (= no timestamp marker). "
+                                + "Is the time characteristic set to 'ProcessingTime', "
+                                  "or did you forget to call "
+                                + "'data_stream.assign_timestamps_and_watermarks(...)'?")
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        if self._is_event_time is True:
+            return EventTimeTrigger()
+        else:
+            return ProcessingTimeTrigger()
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return self._is_event_time
+
+    def __repr__(self) -> str:
+        return "SlidingWindowAssigner(%s, %s, %s, %s)" % (
+            self._size, self._slide, self._offset, self._is_event_time)
+
+
+class CountSlidingWindowAssigner(WindowAssigner[T, CountWindow]):
+    """
+    A WindowAssigner that windows elements into sliding windows based on the count number of
+    the elements. Windows can possibly overlap.
+    """
+
+    def __init__(self, window_size: int, window_slide: int):
+        """
+        Windows this KeyedStream into sliding count windows.
+        :param window_size: The size of the windows in number of elements.
+        :param window_slide: The slide interval in number of elements.
+        """
+        self._window_size = window_size
+        self._window_slide = window_slide
+        self._count = None  # type: ValueState
+        self._counter_state_descriptor = ReducingStateDescriptor(
+            "slide-count-assigner", lambda a, b: a + b, Types.LONG())
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[W]:
+        count_descriptor = ValueStateDescriptor('slide-count-assigner', Types.LONG())
+        self._count = context.get_runtime_context().get_state(count_descriptor)
+        count_value = self._count.value()
+        if count_value is None:
+            current_count = 0
+        else:
+            current_count = count_value
+        self._count.update(current_count + 1)
+        last_id = current_count // self._window_slide
+        last_start = last_id * self._window_slide
+        last_end = last_start + self._window_size - 1
+        windows = []
+        while last_id >= 0 and last_start <= current_count <= last_end:
+            windows.append(CountWindow(last_id))
+            last_id -= 1
+            last_start -= self._window_slide
+            last_end -= self._window_slide
+        return windows
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        return CountTrigger(self._window_size)
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return CountWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self):
+        return "CountSlidingWindowAssigner(%s, %s)" % (self._window_size, self._window_slide)
+
+
+class SessionWindowAssigner(MergingWindowAssigner[T, TimeWindow]):
+    """
+    WindowAssigner that windows elements into sessions based on the timestamp. Windows
+    cannot overlap.
+    """
+
+    def __init__(self, session_gap: int, is_event_time: bool):
+        if session_gap <= 0:
+            raise Exception("SessionWindowAssigner parameters must satisfy 0 < session_gap")
+
+        self._session_gap = session_gap
+        self._is_event_time = is_event_time
+
+    def merge_windows(self,
+                      windows: Iterable[W],
+                      callback: 'MergingWindowAssigner.MergeCallback[W]') -> None:
+        window_list = [w for w in windows]
+        window_list.sort()
+        for i in range(1, len(window_list)):
+            if window_list[i - 1].end > window_list[i].start:
+                callback.merge([window_list[i - 1], window_list[i]],
+                               TimeWindow(window_list[i - 1].start, window_list[i].end))
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[W]:
+        if self._is_event_time is False:
+            timestamp = context.get_current_processing_time()
+
+        return [TimeWindow(timestamp, timestamp + self._session_gap)]
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        if self._is_event_time is True:
+            return EventTimeTrigger()
+        else:
+            return ProcessingTimeTrigger()
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return self._is_event_time
+
+    def __repr__(self):
+        return "SessionWindowAssigner(%s, %s)" % (self._session_gap, self._is_event_time)
+
+
+class DynamicSessionWindowAssigner(MergingWindowAssigner[T, TimeWindow]):
+    """
+    WindowAssigner that windows elements into sessions based on the timestamp, with the
+    session gap extracted from each element by a SessionWindowTimeGapExtractor. Windows
+    cannot overlap.
+    """
+
+    def __init__(self,
+                 session_window_time_gap_extractor: SessionWindowTimeGapExtractor,
+                 is_event_time: bool):
+        self._session_gap = None
+        self._session_window_time_gap_extractor = session_window_time_gap_extractor
+        self._is_event_time = is_event_time
+
+    def merge_windows(self,
+                      windows: Iterable[W],
+                      callback: 'MergingWindowAssigner.MergeCallback[W]') -> None:
+        window_list = [w for w in windows]
+        window_list.sort()
+        for i in range(1, len(window_list)):
+            if window_list[i - 1].end > window_list[i].start:
+                callback.merge([window_list[i - 1], window_list[i]],
+                               TimeWindow(window_list[i - 1].start, window_list[i].end))
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[W]:
+        if self._is_event_time is False:
+            timestamp = context.get_current_processing_time()
+        self._session_gap = self._session_window_time_gap_extractor.extract(element)
+        if self._session_gap <= 0:
+            raise Exception("Dynamic session time gap must satisfy 0 < gap")
+
+        return [TimeWindow(timestamp, timestamp + self._session_gap)]
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        if self._is_event_time is True:
+            return EventTimeTrigger()
+        else:
+            return ProcessingTimeTrigger()
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return self._is_event_time
+
+    def __repr__(self):
+        return "DynamicSessionWindowAssigner(%s, %s)" % (self._session_gap, self._is_event_time)
+
+
+class TumblingProcessingTimeWindows:
+    """
+    A WindowAssigner that windows elements into windows based on the current system time of
+    the machine the operation is running on. Windows cannot overlap.
+    For example, in order to window into windows of 1 minute, shifted by 10 seconds:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingProcessingTimeWindows.of(Time.minutes(1), Time.seconds(10)))
+    """
+
+    @staticmethod
+    def of(size: Time, offset=None):
+        """
+        Creates a new TumblingProcessingTimeWindows WindowAssigner that assigns elements to
+        time windows based on the element timestamp and offset.
+        For example, if you want to window a stream by hour, but the windows should begin
+        at the 15th minute of each hour, you can use of(Time.hours(1), Time.minutes(15));
+        you will then get time windows starting at 0:15:00, 1:15:00, 2:15:00, etc.
+        Alternatively, if you live in a timezone other than UTC±00:00, such as China,
+        which uses UTC+08:00, and you want a time window with a size of one day that
+        begins at every 00:00:00 of local time, you may use of(Time.days(1),
+        Time.hours(-8)). The offset parameter is Time.hours(-8) since UTC+08:00 is
+        8 hours ahead of UTC.
+
+        :param size: The size of the generated windows.
+        :param offset: The offset by which the window start is shifted.
+        :return: The time policy.
+        """
+        if offset is None:
+            return TumblingWindowAssigner(size.to_milliseconds(), 0, False)
+        else:
+            return TumblingWindowAssigner(size.to_milliseconds(), offset.to_milliseconds(), False)
+
+
+class TumblingEventTimeWindows:
+    """
+    A WindowAssigner that windows elements into windows based on the timestamp of the elements.
+    Windows cannot overlap.
+    For example, in order to window into windows of 1 minute:
+    ::
+            >>> data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            >>> .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingEventTimeWindows.of(Time.minutes(1)))
+    """
+
+    @staticmethod
+    def of(size: Time, offset=None):

Review comment:
       Move this method into TumblingWindowAssigner?
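
The offset semantics described in the `of` docstring above come down to one alignment step. A minimal sketch of that arithmetic, assuming it mirrors `TimeWindow.get_window_start_with_offset` (the helper the assigners call; `window_start_with_offset` is an illustrative stand-in, not the library function):

```python
# Align `timestamp` (ms) to the start of its tumbling window of
# `window_size` ms, with window boundaries shifted by `offset` ms.
# Python's % is non-negative for a positive modulus, so negative
# offsets (e.g. Time.hours(-8)) work without extra handling.
def window_start_with_offset(timestamp: int, offset: int, window_size: int) -> int:
    return timestamp - (timestamp - offset) % window_size
```

For of(Time.days(1), Time.hours(-8)), the offset is -8 * 3600 * 1000, so the daily window boundaries fall on local midnight in UTC+08:00.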

##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -517,3 +538,721 @@ def __init__(self,
         self.window_state_descriptor = window_state_descriptor
         self.internal_window_function = internal_window_function
         self.window_serializer = window_serializer
+
+
+class EventTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to
+    which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: W,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: W,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[T, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+
+    def __init__(self, window_size: int):
+        self._window_size = window_size
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.LONG())
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        state.add(1)
+        if state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted. The window is not purged
+            #               though, all elements are retained.
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
+
+class TumblingWindowAssigner(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the current system time of the
+    machine the operation is running on. Windows cannot overlap.
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(Time.minutes(1), Time.seconds(10), False))
+
+    A WindowAssigner that windows elements into windows based on the timestamp of the elements.
+    Windows cannot overlap.
+    For example, in order to window into windows of 1 minute:
+     ::
+            >>> data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            >>> .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(60000, 0, True))
+    """
+
+    def __init__(self, size: int, offset: int, is_event_time: bool):
+        if abs(offset) >= size:
+            raise Exception("TumblingWindowAssigner parameters must satisfy abs(offset) < size")
+
+        self._size = size
+        self._offset = offset
+        self._is_event_time = is_event_time
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        if self._is_event_time is False:
+            current_processing_time = context.get_current_processing_time()
+            start = TimeWindow.get_window_start_with_offset(current_processing_time, self._offset,
+                                                            self._size)
+            return [TimeWindow(start, start + self._size)]
+        else:
+            if timestamp > MIN_LONG_VALUE:
+                start = TimeWindow.get_window_start_with_offset(timestamp, self._offset, self._size)
+                return [TimeWindow(start, start + self._size)]
+            else:
+                raise Exception("Record has MIN_LONG_VALUE timestamp (= no timestamp marker). "
+                                + "Is the time characteristic set to 'ProcessingTime', or did you forget to call "
+                                + "'data_stream.assign_timestamps_and_watermarks(...)'?")
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        if self._is_event_time is True:
+            return EventTimeTrigger()
+        else:
+            return ProcessingTimeTrigger()
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return self._is_event_time
+
+    def __repr__(self):
+        return "TumblingWindowAssigner(%s,%s,%s)" % (self._size, self._offset, self.is_event_time)
+
+
+class CountTumblingWindowAssigner(WindowAssigner[T, CountWindow]):
+    """
+    A WindowAssigner that windows elements into fixed-size windows based on the count number
+    of the elements. Windows cannot overlap.
+    """
+
+    def __init__(self, window_size: int):
+        """
+        Windows this KeyedStream into tumbling count windows.
+        :param window_size: The size of the windows in number of elements.
+        """
+        self._window_size = window_size
+        self._counter_state_descriptor = ReducingStateDescriptor(
+            "assigner_counter", lambda a, b: a + b, Types.LONG())
+
+    @staticmethod
+    def of(window_size: int):
+        return CountTumblingWindowAssigner(window_size)
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[W]:
+        counter = context.get_runtime_context().get_reducing_state(
+            self._counter_state_descriptor)
+        if counter.get() is None:
+            counter.add(0)
+        result = [CountWindow(counter.get() // self._window_size)]
+        counter.add(1)
+        return result
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        return CountTrigger(self._window_size)
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return CountWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self) -> str:
+        return "CountTumblingWindowAssigner(%s)" % (self._window_size)
+
+
+class SlidingWindowAssigner(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into sliding windows based either on the current
+    system time of the machine the operation is running on (processing time) or on the
+    timestamp of the elements (event time). Windows can possibly overlap.
+
+    For example, in order to window into processing-time windows of 1 minute, every 10 seconds:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            ...     .window(SlidingWindowAssigner(60000, 10000, 0, False))
+
+    To window into event-time windows of 1 minute, every 10 seconds:
+    ::
+            >>> data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            ...     .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            ...     .window(SlidingWindowAssigner(60000, 10000, 0, True))
+    """
+
+    def __init__(self, size: int, slide: int, offset: int, is_event_time: bool):
+        if abs(offset) >= slide or size <= 0:
+            raise Exception("SlidingWindowAssigner parameters must satisfy "
+                            + "abs(offset) < slide and size > 0")
+
+        self._size = size
+        self._slide = slide
+        self._offset = offset
+        self._is_event_time = is_event_time
+        self._pane_size = math.gcd(size, slide)
+        self._num_panes_per_window = size // self._pane_size
+
+    def assign_windows(
+        self,
+        element: T,
+        timestamp: int,
+        context: 'WindowAssigner.WindowAssignerContext') -> Collection[W]:
+        if self._is_event_time is False:
+            current_processing_time = context.get_current_processing_time()
+            last_start = TimeWindow.get_window_start_with_offset(
+                current_processing_time, self._offset, self._slide)
+            windows = [TimeWindow(start, start + self._size)
+                       for start in range(last_start,
+                                          current_processing_time - self._size, -self._slide)]
+            return windows
+        else:
+            if timestamp > MIN_LONG_VALUE:
+                last_start = TimeWindow.get_window_start_with_offset(timestamp,
+                                                                     self._offset, self._slide)
+                windows = [TimeWindow(start, start + self._size)
+                           for start in range(last_start, timestamp - self._size, -self._slide)]
+                return windows
+            else:
+                raise Exception("Record has MIN_LONG_VALUE timestamp (= no timestamp marker). "
+                                + "Is the time characteristic set to 'ProcessingTime', "
+                                  "or did you forget to call "
+                                + "'data_stream.assign_timestamps_and_watermarks(...)'?")
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        if self._is_event_time is True:
+            return EventTimeTrigger()
+        else:
+            return ProcessingTimeTrigger()
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return self._is_event_time
+
+    def __repr__(self) -> str:
+        return "SlidingWindowAssigner(%s, %s, %s, %s)" % (
+            self._size, self._slide, self._offset, self._is_event_time)
+
+
+class CountSlidingWindowAssigner(WindowAssigner[T, CountWindow]):
+    """
+    A WindowAssigner that windows elements into sliding windows based on the count number of
+    the elements. Windows can possibly overlap.
+    """
+
+    def __init__(self, window_size: int, window_slide: int):
+        """
+        Windows this KeyedStream into sliding count windows.
+        :param window_size: The size of the windows in number of elements.
+        :param window_slide: The slide interval in number of elements.
+        """
+        self._window_size = window_size
+        self._window_slide = window_slide
+        self._count = None  # type: ValueState
+        self._counter_state_descriptor = ReducingStateDescriptor(
+            "slide-count-assigner", lambda a, b: a + b, Types.LONG())
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[W]:
+        count_descriptor = ValueStateDescriptor('slide-count-assigner', Types.LONG())
+        self._count = context.get_runtime_context().get_state(count_descriptor)
+        count_value = self._count.value()
+        if count_value is None:
+            current_count = 0
+        else:
+            current_count = count_value
+        self._count.update(current_count + 1)
+        last_id = current_count // self._window_slide
+        last_start = last_id * self._window_slide
+        last_end = last_start + self._window_size - 1
+        windows = []
+        while last_id >= 0 and last_start <= current_count <= last_end:
+            windows.append(CountWindow(last_id))
+            last_id -= 1
+            last_start -= self._window_slide
+            last_end -= self._window_slide
+        return windows
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        return CountTrigger(self._window_size)
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return CountWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self):
+        return "CountSlidingWindowAssigner(%s, %s)" % (self._window_size, self._window_slide)
+
+
+class SessionWindowAssigner(MergingWindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into sessions based on the timestamp. Windows cannot
+    overlap.
+    """
+
+    def __init__(self, session_gap: int, is_event_time: bool):
+        if session_gap <= 0:
+            raise Exception("SessionWindowAssigner parameters must satisfy session_gap > 0")
+
+        self._session_gap = session_gap
+        self._is_event_time = is_event_time
+
+    def merge_windows(self,
+                      windows: Iterable[W],
+                      callback: 'MergingWindowAssigner.MergeCallback[W]') -> None:
+        window_list = [w for w in windows]
+        window_list.sort()
+        for i in range(1, len(window_list)):
+            if window_list[i - 1].end > window_list[i].start:
+                callback.merge([window_list[i - 1], window_list[i]],
+                               TimeWindow(window_list[i - 1].start, window_list[i].end))
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[W]:
+        if self._is_event_time is False:
+            timestamp = context.get_current_processing_time()
+
+        return [TimeWindow(timestamp, timestamp + self._session_gap)]
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        if self._is_event_time is True:
+            return EventTimeTrigger()
+        else:
+            return ProcessingTimeTrigger()
+
+    def get_window_serializer(self) -> TypeSerializer[W]:

Review comment:
       ```suggestion
       def get_window_serializer(self) -> TypeSerializer[TimeWindow]:
       ```
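For context on the assignment arithmetic in the diff above, here is a minimal, self-contained sketch (plain Python, no pyflink dependencies; the function names are illustrative, and `get_window_start_with_offset` mirrors the helper formula the assigners rely on) of how sliding time windows and sliding count windows are computed for a single element:

```python
def get_window_start_with_offset(timestamp: int, offset: int, window_size: int) -> int:
    # Start of the latest window of length `window_size` (aligned to `offset`)
    # that contains `timestamp`.
    return timestamp - (timestamp - offset + window_size) % window_size


def assign_sliding_time_windows(timestamp: int, size: int, slide: int, offset: int = 0):
    # Mirrors SlidingWindowAssigner.assign_windows: walk backwards from the
    # latest window start in steps of `slide` while the window still covers
    # the timestamp. Returns (start, end) pairs.
    last_start = get_window_start_with_offset(timestamp, offset, slide)
    return [(start, start + size)
            for start in range(last_start, timestamp - size, -slide)]


def assign_sliding_count_windows(current_count: int, window_size: int, window_slide: int):
    # Mirrors CountSlidingWindowAssigner.assign_windows: collect the ids of
    # all count windows that contain the element with index `current_count`.
    last_id = current_count // window_slide
    last_start = last_id * window_slide
    last_end = last_start + window_size - 1
    windows = []
    while last_id >= 0 and last_start <= current_count <= last_end:
        windows.append(last_id)
        last_id -= 1
        last_start -= window_slide
        last_end -= window_slide
    return windows
```

For example, with 1-minute windows sliding every 10 seconds, an element at timestamp 65000 ms falls into the six windows starting at 10000 through 60000; with `window_size=3, window_slide=1`, the element with count 5 belongs to count windows 3, 4 and 5.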




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 1240d4313817b4ae0bfca21090eb6f1cbd40f78e Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 14ae6d15c25f454fe7675bf7b0e129834b99ae37 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708) 
   * a36182858952a61ef43ccbdc184440354bb86110 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 1240d4313817b4ae0bfca21090eb6f1cbd40f78e Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684) 
   * 4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32897",
       "triggerID" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "678546804066c142c23cdcc36e88dc442210282a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900",
       "triggerID" : "678546804066c142c23cdcc36e88dc442210282a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32908",
       "triggerID" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32914",
       "triggerID" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32946",
       "triggerID" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "triggerType" : "PUSH"
     }, {
       "hash" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32947",
       "triggerID" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "195bf4369bf1eb0cdc1727847e63ce0d6fd31f66",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "195bf4369bf1eb0cdc1727847e63ce0d6fd31f66",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 84846361d6db21a5ac721494ebb5badf23b274a1 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32947) 
   * 195bf4369bf1eb0cdc1727847e63ce0d6fd31f66 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9912efdd986957fee47f630ce4368d76bfabeac6",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "9912efdd986957fee47f630ce4368d76bfabeac6",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * bb918383e68f2c6539ceab99853972a73e9fec0c Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423) 
   * a4a5fc419dbf82c079a24327846d4a7baa500959 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531) 
   * 9912efdd986957fee47f630ce4368d76bfabeac6 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] dianfu commented on a change in pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
dianfu commented on a change in pull request #18957:
URL: https://github.com/apache/flink/pull/18957#discussion_r824431485



##########
File path: flink-python/pyflink/datastream/data_stream.py
##########
@@ -38,7 +38,7 @@
 from pyflink.datastream.state import ValueStateDescriptor, ValueState, ListStateDescriptor
 from pyflink.datastream.utils import convert_to_python_obj
 from pyflink.java_gateway import get_gateway
-
+from pyflink.datastream.window import CountTumblingWindowAssigner, CountSlidingWindowAssigner

Review comment:
       Merge it with the import statement in line 24

##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -517,3 +543,795 @@ def __init__(self,
         self.window_state_descriptor = window_state_descriptor
         self.internal_window_function = internal_window_function
         self.window_serializer = window_serializer
+
+
+class SessionWindowTimeGapExtractor(ABC):
+    """
+    A {@code SessionWindowTimeGapExtractor} extracts session time gaps for Dynamic Session Window
+    Assigners.
+    :param <ABC> The type of elements that this {@code SessionWindowTimeGapExtractor} can extract
+        session time gaps from.
+    """
+
+    @abstractmethod
+    def extract(self, element: Any) -> int:
+        """
+        Extracts the session time gap.
+        :param element The input element.
+        :return The session time gap in milliseconds.
+        """
+        pass
+
+
+class EventTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to
+    which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[T, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+
+    def __init__(self, window_size: int):
+        self._window_size = window_size
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.LONG())
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        reduce_state = ctx.get_partitioned_state(self._count_state_descriptor)  # type:  ReducingState
+        reduce_state.add(1)
+        if reduce_state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted. The window is not purged
+            #               though, all elements are retained.

Review comment:
       correct the indentation
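The counting behaviour of the `CountTrigger` quoted above can be sketched stand-alone, without the pyflink runtime. This is a minimal, hypothetical re-implementation for illustration only (a plain dict stands in for Flink's partitioned `ReducingState`, and the counter is reset on FIRE, as Flink's Java `CountTrigger` does); it is not the PR's actual API.

```python
from enum import Enum


class TriggerResult(Enum):
    # On FIRE the window is evaluated and results are emitted; the
    # window contents are retained, not purged.
    CONTINUE = 0
    FIRE = 1


class SimpleCountTrigger:
    """Fires once the count of elements in a pane reaches window_size."""

    def __init__(self, window_size: int):
        self._window_size = window_size
        self._counts = {}  # stand-in for per-window ReducingState

    def on_element(self, window: str) -> TriggerResult:
        # Equivalent of reduce_state.add(1) with a sum-reducing function.
        self._counts[window] = self._counts.get(window, 0) + 1
        if self._counts[window] >= self._window_size:
            self._counts[window] = 0  # reset the counter on FIRE
            return TriggerResult.FIRE
        return TriggerResult.CONTINUE


trigger = SimpleCountTrigger(3)
results = [trigger.on_element("w1").name for _ in range(4)]
print(results)  # ['CONTINUE', 'CONTINUE', 'FIRE', 'CONTINUE']
```

Note that the FIRE result only emits the pane; purging would require a separate FIRE_AND_PURGE result, which this sketch omits.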

##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -517,3 +543,795 @@ def __init__(self,
         self.window_state_descriptor = window_state_descriptor
         self.internal_window_function = internal_window_function
         self.window_serializer = window_serializer
+
+
+class SessionWindowTimeGapExtractor(ABC):
+    """
+    A {@code SessionWindowTimeGapExtractor} extracts session time gaps for Dynamic Session Window
+    Assigners.
+    :param <ABC> The type of elements that this {@code SessionWindowTimeGapExtractor} can extract
+        session time gaps from.

Review comment:
       correct the indentation
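The `SessionWindowTimeGapExtractor` contract quoted above can be illustrated with a self-contained sketch. The `FieldBasedGapExtractor` subclass and its `gap_ms` field are hypothetical, chosen only to show how a per-element gap enables dynamic session windows; they are not part of the PR.

```python
from abc import ABC, abstractmethod
from typing import Any


class SessionWindowTimeGapExtractor(ABC):
    """Extracts session time gaps for dynamic session window assigners."""

    @abstractmethod
    def extract(self, element: Any) -> int:
        """Return the session time gap in milliseconds for this element."""


class FieldBasedGapExtractor(SessionWindowTimeGapExtractor):
    """Hypothetical extractor reading the gap from a 'gap_ms' field."""

    def extract(self, element: Any) -> int:
        # Each element carries its own session gap, so different elements
        # can extend their session window by different amounts.
        return element["gap_ms"]


extractor = FieldBasedGapExtractor()
print(extractor.extract({"user": "a", "gap_ms": 5000}))  # 5000
```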

##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -517,3 +543,795 @@ def __init__(self,
         self.window_state_descriptor = window_state_descriptor
         self.internal_window_function = internal_window_function
         self.window_serializer = window_serializer
+
+
+class SessionWindowTimeGapExtractor(ABC):
+    """
+    A {@code SessionWindowTimeGapExtractor} extracts session time gaps for Dynamic Session Window
+    Assigners.
+    :param <ABC> The type of elements that this {@code SessionWindowTimeGapExtractor} can extract
+        session time gaps from.
+    """
+
+    @abstractmethod
+    def extract(self, element: Any) -> int:
+        """
+        Extracts the session time gap.
+        :param element The input element.
+        :return The session time gap in milliseconds.
+        """
+        pass
+
+
+class EventTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to
+    which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[T, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+
+    def __init__(self, window_size: int):
+        self._window_size = window_size
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.LONG())
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        reduce_state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        reduce_state.add(1)
+        if reduce_state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted. The window is not purged
+            #               though, all elements are retained.
+            reduce_state.clear()
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
+
+class CountTumblingWindowAssigner(WindowAssigner[T, CountWindow]):
+    """
+    A WindowAssigner that windows elements into fixed-size windows based on the number of
+    elements. Windows cannot overlap.
+    """
+
+    def __init__(self, window_size: int):
+        """
+        Windows this KeyedStream into tumbling count windows.
+        :param window_size: The size of the windows in number of elements.
+        """
+        self._window_size = window_size
+        self._count_descriptor = ValueStateDescriptor('count-assigner', Types.LONG())
+
+    @staticmethod
+    def of(window_size: int):
+        return CountTumblingWindowAssigner(window_size)
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[CountWindow]:
+        count = context.get_runtime_context().get_state(self._count_descriptor)
+        count_value = count.value()
+        if count_value is None:
+            current_count = 0
+        else:
+            current_count = count_value
+        count.update(current_count + 1)
+        result = [CountWindow(current_count // self._window_size)]
+        return result
+
+    def get_default_trigger(self, env) -> Trigger[T, CountWindow]:
+        return CountTrigger(self._window_size)
+
+    def get_window_serializer(self) -> TypeSerializer[CountWindow]:
+        return CountWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self) -> str:
+        return "CountTumblingWindowAssigner(%s)" % self._window_size
+
+
+class CountSlidingWindowAssigner(WindowAssigner[T, CountWindow]):
+    """
+    A WindowAssigner that windows elements into sliding windows based on the number of
+    elements. Windows can possibly overlap.
+    """
+
+    def __init__(self, window_size: int, window_slide: int):
+        """
+        Windows this KeyedStream into sliding count windows.
+        :param window_size: The size of the windows in number of elements.
+        :param window_slide: The slide interval in number of elements.
+        """
+        self._window_size = window_size
+        self._window_slide = window_slide
+        self._count_descriptor = ValueStateDescriptor('slide-count-assigner', Types.LONG())
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[CountWindow]:
+        count = context.get_runtime_context().get_state(self._count_descriptor)  # type: ValueState
+        count_value = count.value()
+        if count_value is None:
+            current_count = 0
+        else:
+            current_count = count_value
+        count.update(current_count + 1)
+        last_id = current_count // self._window_slide
+        last_start = last_id * self._window_slide
+        last_end = last_start + self._window_size - 1
+        windows = []
+        while last_id >= 0 and last_start <= current_count <= last_end:
+            if last_start <= current_count <= last_end:
+                windows.append(CountWindow(last_id))
+            last_id -= 1
+            last_start -= self._window_slide
+            last_end -= self._window_slide
+        return windows
+
+    def get_default_trigger(self, env) -> Trigger[T, CountWindow]:
+        return CountTrigger(self._window_size)
+
+    def get_window_serializer(self) -> TypeSerializer[CountWindow]:
+        return CountWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self):
+        return "CountSlidingWindowAssigner(%s, %s)" % (self._window_size, self._window_slide)
+
+
+class TumblingProcessingTimeWindows(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the current system time of
+    the machine the operation is running on. Windows cannot overlap.
+
+    For example, in order to window into windows of 1 minute, offset by 10 seconds:
+    ::
+    >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+    >>> .window(TumblingProcessingTimeWindows.of(Time.minutes(1), Time.seconds(10)))
+    """
+    def __init__(self, size: int, offset: int):
+        if abs(offset) >= size:
+            raise Exception("TumblingProcessingTimeWindows parameters must satisfy "
+                            "abs(offset) < size")
+
+        self._size = size
+        self._offset = offset
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        current_processing_time = context.get_current_processing_time()
+        start = TimeWindow.get_window_start_with_offset(current_processing_time, self._offset,
+                                                        self._size)
+        return [TimeWindow(start, start + self._size)]
+
+    def get_default_trigger(self, env) -> Trigger[T, TimeWindow]:
+        return ProcessingTimeTrigger()
+
+    @staticmethod
+    def of(size: Time, offset=None):

Review comment:
       ```suggestion
           def of(size: Time, offset: Time = None):
       ```

##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -517,3 +543,795 @@ def __init__(self,
         self.window_state_descriptor = window_state_descriptor
         self.internal_window_function = internal_window_function
         self.window_serializer = window_serializer
+
+
+class SessionWindowTimeGapExtractor(ABC):
+    """
+    A {@code SessionWindowTimeGapExtractor} extracts session time gaps for Dynamic Session Window
+    Assigners.
+    :param <ABC> The type of elements that this {@code SessionWindowTimeGapExtractor} can extract
+        session time gaps from.
+    """
+
+    @abstractmethod
+    def extract(self, element: Any) -> int:
+        """
+        Extracts the session time gap.
+        :param element: The input element.
+        :return: The session time gap in milliseconds.
+        """
+        pass
+
+
+class EventTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to
+    which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[T, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+
+    def __init__(self, window_size: int):
+        self._window_size = window_size
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.LONG())
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        reduce_state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        reduce_state.add(1)
+        if reduce_state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted. The window is not purged
+            #               though, all elements are retained.
+            reduce_state.clear()
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
+
+class CountTumblingWindowAssigner(WindowAssigner[T, CountWindow]):
+    """
+    A WindowAssigner that windows elements into fixed-size windows based on the number of
+    elements. Windows cannot overlap.
+    """
+
+    def __init__(self, window_size: int):
+        """
+        Windows this KeyedStream into tumbling count windows.
+        :param window_size: The size of the windows in number of elements.
+        """
+        self._window_size = window_size
+        self._count_descriptor = ValueStateDescriptor('count-assigner', Types.LONG())

Review comment:
       ```suggestion
               self._count_descriptor = ValueStateDescriptor('tumble-count-assigner', Types.LONG())
       ```

##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -517,3 +543,795 @@ def __init__(self,
         self.window_state_descriptor = window_state_descriptor
         self.internal_window_function = internal_window_function
         self.window_serializer = window_serializer
+
+
+class SessionWindowTimeGapExtractor(ABC):
+    """
+    A {@code SessionWindowTimeGapExtractor} extracts session time gaps for Dynamic Session Window
+    Assigners.
+    :param <ABC> The type of elements that this {@code SessionWindowTimeGapExtractor} can extract
+        session time gaps from.
+    """
+
+    @abstractmethod
+    def extract(self, element: Any) -> int:
+        """
+        Extracts the session time gap.
+        :param element: The input element.
+        :return: The session time gap in milliseconds.
+        """
+        pass
+
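As a standalone illustration of the contract above (independent of PyFlink; the tuple element shape and the `FieldGapExtractor` name below are hypothetical), an extractor might read the gap from a field of each element:

```python
from abc import ABC, abstractmethod
from typing import Any, Tuple


class SessionWindowTimeGapExtractor(ABC):
    """Extracts a session time gap, in milliseconds, from each element."""

    @abstractmethod
    def extract(self, element: Any) -> int:
        ...


class FieldGapExtractor(SessionWindowTimeGapExtractor):
    """Hypothetical extractor: the gap is the second field of a (key, gap_ms) tuple."""

    def extract(self, element: Tuple[str, int]) -> int:
        return element[1]


extractor = FieldGapExtractor()
print(extractor.extract(("user-1", 5000)))  # 5000: this element's session gap in ms
```

Returning a per-element gap is what makes the session windows "dynamic": each element can extend its session by a different amount.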
+
+class EventTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
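The firing rule above can be sketched with plain values: fire as soon as the watermark has passed the window's maximum timestamp, otherwise continue (the real trigger additionally registers an event-time timer). This is a simplified model, not PyFlink's actual `Trigger` API:

```python
from enum import Enum


class TriggerResult(Enum):
    CONTINUE = 0
    FIRE = 1


def event_time_decision(window_max_timestamp: int, watermark: int) -> TriggerResult:
    # Fire once the watermark passes the end of the window; otherwise keep waiting.
    if window_max_timestamp <= watermark:
        return TriggerResult.FIRE
    return TriggerResult.CONTINUE


# A window covering [0, 1000) has max timestamp 999; it fires once the watermark reaches 999.
print(event_time_decision(999, 500))   # TriggerResult.CONTINUE
print(event_time_decision(999, 999))   # TriggerResult.FIRE
```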
+
+class ProcessingTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to
+    which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[T, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+
+    def __init__(self, window_size: int):
+        self._window_size = window_size
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.LONG())
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        reduce_state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        reduce_state.add(1)
+        if reduce_state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted. The window is not purged
+            #               though, all elements are retained.
+            reduce_state.clear()
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
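The count-based firing behaviour can be simulated without Flink state: the counter increments per element and is reset each time it reaches the window size, mirroring the `reduce_state.clear()` before returning FIRE. A sketch, not the real `ReducingState`-backed trigger:

```python
def count_trigger_fires(num_elements: int, window_size: int) -> list:
    """Return the 1-based element indices at which a count trigger fires."""
    fires = []
    count = 0
    for i in range(1, num_elements + 1):
        count += 1                 # reduce_state.add(1)
        if count >= window_size:
            count = 0              # reduce_state.clear()
            fires.append(i)        # TriggerResult.FIRE
    return fires


print(count_trigger_fires(10, 3))  # [3, 6, 9]: fires after every 3rd element
```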
+
+class CountTumblingWindowAssigner(WindowAssigner[T, CountWindow]):
+    """
+    A WindowAssigner that windows elements into fixed-size windows based on the number of
+    elements. Windows cannot overlap.
+    """
+
+    def __init__(self, window_size: int):
+        """
+        Windows this KeyedStream into tumbling count windows.
+        :param window_size: The size of the windows in number of elements.
+        """
+        self._window_size = window_size
+        self._count_descriptor = ValueStateDescriptor('count-assigner', Types.LONG())
+
+    @staticmethod
+    def of(window_size: int):
+        return CountTumblingWindowAssigner(window_size)
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[CountWindow]:
+        count = context.get_runtime_context().get_state(self._count_descriptor)
+        count_value = count.value()
+        if count_value is None:
+            current_count = 0
+        else:
+            current_count = count_value
+        count.update(current_count + 1)
+        result = [CountWindow(current_count // self._window_size)]
+        return result
+
+    def get_default_trigger(self, env) -> Trigger[T, CountWindow]:
+        return CountTrigger(self._window_size)
+
+    def get_window_serializer(self) -> TypeSerializer[CountWindow]:
+        return CountWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self) -> str:
+        return "CountTumblingWindowAssigner(%s)" % self._window_size
+
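The assignment above boils down to integer division: the n-th element (0-based, read before the counter is incremented) belongs to window id n // window_size. A standalone check of that mapping:

```python
def tumbling_count_window_id(current_count: int, window_size: int) -> int:
    # Mirrors assign_windows: CountWindow(current_count // self._window_size)
    return current_count // window_size


# With a window size of 3, the first three elements share window 0, and so on.
print([tumbling_count_window_id(c, 3) for c in range(7)])  # [0, 0, 0, 1, 1, 1, 2]
```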
+
+class CountSlidingWindowAssigner(WindowAssigner[T, CountWindow]):
+    """
+    A WindowAssigner that windows elements into sliding windows based on the number of
+    elements. Windows can possibly overlap.
+    """
+
+    def __init__(self, window_size: int, window_slide: int):
+        """
+        Windows this KeyedStream into sliding count windows.
+        :param window_size: The size of the windows in number of elements.
+        :param window_slide: The slide interval in number of elements.
+        """
+        self._window_size = window_size
+        self._window_slide = window_slide
+        self._count_descriptor = ValueStateDescriptor('slide-count-assigner', Types.LONG())
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[CountWindow]:
+        count = context.get_runtime_context().get_state(self._count_descriptor)  # type: ValueState
+        count_value = count.value()
+        if count_value is None:
+            current_count = 0
+        else:
+            current_count = count_value
+        count.update(current_count + 1)
+        last_id = current_count // self._window_slide
+        last_start = last_id * self._window_slide
+        last_end = last_start + self._window_size - 1
+        windows = []
+        while last_id >= 0 and last_start <= current_count <= last_end:
+            if last_start <= current_count <= last_end:
+                windows.append(CountWindow(last_id))
+            last_id -= 1
+            last_start -= self._window_slide
+            last_end -= self._window_slide
+        return windows
+
+    def get_default_trigger(self, env) -> Trigger[T, CountWindow]:
+        return CountTrigger(self._window_size)
+
+    def get_window_serializer(self) -> TypeSerializer[CountWindow]:
+        return CountWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self):
+        return "CountSlidingWindowAssigner(%s, %s)" % (self._window_size, self._window_slide)
+
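The overlapping assignment can likewise be traced with plain integers: window id n covers element counts [n * slide, n * slide + size - 1], and the loop walks backwards from the newest candidate window. A sketch of the same arithmetic:

```python
def sliding_count_window_ids(current_count: int, size: int, slide: int) -> list:
    """Window ids containing a 0-based element count, newest window first."""
    last_id = current_count // slide
    last_start = last_id * slide
    last_end = last_start + size - 1
    windows = []
    while last_id >= 0 and last_start <= current_count <= last_end:
        windows.append(last_id)
        last_id -= 1
        last_start -= slide
        last_end -= slide
    return windows


# size=4, slide=2: element count 5 falls in windows [4, 7] (id 2) and [2, 5] (id 1).
print(sliding_count_window_ids(5, size=4, slide=2))  # [2, 1]
```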
+
+class TumblingProcessingTimeWindows(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the current system time of
+    the machine the operation is running on. Windows cannot overlap.
+
+    For example, in order to window into windows of 1 minute, offset by 10 seconds:
+    ::
+    >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+    >>> .window(TumblingProcessingTimeWindows.of(Time.minutes(1), Time.seconds(10)))
+    """
+    def __init__(self, size: int, offset: int):
+        if abs(offset) >= size:
+            raise Exception("TumblingProcessingTimeWindows parameters must satisfy "
+                            "abs(offset) < size")
+
+        self._size = size
+        self._offset = offset
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        current_processing_time = context.get_current_processing_time()
+        start = TimeWindow.get_window_start_with_offset(current_processing_time, self._offset,
+                                                        self._size)
+        return [TimeWindow(start, start + self._size)]
+
+    def get_default_trigger(self, env) -> Trigger[T, TimeWindow]:
+        return ProcessingTimeTrigger()
+
+    @staticmethod
+    def of(size: Time, offset=None):
+        """
+        Creates a new TumblingProcessingTimeWindows WindowAssigner that assigns
+        elements to time windows based on the element timestamp and offset.
+
+        For example, if you want to window a stream by hour, but the windows should begin at
+        the 15th minute of each hour, you can use of(Time.hours(1), Time.minutes(15)); you
+        will then get time windows that start at 0:15:00, 1:15:00, 2:15:00, etc.
+
+        If, instead, you live somewhere that does not use UTC+00:00 time, such as China,
+        which uses UTC+08:00, and you want a time window with a size of one day that begins
+        at every 00:00:00 of local time, you can use of(Time.days(1), Time.hours(-8)). The
+        offset parameter is Time.hours(-8) since UTC+08:00 is 8 hours ahead of UTC time.
+
+        :param size: The size of the generated windows.
+        :param offset: The offset by which the window start is shifted.
+        :return: The time policy.
+        """
+        if offset is None:
+            return TumblingProcessingTimeWindows(size.to_milliseconds(), 0)
+        else:
+            return TumblingProcessingTimeWindows(size.to_milliseconds(), offset.to_milliseconds())
+
+    def get_window_serializer(self) -> TypeSerializer[TimeWindow]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self):
+        return "TumblingProcessingTimeWindows(%s, %s)" % (self._size, self._offset)
+
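The offset behaviour described in of() can be checked against the window-start formula. In Python, where % with a positive modulus is always non-negative, the alignment reduces to one line; this is a sketch of what TimeWindow.get_window_start_with_offset computes, not the actual implementation:

```python
def window_start_with_offset(timestamp: int, offset: int, size: int) -> int:
    # Align the timestamp down to the start of its (offset-shifted) window.
    return timestamp - (timestamp - offset) % size


MINUTE = 60_000          # milliseconds
HOUR = 60 * MINUTE
DAY = 24 * HOUR

# Hourly windows shifted by 15 minutes: an event at 2:00:00 lands in [1:15, 2:15).
print(window_start_with_offset(2 * HOUR, 15 * MINUTE, HOUR) == HOUR + 15 * MINUTE)

# Daily windows with offset Time.hours(-8): window starts align to UTC+08:00 midnights.
print(window_start_with_offset(DAY + HOUR, -8 * HOUR, DAY) == 16 * HOUR)
```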
+
+class TumblingEventTimeWindows(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the timestamp of the
+    elements. Windows cannot overlap.
+
+    For example, in order to window into windows of 1 minute:
+    ::
+    >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+    >>> .window(TumblingEventTimeWindows.of(Time.minutes(1)))
+    """
+
+    def __init__(self, size: int, offset: int):
+        if abs(offset) >= size:
+            raise Exception("TumblingEventTimeWindows parameters must satisfy abs(offset) < size")
+
+        self._size = size
+        self._offset = offset
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        if timestamp > MIN_LONG_VALUE:
+            start = TimeWindow.get_window_start_with_offset(timestamp, self._offset, self._size)
+            return [TimeWindow(start, start + self._size)]
+        else:
+            raise Exception("Record has jvm.java.lang.Long.MIN_VALUE timestamp (= no timestamp marker). "
+                            + "Is the time characteristic set to 'ProcessingTime', "
+                            + "or did you forget to call "
+                            + "'data_stream.assign_timestamps_and_watermarks(...)'?")
+
+    def get_default_trigger(self, env) -> Trigger[T, TimeWindow]:
+        return EventTimeTrigger()
+
+    @staticmethod
+    def of(size: Time, offset=None):
+        """
+        Creates a new TumblingEventTimeWindows WindowAssigner that assigns elements
+        to time windows based on the element timestamp and offset.
+
+        :param size: The size of the generated windows.
+        :param offset: The offset by which the window start is shifted.
+        """
+        if offset is None:
+            return TumblingEventTimeWindows(size.to_milliseconds(), 0)
+        else:
+            return TumblingEventTimeWindows(size.to_milliseconds(), offset.to_milliseconds())
+
+    def get_window_serializer(self) -> TypeSerializer[TimeWindow]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return True
+
+    def __repr__(self):
+        return "TumblingEventTimeWindows(%s, %s)" % (self._size, self._offset)
+
+
+class SlidingProcessingTimeWindows(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into sliding windows based on the current system
+    time of the machine the operation is running on. Windows can possibly overlap.
+
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+    >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+    >>> .window(SlidingProcessingTimeWindows.of(Time.minutes(1), Time.seconds(10)))
+    """
+
+    def __init__(self, size: int, slide: int, offset: int):
+        if abs(offset) >= slide or size <= 0:
+            raise Exception("SlidingProcessingTimeWindows parameters must satisfy "
+                            + "abs(offset) < slide and size > 0")
+
+        self._size = size
+        self._slide = slide
+        self._offset = offset
+        self._pane_size = math.gcd(size, slide)
+        self._num_panes_per_window = size // self._pane_size
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[TimeWindow]:
+        current_processing_time = context.get_current_processing_time()
+        last_start = TimeWindow.get_window_start_with_offset(
+            current_processing_time, self._offset, self._slide)
+        windows = [TimeWindow(start, start + self._size)
+                   for start in range(last_start,
+                                      current_processing_time - self._size, -self._slide)]
+        return windows
+
+    def get_default_trigger(self, env) -> Trigger[T, TimeWindow]:
+        return ProcessingTimeTrigger()
+
+    @staticmethod
+    def of(size: Time, slide: Time, offset=None):
+        """
+        Creates a new SlidingProcessingTimeWindows WindowAssigner that assigns
+        elements to time windows based on the current processing time and offset.
+
+        For example, if you want to window a stream by hour, sliding every hour, but the window
+        begins at the 15th minute of each hour, you can use
+        of(Time.hours(1), Time.hours(1), Time.minutes(15)); you will then get time windows that
+        start at 0:15:00, 1:15:00, 2:15:00, etc.
+
+        If you live somewhere that does not use UTC±00:00 time, such as China, which uses
+        UTC+08:00, and you want a time window with a size of one day that begins at every
+        00:00:00 of local time, you may use of(Time.days(1), Time.days(1), Time.hours(-8)).
+        The offset parameter is Time.hours(-8) since UTC+08:00 is 8 hours ahead of UTC.
+
+        :param size: The size of the generated windows.
+        :param slide: The slide interval of the generated windows.
+        :param offset: The offset which window start would be shifted by.
+        :return: The time policy.
+        """
+        if offset is None:
+            return SlidingProcessingTimeWindows(size.to_milliseconds(), slide.to_milliseconds(), 0)
+        else:
+            return SlidingProcessingTimeWindows(size.to_milliseconds(), slide.to_milliseconds(),
+                                                offset.to_milliseconds())
+
+    def get_window_serializer(self) -> TypeSerializer[TimeWindow]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self) -> str:
+        return "SlidingProcessingTimeWindows(%s, %s, %s)" % (self._size, self._slide, self._offset)
+
+
+class SlidingEventTimeWindows(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into sliding windows based on the timestamp of the
+    elements. Windows can possibly overlap.
+
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+    >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+    >>> .window(SlidingEventTimeWindows.of(Time.minutes(1), Time.seconds(10)))
+    """
+
+    def __init__(self, size: int, slide: int, offset: int):
+        if abs(offset) >= slide or size <= 0:
+            raise Exception("SlidingEventTimeWindows parameters must satisfy "
+                            + "abs(offset) < slide and size > 0")
+
+        self._size = size
+        self._slide = slide
+        self._offset = offset
+        self._pane_size = math.gcd(size, slide)
+        self._num_panes_per_window = size // self._pane_size
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[TimeWindow]:
+        if timestamp > MIN_LONG_VALUE:
+            last_start = TimeWindow.get_window_start_with_offset(timestamp,
+                                                                 self._offset, self._slide)
+            windows = [TimeWindow(start, start + self._size)
+                       for start in range(last_start, timestamp - self._size, -self._slide)]
+            return windows
+        else:
+            raise Exception("Record has jvm.java.lang.Long.MIN_VALUE timestamp (= no timestamp marker). "
+                            + "Is the time characteristic set to 'ProcessingTime', "
+                            + "or did you forget to call "
+                            + "'data_stream.assign_timestamps_and_watermarks(...)'?")
+
+    def get_default_trigger(self, env) -> Trigger[T, TimeWindow]:
+        return EventTimeTrigger()
+
+    @staticmethod
+    def of(size: Time, slide: Time, offset=None):

Review comment:
       ditto
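
       For anyone double-checking the window-start math in these `assign_windows` implementations: the logic can be sanity-checked outside Flink with a plain-Python sketch. The helper names below are illustrative, not part of this PR; the formula mirrors `TimeWindow.get_window_start_with_offset` and the sliding-window list comprehension in the diff:

   ```python
   def get_window_start_with_offset(timestamp, offset, window_size):
       # Mirrors TimeWindow.get_window_start_with_offset; the extra "+ window_size"
       # keeps the modulo non-negative when the offset is negative.
       return timestamp - (timestamp - offset + window_size) % window_size


   def assign_sliding_windows(timestamp, size, slide, offset=0):
       # Same shape as the list comprehension in SlidingEventTimeWindows.assign_windows:
       # walk backwards from the latest window start, one slide at a time, keeping
       # every window whose half-open range [start, start + size) contains the timestamp.
       last_start = get_window_start_with_offset(timestamp, offset, slide)
       return [(start, start + size)
               for start in range(last_start, timestamp - size, -slide)]


   # A timestamp of 25 with size=10, slide=5 lands in two overlapping windows.
   print(assign_sliding_windows(25, 10, 5))  # [(25, 35), (20, 30)]
   ```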

##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -517,3 +543,795 @@ def __init__(self,
         self.window_state_descriptor = window_state_descriptor
         self.internal_window_function = internal_window_function
         self.window_serializer = window_serializer
+
+
+class SessionWindowTimeGapExtractor(ABC):
+    """
+    A SessionWindowTimeGapExtractor extracts session time gaps for Dynamic Session Window
+    Assigners from the elements of the stream.
+    """
+
+    @abstractmethod
+    def extract(self, element: Any) -> int:
+        """
+        Extracts the session time gap.
+        :param element: The input element.
+        :return: The session time gap in milliseconds.
+        """
+        pass
+
+
+class EventTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to
+    which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[T, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+
+    def __init__(self, window_size: int):
+        self._window_size = window_size
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.LONG())
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        reduce_state = ctx.get_partitioned_state(self._count_state_descriptor)  # type:  ReducingState
+        reduce_state.add(1)
+        if reduce_state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted. The window is not
+            # purged, though; all elements are retained.
+            reduce_state.clear()
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
+
+class CountTumblingWindowAssigner(WindowAssigner[T, CountWindow]):
+    """
+    A WindowAssigner that windows elements into fixed-size windows based on the count number
+    of the elements. Windows cannot overlap.
+    """
+
+    def __init__(self, window_size: int):
+        """
+        Windows this KeyedStream into tumbling count windows.
+        :param window_size: The size of the windows in number of elements.
+        """
+        self._window_size = window_size
+        self._count_descriptor = ValueStateDescriptor('count-assigner', Types.LONG())
+
+    @staticmethod
+    def of(window_size: int):
+        return CountTumblingWindowAssigner(window_size)
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[CountWindow]:
+        count = context.get_runtime_context().get_state(self._count_descriptor)
+        count_value = count.value()
+        if count_value is None:
+            current_count = 0
+        else:
+            current_count = count_value
+        count.update(current_count + 1)
+        result = [CountWindow(current_count // self._window_size)]
+        return result
+
+    def get_default_trigger(self, env) -> Trigger[T, CountWindow]:
+        return CountTrigger(self._window_size)
+
+    def get_window_serializer(self) -> TypeSerializer[CountWindow]:
+        return CountWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self) -> str:
+        return "CountTumblingWindowAssigner(%s)" % self._window_size
+
+
+class CountSlidingWindowAssigner(WindowAssigner[T, CountWindow]):
+    """
+    A WindowAssigner that windows elements into sliding windows based on the count number of
+    the elements. Windows can possibly overlap.
+    """
+
+    def __init__(self, window_size: int, window_slide: int):
+        """
+        Windows this KeyedStream into sliding count windows.
+        :param window_size: The size of the windows in number of elements.
+        :param window_slide: The slide interval in number of elements.
+        """
+        self._window_size = window_size
+        self._window_slide = window_slide
+        self._count_descriptor = ValueStateDescriptor('slide-count-assigner', Types.LONG())
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[CountWindow]:
+        count = context.get_runtime_context().get_state(self._count_descriptor)  # type: ValueState
+        count_value = count.value()
+        if count_value is None:
+            current_count = 0
+        else:
+            current_count = count_value
+        count.update(current_count + 1)
+        last_id = current_count // self._window_slide
+        last_start = last_id * self._window_slide
+        last_end = last_start + self._window_size - 1
+        windows = []
+        while last_id >= 0 and last_start <= current_count <= last_end:
+            windows.append(CountWindow(last_id))
+            last_id -= 1
+            last_start -= self._window_slide
+            last_end -= self._window_slide
+        return windows
+
+    def get_default_trigger(self, env) -> Trigger[T, CountWindow]:
+        return CountTrigger(self._window_size)
+
+    def get_window_serializer(self) -> TypeSerializer[CountWindow]:
+        return CountWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self):
+        return "CountSlidingWindowAssigner(%s, %s)" % (self._window_size, self._window_slide)
+
+
+class TumblingProcessingTimeWindows(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the current system time of
+    the machine the operation is running on. Windows cannot overlap.
+
+    For example, in order to window into windows of 1 minute, shifted by an offset of
+    10 seconds:
+    ::
+    >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+    >>> .window(TumblingProcessingTimeWindows.of(Time.minutes(1), Time.seconds(10)))
+    """
+    def __init__(self, size: int, offset: int):
+        if abs(offset) >= size:
+            raise Exception("TumblingProcessingTimeWindows parameters must satisfy "
+                            "abs(offset) < size")
+
+        self._size = size
+        self._offset = offset
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        current_processing_time = context.get_current_processing_time()
+        start = TimeWindow.get_window_start_with_offset(current_processing_time, self._offset,
+                                                        self._size)
+        return [TimeWindow(start, start + self._size)]
+
+    def get_default_trigger(self, env) -> Trigger[T, TimeWindow]:
+        return ProcessingTimeTrigger()
+
+    @staticmethod
+    def of(size: Time, offset=None):
+        """
+        Creates a new TumblingProcessingTimeWindows WindowAssigner that assigns
+        elements to time windows based on the current processing time and offset.
+
+        For example, if you want to window a stream by hour, but the window begins at the 15th
+        minute of each hour, you can use of(Time.hours(1), Time.minutes(15)); you will then get
+        time windows that start at 0:15:00, 1:15:00, 2:15:00, etc.
+
+        If you live somewhere that does not use UTC±00:00 time, such as China, which uses
+        UTC+08:00, and you want a time window with a size of one day that begins at every
+        00:00:00 of local time, you may use of(Time.days(1), Time.hours(-8)). The offset
+        parameter is Time.hours(-8) since UTC+08:00 is 8 hours ahead of UTC.
+
+        :param size: The size of the generated windows.
+        :param offset: The offset which window start would be shifted by.
+        :return: The time policy.
+        """
+        if offset is None:
+            return TumblingProcessingTimeWindows(size.to_milliseconds(), 0)
+        else:
+            return TumblingProcessingTimeWindows(size.to_milliseconds(), offset.to_milliseconds())
+
+    def get_window_serializer(self) -> TypeSerializer[TimeWindow]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self):
+        return "TumblingProcessingTimeWindows(%s, %s)" % (self._size, self._offset)
+
+
+class TumblingEventTimeWindows(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the timestamp of the
+    elements. Windows cannot overlap.
+
+    For example, in order to window into windows of 1 minute:
+    ::
+    >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+    >>> .window(TumblingEventTimeWindows.of(Time.minutes(1)))
+    """
+
+    def __init__(self, size: int, offset: int):
+        if abs(offset) >= size:
+            raise Exception("TumblingEventTimeWindows parameters must satisfy abs(offset) < size")
+
+        self._size = size
+        self._offset = offset
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        if timestamp > MIN_LONG_VALUE:
+            start = TimeWindow.get_window_start_with_offset(timestamp, self._offset, self._size)
+            return [TimeWindow(start, start + self._size)]
+        else:
+            raise Exception("Record has jvm.java.lang.Long.MIN_VALUE timestamp (= no timestamp marker). "
+                            + "Is the time characteristic set to 'ProcessingTime', "
+                            + "or did you forget to call "
+                            + "'data_stream.assign_timestamps_and_watermarks(...)'?")
+
+    def get_default_trigger(self, env) -> Trigger[T, TimeWindow]:
+        return EventTimeTrigger()
+
+    @staticmethod
+    def of(size: Time, offset=None):
+        """
+        Creates a new TumblingEventTimeWindows WindowAssigner that assigns elements
+        to time windows based on the element timestamp and offset.
+
+        :param size: The size of the generated windows.
+        :param offset: The offset which window start would be shifted by.
+        """
+        if offset is None:
+            return TumblingEventTimeWindows(size.to_milliseconds(), 0)
+        else:
+            return TumblingEventTimeWindows(size.to_milliseconds(), offset.to_milliseconds())
+
+    def get_window_serializer(self) -> TypeSerializer[TimeWindow]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return True
+
+    def __repr__(self):
+        return "TumblingEventTimeWindows(%s, %s)" % (self._size, self._offset)
+
+
+class SlidingProcessingTimeWindows(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into sliding windows based on the current system
+    time of the machine the operation is running on. Windows can possibly overlap.
+
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+    >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+    >>> .window(SlidingProcessingTimeWindows.of(Time.minutes(1), Time.seconds(10)))
+    """
+
+    def __init__(self, size: int, slide: int, offset: int):
+        if abs(offset) >= slide or size <= 0:
+            raise Exception("SlidingProcessingTimeWindows parameters must satisfy "
+                            + "abs(offset) < slide and size > 0")
+
+        self._size = size
+        self._slide = slide
+        self._offset = offset
+        self._pane_size = math.gcd(size, slide)
+        self._num_panes_per_window = size // self._pane_size
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[TimeWindow]:
+        current_processing_time = context.get_current_processing_time()
+        last_start = TimeWindow.get_window_start_with_offset(
+            current_processing_time, self._offset, self._slide)
+        windows = [TimeWindow(start, start + self._size)
+                   for start in range(last_start,
+                                      current_processing_time - self._size, -self._slide)]
+        return windows
+
+    def get_default_trigger(self, env) -> Trigger[T, TimeWindow]:
+        return ProcessingTimeTrigger()
+
+    @staticmethod
+    def of(size: Time, slide: Time, offset=None):
+        """
+        Creates a new SlidingProcessingTimeWindows WindowAssigner that assigns
+        elements to time windows based on the current processing time and offset.
+
+        For example, if you want to window a stream by hour, sliding every hour, but the window
+        begins at the 15th minute of each hour, you can use
+        of(Time.hours(1), Time.hours(1), Time.minutes(15)); you will then get time windows that
+        start at 0:15:00, 1:15:00, 2:15:00, etc.
+
+        If you live somewhere that does not use UTC±00:00 time, such as China, which uses
+        UTC+08:00, and you want a time window with a size of one day that begins at every
+        00:00:00 of local time, you may use of(Time.days(1), Time.days(1), Time.hours(-8)).
+        The offset parameter is Time.hours(-8) since UTC+08:00 is 8 hours ahead of UTC.
+
+        :param size: The size of the generated windows.
+        :param slide: The slide interval of the generated windows.
+        :param offset: The offset which window start would be shifted by.
+        :return: The time policy.
+        """
+        if offset is None:
+            return SlidingProcessingTimeWindows(size.to_milliseconds(), slide.to_milliseconds(), 0)
+        else:
+            return SlidingProcessingTimeWindows(size.to_milliseconds(), slide.to_milliseconds(),
+                                                offset.to_milliseconds())
+
+    def get_window_serializer(self) -> TypeSerializer[TimeWindow]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self) -> str:
+        return "SlidingProcessingTimeWindows(%s, %s, %s)" % (self._size, self._slide, self._offset)
+
+
+class SlidingEventTimeWindows(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into sliding windows based on the timestamp of the
+    elements. Windows can possibly overlap.
+
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+    >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+    >>> .window(SlidingEventTimeWindows.of(Time.minutes(1), Time.seconds(10)))
+    """
+
+    def __init__(self, size: int, slide: int, offset: int):
+        if abs(offset) >= slide or size <= 0:
+            raise Exception("SlidingEventTimeWindows parameters must satisfy "
+                            + "abs(offset) < slide and size > 0")
+
+        self._size = size
+        self._slide = slide
+        self._offset = offset
+        self._pane_size = math.gcd(size, slide)
+        self._num_panes_per_window = size // self._pane_size
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[TimeWindow]:
+        if timestamp > MIN_LONG_VALUE:
+            last_start = TimeWindow.get_window_start_with_offset(timestamp,
+                                                                 self._offset, self._slide)
+            windows = [TimeWindow(start, start + self._size)
+                       for start in range(last_start, timestamp - self._size, -self._slide)]
+            return windows
+        else:
+            raise Exception("Record has jvm.java.lang.Long.MIN_VALUE timestamp (= no timestamp marker). "
+                            + "Is the time characteristic set to 'ProcessingTime', "
+                            + "or did you forget to call "
+                            + "'data_stream.assign_timestamps_and_watermarks(...)'?")
+
+    def get_default_trigger(self, env) -> Trigger[T, TimeWindow]:
+        return EventTimeTrigger()
+
+    @staticmethod
+    def of(size: Time, slide: Time, offset=None):
+        """
+        Creates a new SlidingEventTimeWindows WindowAssigner that assigns elements
+        to time windows based on the element timestamp and offset.
+
+        For example, if you want to window a stream by hour, sliding every hour, but the window
+        begins at the 15th minute of each hour, you can use
+        of(Time.hours(1), Time.hours(1), Time.minutes(15)); you will then get time windows that
+        start at 0:15:00, 1:15:00, 2:15:00, etc.
+
+        If you live somewhere that does not use UTC±00:00 time, such as China, which uses
+        UTC+08:00, and you want a time window with a size of one day that begins at every
+        00:00:00 of local time, you may use of(Time.days(1), Time.days(1), Time.hours(-8)).
+        The offset parameter is Time.hours(-8) since UTC+08:00 is 8 hours ahead of UTC.
+
+        :param size: The size of the generated windows.
+        :param slide: The slide interval of the generated windows.
+        :param offset: The offset which window start would be shifted by.
+        :return: The time policy.
+        """
+        if offset is None:
+            return SlidingEventTimeWindows(size.to_milliseconds(), slide.to_milliseconds(), 0)
+        else:
+            return SlidingEventTimeWindows(size.to_milliseconds(), slide.to_milliseconds(),
+                                           offset.to_milliseconds())
+
+    def get_window_serializer(self) -> TypeSerializer[TimeWindow]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return True
+
+    def __repr__(self) -> str:
+        return "SlidingEventTimeWindows(%s, %s, %s)" % (self._size, self._slide, self._offset)
+
+
+class ProcessingTimeSessionWindows(MergingWindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into sessions based on the current processing
+    time. Windows cannot overlap.
+
+    For example, in order to window into sessions with a gap of 1 minute:
+    ::
+    >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+    >>> .window(ProcessingTimeSessionWindows.with_gap(Time.minutes(1)))
+    """
+
+    def __init__(self, session_gap: int):
+        if session_gap <= 0:
+            raise Exception("ProcessingTimeSessionWindows parameters must satisfy 0 < size")
+
+        self._session_gap = session_gap
+
+    def merge_windows(self,
+                      windows: Iterable[TimeWindow],
+                      callback: 'MergingWindowAssigner.MergeCallback[TimeWindow]') -> None:
+        window_list = sorted(windows)
+        for i in range(1, len(window_list)):
+            if window_list[i - 1].end > window_list[i].start:
+                callback.merge([window_list[i - 1], window_list[i]],
+                               TimeWindow(window_list[i - 1].start, window_list[i].end))
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[TimeWindow]:
+        timestamp = context.get_current_processing_time()
+
+        return [TimeWindow(timestamp, timestamp + self._session_gap)]
+
+    def get_default_trigger(self, env) -> Trigger[T, TimeWindow]:
+        return ProcessingTimeTrigger()
+
+    @staticmethod
+    def with_gap(size: Time):
+        """
+        Creates a new SessionWindows WindowAssigner that assigns elements to sessions based on
+        the element timestamp.
+        :param size: The session timeout, i.e. the time gap between sessions
+        :return: The policy.
+        """
+        return ProcessingTimeSessionWindows(size.to_milliseconds())
+
+    @staticmethod
+    def with_dynamic_gap(session_window_time_gap_extractor: SessionWindowTimeGapExtractor):
+        """
+        Creates a new SessionWindows WindowAssigner that assigns elements to sessions based on the
+        element timestamp.
+        :param session_window_time_gap_extractor: The extractor to use to extract the time gap
+            from the input elements
+        :return: The policy.
+        """
+        return DynamicProcessingTimeSessionWindows(session_window_time_gap_extractor)
+
+    def get_window_serializer(self) -> TypeSerializer[TimeWindow]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self):
+        return "ProcessingTimeSessionWindows(%s)" % self._session_gap
+
+
+class EventTimeSessionWindows(MergingWindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into sessions based on the timestamp of the
+    elements. Windows cannot overlap.
+
+    For example, in order to window into sessions with a gap of 1 minute:
+    ::
+    >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+    >>> .window(EventTimeSessionWindows.with_gap(Time.minutes(1)))
+    """
+
+    def __init__(self, session_gap: int):
+        if session_gap <= 0:
+            raise Exception("SessionWindowAssigner parameters must satisfy 0 < size")

Review comment:
       ```suggestion
               raise Exception("EventTimeSessionWindows parameters must satisfy 0 < size")
       ```

##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -517,3 +543,795 @@ def __init__(self,
         self.window_state_descriptor = window_state_descriptor
         self.internal_window_function = internal_window_function
         self.window_serializer = window_serializer
+
+
+class SessionWindowTimeGapExtractor(ABC):
+    """
+    A {@code SessionWindowTimeGapExtractor} extracts session time gaps for Dynamic Session Window
+    Assigners from the elements of the stream.
+    """
+
+    @abstractmethod
+    def extract(self, element: Any) -> int:
+        """
+        Extracts the session time gap.
+        :param element: The input element.
+        :return: The session time gap in milliseconds.
+        """
+        pass
+
+
+class EventTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to
+    which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[T, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+
+    def __init__(self, window_size: int):
+        self._window_size = window_size
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.LONG())
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        reduce_state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        reduce_state.add(1)
+        if reduce_state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted. The window is not
+            # purged, though; all elements are retained.
+            reduce_state.clear()
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
+
+class CountTumblingWindowAssigner(WindowAssigner[T, CountWindow]):
+    """
+    A WindowAssigner that windows elements into fixed-size windows based on the count number
+    of the elements. Windows cannot overlap.
+    """
+
+    def __init__(self, window_size: int):
+        """
+        Windows this KeyedStream into tumbling count windows.
+        :param window_size: The size of the windows in number of elements.
+        """
+        self._window_size = window_size
+        self._count_descriptor = ValueStateDescriptor('count-assigner', Types.LONG())
+
+    @staticmethod
+    def of(window_size: int):
+        return CountTumblingWindowAssigner(window_size)
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[CountWindow]:
+        count = context.get_runtime_context().get_state(self._count_descriptor)
+        count_value = count.value()
+        if count_value is None:
+            current_count = 0
+        else:
+            current_count = count_value
+        count.update(current_count + 1)
+        result = [CountWindow(current_count // self._window_size)]
+        return result
+
+    def get_default_trigger(self, env) -> Trigger[T, CountWindow]:
+        return CountTrigger(self._window_size)
+
+    def get_window_serializer(self) -> TypeSerializer[CountWindow]:
+        return CountWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self) -> str:
+        return "CountTumblingWindowAssigner(%s)" % self._window_size
+
+
+class CountSlidingWindowAssigner(WindowAssigner[T, CountWindow]):
+    """
+    A WindowAssigner that windows elements into sliding windows based on the count number of
+    the elements. Windows can possibly overlap.
+    """
+
+    def __init__(self, window_size: int, window_slide: int):
+        """
+        Windows this KeyedStream into sliding count windows.
+        :param window_size: The size of the windows in number of elements.
+        :param window_slide: The slide interval in number of elements.
+        """
+        self._window_size = window_size
+        self._window_slide = window_slide
+        self._count_descriptor = ValueStateDescriptor('slide-count-assigner', Types.LONG())
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[CountWindow]:
+        count = context.get_runtime_context().get_state(self._count_descriptor)  # type: ValueState
+        count_value = count.value()
+        if count_value is None:
+            current_count = 0
+        else:
+            current_count = count_value
+        count.update(current_count + 1)
+        last_id = current_count // self._window_slide
+        last_start = last_id * self._window_slide
+        last_end = last_start + self._window_size - 1
+        windows = []
+        while last_id >= 0 and last_start <= current_count <= last_end:
+            windows.append(CountWindow(last_id))
+            last_id -= 1
+            last_start -= self._window_slide
+            last_end -= self._window_slide
+        return windows
+
+    def get_default_trigger(self, env) -> Trigger[T, CountWindow]:
+        return CountTrigger(self._window_size)
+
+    def get_window_serializer(self) -> TypeSerializer[CountWindow]:
+        return CountWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self):
+        return "CountSlidingWindowAssigner(%s, %s)" % (self._window_size, self._window_slide)
+
+
+class TumblingProcessingTimeWindows(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the current system time of
+    the machine the operation is running on. Windows cannot overlap.
+
+    For example, in order to window into tumbling windows of 1 minute:
+    ::
+    >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+    >>> .window(TumblingProcessingTimeWindows.of(Time.minutes(1)))
+    """
+    def __init__(self, size: int, offset: int):
+        if abs(offset) >= size:
+            raise Exception("TumblingProcessingTimeWindows parameters must satisfy "
+                            "abs(offset) < size")
+
+        self._size = size
+        self._offset = offset
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        current_processing_time = context.get_current_processing_time()
+        start = TimeWindow.get_window_start_with_offset(current_processing_time, self._offset,
+                                                        self._size)
+        return [TimeWindow(start, start + self._size)]
+
+    def get_default_trigger(self, env) -> Trigger[T, TimeWindow]:
+        return ProcessingTimeTrigger()
+
+    @staticmethod
+    def of(size: Time, offset=None):
+        """
+        Creates a new {@code TumblingProcessingTimeWindows} {@link WindowAssigner} that assigns
+        elements to time windows based on the element timestamp and offset.
+
+        For example, if you want to window a stream by hour, but the window should begin at the
+        15th minute of each hour, you can use {@code of(Time.hours(1), Time.minutes(15))}; you
+        will then get time windows that start at 0:15:00, 1:15:00, 2:15:00, etc.
+
+        If instead you live somewhere that does not use UTC±00:00 time, such as China which uses
+        UTC+08:00, and you want a time window with a size of one day that begins at every
+        00:00:00 of local time, you may use {@code of(Time.days(1), Time.hours(-8))}. The offset
+        parameter is {@code Time.hours(-8)} since UTC+08:00 is 8 hours ahead of UTC time.
+
+        :param size: The size of the generated windows.
+        :param offset: The offset by which the window start would be shifted.
+        :return: The time policy.
+        """
+        if offset is None:
+            return TumblingProcessingTimeWindows(size.to_milliseconds(), 0)
+        else:
+            return TumblingProcessingTimeWindows(size.to_milliseconds(), offset.to_milliseconds())
+
+    def get_window_serializer(self) -> TypeSerializer[TimeWindow]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self):
+        return "TumblingProcessingTimeWindows(%s, %s)" % (self._size, self._offset)
+
+
+class TumblingEventTimeWindows(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the timestamp of the
+    elements. Windows cannot overlap.
+
+    For example, in order to window into windows of 1 minute:
+    ::
+    >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+    >>> .window(TumblingEventTimeWindows.of(Time.minutes(1)))
+    """
+
+    def __init__(self, size: int, offset: int):
+        if abs(offset) >= size:
+            raise Exception("TumblingEventTimeWindows parameters must satisfy abs(offset) < size")
+
+        self._size = size
+        self._offset = offset
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        if timestamp > MIN_LONG_VALUE:
+            start = TimeWindow.get_window_start_with_offset(timestamp, self._offset, self._size)
+            return [TimeWindow(start, start + self._size)]
+        else:
+            raise Exception("Record has Long.MIN_VALUE timestamp (= no timestamp marker). "
+                            + "Is the time characteristic set to 'ProcessingTime', "
+                            + "or did you forget to call "
+                            + "'data_stream.assign_timestamps_and_watermarks(...)'?")
+
+    def get_default_trigger(self, env) -> Trigger[T, TimeWindow]:
+        return EventTimeTrigger()
+
+    @staticmethod
+    def of(size: Time, offset=None):
+        """
+        Creates a new TumblingEventTimeWindows WindowAssigner that assigns elements to time
+        windows based on the element timestamp and offset.
+
+        :param size: The size of the generated windows.
+        :param offset: The offset by which the window start would be shifted.
+        """
+        if offset is None:
+            return TumblingEventTimeWindows(size.to_milliseconds(), 0)
+        else:
+            return TumblingEventTimeWindows(size.to_milliseconds(), offset.to_milliseconds())
+
+    def get_window_serializer(self) -> TypeSerializer[TimeWindow]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return True
+
+    def __repr__(self):
+        return "TumblingEventTimeWindows(%s, %s)" % (self._size, self._offset)
+
+
+class SlidingProcessingTimeWindows(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into sliding windows based on the current system
+    time of the machine the operation is running on. Windows can possibly overlap.
+
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+    >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+    >>> .window(SlidingProcessingTimeWindows.of(Time.minutes(1), Time.seconds(10)))
+    """
+
+    def __init__(self, size: int, slide: int, offset: int):
+        if abs(offset) >= slide or size <= 0:
+            raise Exception("SlidingProcessingTimeWindows parameters must satisfy "
+                            + "abs(offset) < slide and size > 0")
+
+        self._size = size
+        self._slide = slide
+        self._offset = offset
+        self._pane_size = math.gcd(size, slide)
+        self._num_panes_per_window = size // self._pane_size
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[TimeWindow]:
+        current_processing_time = context.get_current_processing_time()
+        last_start = TimeWindow.get_window_start_with_offset(
+            current_processing_time, self._offset, self._slide)
+        windows = [TimeWindow(start, start + self._size)
+                   for start in range(last_start,
+                                      current_processing_time - self._size, -self._slide)]
+        return windows
+
+    def get_default_trigger(self, env) -> Trigger[T, TimeWindow]:
+        return ProcessingTimeTrigger()
+
+    @staticmethod
+    def of(size: Time, slide: Time, offset=None):
+        """
+        Creates a new {@code SlidingProcessingTimeWindows} {@link WindowAssigner} that assigns
+        elements to time windows based on the element timestamp and offset.
+
+        For example, if you want to window a stream by hour, but the window should begin at the
+        15th minute of each hour, you can use {@code of(Time.hours(1), Time.minutes(15))}; you
+        will then get time windows that start at 0:15:00, 1:15:00, 2:15:00, etc.
+
+        If instead you live somewhere that does not use UTC±00:00 time, such as China which uses
+        UTC+08:00, and you want a time window with a size of one day that begins at every
+        00:00:00 of local time, you may use {@code of(Time.days(1), Time.hours(-8))}. The offset
+        parameter is {@code Time.hours(-8)} since UTC+08:00 is 8 hours ahead of UTC time.
+
+        :param size: The size of the generated windows.
+        :param slide: The slide interval of the generated windows.
+        :param offset: The offset by which the window start would be shifted.
+        :return: The time policy.
+        """
+        if offset is None:
+            return SlidingProcessingTimeWindows(size.to_milliseconds(), slide.to_milliseconds(), 0)
+        else:
+            return SlidingProcessingTimeWindows(size.to_milliseconds(), slide.to_milliseconds(),
+                                                offset.to_milliseconds())
+
+    def get_window_serializer(self) -> TypeSerializer[TimeWindow]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self) -> str:
+        return "SlidingProcessingTimeWindows(%s, %s, %s)" % (self._size, self._slide, self._offset)
+
+
+class SlidingEventTimeWindows(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into sliding windows based on the timestamp of the
+    elements. Windows can possibly overlap.
+
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+    >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+    >>> .window(SlidingEventTimeWindows.of(Time.minutes(1), Time.seconds(10)))
+    """
+
+    def __init__(self, size: int, slide: int, offset: int):
+        if abs(offset) >= slide or size <= 0:
+            raise Exception("SlidingEventTimeWindows parameters must satisfy "
+                            + "abs(offset) < slide and size > 0")
+
+        self._size = size
+        self._slide = slide
+        self._offset = offset
+        self._pane_size = math.gcd(size, slide)
+        self._num_panes_per_window = size // self._pane_size

Review comment:
       where is it used?
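For reference, the assignment logic itself only needs size, slide and offset. A plain-Python sketch of the logic in `assign_windows` (the window-start formula is assumed to match Flink's `TimeWindow.get_window_start_with_offset`) shows that neither `_pane_size` nor `_num_panes_per_window` appears anywhere in assignment:

```python
def get_window_start_with_offset(timestamp: int, offset: int, window_size: int) -> int:
    # Same formula as TimeWindow.get_window_start_with_offset in Flink.
    return timestamp - (timestamp - offset + window_size) % window_size

def assign_sliding_windows(timestamp: int, size: int, slide: int, offset: int = 0):
    # Mirror of SlidingEventTimeWindows.assign_windows above: start from the
    # most recent window and walk backwards over every window start that
    # still covers `timestamp`.
    last_start = get_window_start_with_offset(timestamp, offset, slide)
    return [(start, start + size)
            for start in range(last_start, timestamp - size, -slide)]

# A timestamp of 12 with size=10, slide=5 falls into [10, 20) and [5, 15):
print(assign_sliding_windows(12, 10, 5))  # -> [(10, 20), (5, 15)]
```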

##########
File path: flink-python/pyflink/datastream/tests/test_data_stream.py
##########
@@ -1144,6 +1110,114 @@ def __init__(self, name, num):
         expected = ['c', 'c', 'b']
         self.assert_equals_sorted(expected, results)
 
+    def test_event_time_tumbling_window(self):
+        self.env.set_parallelism(1)
+        data_stream = self.env.from_collection([
+            ('hi', 1), ('hi', 2), ('hi', 3), ('hi', 4), ('hi', 5), ('hi', 8), ('hi', 9),
+            ('hi', 15)],
+            type_info=Types.TUPLE([Types.STRING(), Types.INT()]))  # type: DataStream
+        watermark_strategy = WatermarkStrategy.for_monotonous_timestamps() \
+            .with_timestamp_assigner(SecondColumnTimestampAssigner())
+        data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            .window(TumblingEventTimeWindows.of(Time.milliseconds(5))) \
+            .process(CountWindowProcessFunction(), Types.TUPLE([Types.STRING(), Types.INT()])) \
+            .add_sink(self.test_sink)
+
+        self.env.execute('test_event_time_tumbling_window')
+        results = self.test_sink.get_results()
+        expected = ['(hi,4)', '(hi,3)', '(hi,1)']
+        self.assert_equals_sorted(expected, results)
+
+    def test_count_tumbling_window(self):
+        self.env.set_parallelism(1)
+        data_stream = self.env.from_collection([
+            (1, 'hi'), (2, 'hello'), (3, 'hi'), (4, 'hello'), (5, 'hi'), (6, 'hello'),
+            (6, 'hello')],
+            type_info=Types.TUPLE([Types.INT(), Types.STRING()]))  # type: DataStream
+        data_stream.key_by(lambda x: x[1], key_type=Types.STRING()) \
+            .count_window(3, 2) \
+            .apply(SumWindowFunction(), Types.TUPLE([Types.STRING(), Types.INT()])) \
+            .add_sink(self.test_sink)
+
+        self.env.execute('test_count_tumbling_window')
+        results = self.test_sink.get_results()
+        expected = ['(hi,9)', '(hello,12)']
+        self.assert_equals_sorted(expected, results)
+
+    def test_event_time_sliding_window(self):
+        self.env.set_parallelism(1)
+        data_stream = self.env.from_collection([
+            ('hi', 1), ('hi', 2), ('hi', 3), ('hi', 4), ('hi', 5), ('hi', 8), ('hi', 9),
+            ('hi', 15)],
+            type_info=Types.TUPLE([Types.STRING(), Types.INT()]))  # type: DataStream
+        watermark_strategy = WatermarkStrategy.for_monotonous_timestamps() \
+            .with_timestamp_assigner(SecondColumnTimestampAssigner())
+
+        data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            .window(SlidingEventTimeWindows.of(Time.milliseconds(5), Time.milliseconds(2))) \
+            .process(CountWindowProcessFunction(), Types.TUPLE([Types.STRING(), Types.INT()])) \
+            .add_sink(self.test_sink)
+
+        self.env.execute('test_event_time_sliding_window')
+        results = self.test_sink.get_results()
+        expected = ['(hi,2)', '(hi,4)', '(hi,4)', '(hi,3)', '(hi,2)', '(hi,2)', '(hi,1)', '(hi,1)']
+        self.assert_equals_sorted(expected, results)
+
+    def test_count_sliding_window(self):
+        self.env.set_parallelism(1)
+        data_stream = self.env.from_collection([
+            (1, 'hi'), (2, 'hello'), (3, 'hi'), (4, 'hello'), (5, 'hi'), (6, 'hello')],
+            type_info=Types.TUPLE([Types.INT(), Types.STRING()]))  # type: DataStream
+        data_stream.key_by(lambda x: x[1], key_type=Types.STRING()) \
+            .window(CountSlidingWindowAssigner(2, 1)) \
+            .apply(SumWindowFunction(), Types.TUPLE([Types.STRING(), Types.INT()])) \
+            .add_sink(self.test_sink)
+
+        self.env.execute('test_count_sliding_window')
+        results = self.test_sink.get_results()
+        expected = ['(hello,6)', '(hi,8)', '(hi,4)', '(hello,10)']
+        self.assert_equals_sorted(expected, results)
+
+    def test_event_time_session_window(self):
+        self.env.set_parallelism(1)
+        data_stream = self.env.from_collection([
+            ('hi', 1), ('hi', 2), ('hi', 3), ('hi', 4), ('hi', 8), ('hi', 9), ('hi', 15)],
+            type_info=Types.TUPLE([Types.STRING(), Types.INT()]))  # type: DataStream
+        watermark_strategy = WatermarkStrategy.for_monotonous_timestamps() \
+            .with_timestamp_assigner(SecondColumnTimestampAssigner())
+
+        data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            .window(EventTimeSessionWindows.with_gap(Time.seconds(5))) \
+            .process(CountWindowProcessFunction(), Types.TUPLE([Types.STRING(), Types.INT()])) \
+            .add_sink(self.test_sink)
+
+        self.env.execute('test_event_time_session_window')
+        results = self.test_sink.get_results()
+        expected = ['(hi,7)']

Review comment:
       shouldn't the expected result be: (hi,6), (hi,1)?

##########
File path: flink-python/pyflink/datastream/tests/test_data_stream.py
##########
@@ -1144,6 +1110,114 @@ def __init__(self, name, num):
         expected = ['c', 'c', 'b']
         self.assert_equals_sorted(expected, results)
 
+    def test_event_time_tumbling_window(self):
+        self.env.set_parallelism(1)
+        data_stream = self.env.from_collection([
+            ('hi', 1), ('hi', 2), ('hi', 3), ('hi', 4), ('hi', 5), ('hi', 8), ('hi', 9),
+            ('hi', 15)],
+            type_info=Types.TUPLE([Types.STRING(), Types.INT()]))  # type: DataStream
+        watermark_strategy = WatermarkStrategy.for_monotonous_timestamps() \
+            .with_timestamp_assigner(SecondColumnTimestampAssigner())
+        data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            .window(TumblingEventTimeWindows.of(Time.milliseconds(5))) \
+            .process(CountWindowProcessFunction(), Types.TUPLE([Types.STRING(), Types.INT()])) \
+            .add_sink(self.test_sink)
+
+        self.env.execute('test_event_time_tumbling_window')
+        results = self.test_sink.get_results()
+        expected = ['(hi,4)', '(hi,3)', '(hi,1)']
+        self.assert_equals_sorted(expected, results)
+
+    def test_count_tumbling_window(self):
+        self.env.set_parallelism(1)
+        data_stream = self.env.from_collection([
+            (1, 'hi'), (2, 'hello'), (3, 'hi'), (4, 'hello'), (5, 'hi'), (6, 'hello'),
+            (6, 'hello')],
+            type_info=Types.TUPLE([Types.INT(), Types.STRING()]))  # type: DataStream
+        data_stream.key_by(lambda x: x[1], key_type=Types.STRING()) \
+            .count_window(3, 2) \

Review comment:
       Actually, this is a sliding window; however, the test case is named `test_count_tumbling_window`.
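Agreed. With size 3 and slide 2, a plain-Python mirror of `CountSlidingWindowAssigner.assign_windows` shows that the first three elements of each key all land in count window 0, which is why the expected values are the sums of each key's first three elements:

```python
def assign_count_sliding(element_index: int, size: int, slide: int):
    # Mirror of CountSlidingWindowAssigner.assign_windows: given the
    # zero-based index of an element within its key, return the ids of
    # the sliding count windows that contain it.
    last_id = element_index // slide
    last_start = last_id * slide
    windows = []
    while last_id >= 0 and last_start <= element_index < last_start + size:
        windows.append(last_id)
        last_id -= 1
        last_start -= slide
    return windows

# count_window(3, 2): elements 0..2 of a key all belong to window 0, so the
# first window to reach 3 elements sums the key's first three values.
print([assign_count_sliding(i, 3, 2) for i in range(5)])
# -> [[0], [0], [1, 0], [1], [2, 1]]
```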

##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -517,3 +543,795 @@ def __init__(self,
         self.window_state_descriptor = window_state_descriptor
         self.internal_window_function = internal_window_function
         self.window_serializer = window_serializer
+
+
+class SessionWindowTimeGapExtractor(ABC):
+    """
+    A {@code SessionWindowTimeGapExtractor} extracts session time gaps for Dynamic Session Window
+    Assigners from the elements of the stream.
+    """
+
+    @abstractmethod
+    def extract(self, element: Any) -> int:
+        """
+        Extracts the session time gap.
+        :param element: The input element.
+        :return: The session time gap in milliseconds.
+        """
+        pass
+
+
+class EventTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to
+    which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[T, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+
+    def __init__(self, window_size: int):
+        self._window_size = window_size
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.LONG())
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        reduce_state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        reduce_state.add(1)
+        if reduce_state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted. The window is
+            # not purged, though: all elements are retained.
+            reduce_state.clear()
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
+
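For reviewers following the CountTrigger logic above: a minimal standalone sketch (plain Python, not part of this PR) of the FIRE/CONTINUE sequence it produces, assuming a single pane and the reducing counter behaving as written:

```python
# Hypothetical re-implementation of the CountTrigger decision sequence:
# the counter is incremented per element and reset to zero on each FIRE,
# while the window contents themselves are retained (FIRE, not FIRE_AND_PURGE).
def simulate_count_trigger(num_elements: int, window_size: int):
    count = 0
    results = []
    for _ in range(num_elements):
        count += 1
        if count >= window_size:
            count = 0
            results.append("FIRE")
        else:
            results.append("CONTINUE")
    return results

# With a window size of 3, every third element fires the window.
print(simulate_count_trigger(5, 3))
```

Because the counter is cleared on FIRE rather than the window being purged, the same elements can be re-evaluated on the next fire.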
+class CountTumblingWindowAssigner(WindowAssigner[T, CountWindow]):
+    """
+    A WindowAssigner that windows elements into fixed-size windows based on the number of
+    elements. Windows cannot overlap.
+    """
+
+    def __init__(self, window_size: int):
+        """
+        Windows this KeyedStream into tumbling count windows.
+        :param window_size: The size of the windows in number of elements.
+        """
+        self._window_size = window_size
+        self._count_descriptor = ValueStateDescriptor('count-assigner', Types.LONG())
+
+    @staticmethod
+    def of(window_size: int):
+        return CountTumblingWindowAssigner(window_size)
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[CountWindow]:
+        count = context.get_runtime_context().get_state(self._count_descriptor)
+        count_value = count.value()
+        if count_value is None:
+            current_count = 0
+        else:
+            current_count = count_value
+        count.update(current_count + 1)
+        result = [CountWindow(current_count // self._window_size)]
+        return result
+
+    def get_default_trigger(self, env) -> Trigger[T, CountWindow]:
+        return CountTrigger(self._window_size)
+
+    def get_window_serializer(self) -> TypeSerializer[CountWindow]:
+        return CountWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self) -> str:
+        return "CountTumblingWindowAssigner(%s)" % self._window_size
+
+
+class CountSlidingWindowAssigner(WindowAssigner[T, CountWindow]):
+    """
+    A WindowAssigner that windows elements into sliding windows based on the number of
+    elements. Windows can overlap.
+    """
+
+    def __init__(self, window_size: int, window_slide: int):
+        """
+        Windows this KeyedStream into sliding count windows.
+        :param window_size: The size of the windows in number of elements.
+        :param window_slide: The slide interval in number of elements.
+        """
+        self._window_size = window_size
+        self._window_slide = window_slide
+        self._count_descriptor = ValueStateDescriptor('slide-count-assigner', Types.LONG())
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[CountWindow]:
+        count = context.get_runtime_context().get_state(self._count_descriptor)  # type: ValueState
+        count_value = count.value()
+        if count_value is None:
+            current_count = 0
+        else:
+            current_count = count_value
+        count.update(current_count + 1)
+        last_id = current_count // self._window_slide
+        last_start = last_id * self._window_slide
+        last_end = last_start + self._window_size - 1
+        windows = []
+        while last_id >= 0 and last_start <= current_count <= last_end:
+            windows.append(CountWindow(last_id))
+            last_id -= 1
+            last_start -= self._window_slide
+            last_end -= self._window_slide
+        return windows
+
+    def get_default_trigger(self, env) -> Trigger[T, CountWindow]:
+        return CountTrigger(self._window_size)
+
+    def get_window_serializer(self) -> TypeSerializer[CountWindow]:
+        return CountWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self):
+        return "CountSlidingWindowAssigner(%s, %s)" % (self._window_size, self._window_slide)
+
+
+class TumblingProcessingTimeWindows(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the current system time of
+    the machine the operation is running on. Windows cannot overlap.
+
+    For example, in order to window into tumbling windows of 1 minute, shifted by an
+    offset of 10 seconds:
+    ::
+    >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+    >>> .window(TumblingProcessingTimeWindows.of(Time.minutes(1), Time.seconds(10)))
+    """
+    def __init__(self, size: int, offset: int):
+        if abs(offset) >= size:
+            raise Exception("TumblingProcessingTimeWindows parameters must satisfy "
+                            "abs(offset) < size")
+
+        self._size = size
+        self._offset = offset
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        current_processing_time = context.get_current_processing_time()
+        start = TimeWindow.get_window_start_with_offset(current_processing_time, self._offset,
+                                                        self._size)
+        return [TimeWindow(start, start + self._size)]
+
+    def get_default_trigger(self, env) -> Trigger[T, TimeWindow]:
+        return ProcessingTimeTrigger()
+
+    @staticmethod
+    def of(size: Time, offset=None):
+        """
+        Creates a new TumblingProcessingTimeWindows WindowAssigner that assigns
+        elements to time windows based on the current processing time and offset.
+
+        For example, if you want to window a stream by hour, but the windows should begin at
+        the 15th minute of each hour, you can use of(Time.hours(1), Time.minutes(15)); you
+        will then get time windows that start at 0:15:00, 1:15:00, 2:15:00, etc.
+
+        If, instead, you live somewhere that does not use UTC±00:00 time, such as China,
+        which uses UTC+08:00, and you want a time window with a size of one day that begins
+        at every 00:00:00 of local time, you can use of(Time.days(1), Time.hours(-8)). The
+        offset parameter is Time.hours(-8) since UTC+08:00 is 8 hours ahead of UTC time.
+
+        :param size: The size of the generated windows.
+        :param offset: The offset by which the window start is shifted.
+        :return: The time policy.
+        """
+        if offset is None:
+            return TumblingProcessingTimeWindows(size.to_milliseconds(), 0)
+        else:
+            return TumblingProcessingTimeWindows(size.to_milliseconds(), offset.to_milliseconds())
+
+    def get_window_serializer(self) -> TypeSerializer[TimeWindow]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self):
+        return "TumblingProcessingTimeWindows(%s, %s)" % (self._size, self._offset)
+
+
+class TumblingEventTimeWindows(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the timestamp of the
+    elements. Windows cannot overlap.
+
+    For example, in order to window into windows of 1 minute:
+    ::
+    >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+    >>> .window(TumblingEventTimeWindows.of(Time.minutes(1)))
+    """
+
+    def __init__(self, size: int, offset: int):
+        if abs(offset) >= size:
+            raise Exception("TumblingEventTimeWindows parameters must satisfy abs(offset) < size")
+
+        self._size = size
+        self._offset = offset
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        if timestamp > MIN_LONG_VALUE:
+            start = TimeWindow.get_window_start_with_offset(timestamp, self._offset, self._size)
+            return [TimeWindow(start, start + self._size)]
+        else:
+            raise Exception("Record has Long.MIN_VALUE timestamp (= no timestamp marker). "
+                            + "Is the time characteristic set to 'ProcessingTime', "
+                            + "or did you forget to call "
+                            + "'data_stream.assign_timestamps_and_watermarks(...)'?")
+
+    def get_default_trigger(self, env) -> Trigger[T, TimeWindow]:
+        return EventTimeTrigger()
+
+    @staticmethod
+    def of(size: Time, offset=None):

Review comment:
       ditto

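As a sanity check on the offset semantics described in the `of()` docstrings above, here is a standalone re-derivation (a hypothetical helper mirroring what `TimeWindow.get_window_start_with_offset` computes; all values in milliseconds):

```python
def window_start_with_offset(timestamp: int, offset: int, window_size: int) -> int:
    # Align the window start down to the window grid, shifted by `offset`.
    return timestamp - (timestamp - offset + window_size) % window_size

HOUR = 60 * 60 * 1000
DAY = 24 * HOUR

# of(Time.hours(1), Time.minutes(15)): an event at 02:20:00 lands in the
# window starting at 02:15:00.
start = window_start_with_offset(2 * HOUR + 20 * 60 * 1000, 15 * 60 * 1000, HOUR)

# of(Time.days(1), Time.hours(-8)): UTC midnight is 08:00 local time in
# UTC+08:00, so the enclosing one-day window starts at 16:00 UTC of the
# previous day (= local midnight).
day_start = window_start_with_offset(DAY, -8 * HOUR, DAY)
```

Python's `%` already yields a non-negative remainder for a positive divisor, so no extra sign handling is needed for negative offsets here.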
##########
File path: flink-python/pyflink/datastream/tests/test_data_stream.py
##########
@@ -1090,8 +1056,8 @@ def test_reduce(self):
         ds = self.env.from_collection([(1, 'a'), (2, 'a'), (3, 'a'), (4, 'b')],
                                       type_info=Types.ROW([Types.INT(), Types.STRING()]))
         ds.key_by(lambda a: a[1]) \
-          .reduce(lambda a, b: Row(a[0] + b[0], b[1])) \

Review comment:
       revert

##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -517,3 +543,795 @@ def __init__(self,
         self.window_state_descriptor = window_state_descriptor
         self.internal_window_function = internal_window_function
         self.window_serializer = window_serializer
+
+
+class SessionWindowTimeGapExtractor(ABC):
+    """
+    A SessionWindowTimeGapExtractor extracts session time gaps for dynamic session window
+    assigners from the elements it is given.
+    """
+
+    @abstractmethod
+    def extract(self, element: Any) -> int:
+        """
+        Extracts the session time gap.
+        :param element: The input element.
+        :return: The session time gap in milliseconds.
+        """
+        pass
+
+
+class EventTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to
+    which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[T, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+
+    def __init__(self, window_size: int):
+        self._window_size = window_size
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.LONG())
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        reduce_state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        reduce_state.add(1)
+        if reduce_state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted. The window is
+            # not purged, though: all elements are retained.
+            reduce_state.clear()
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
+
+class CountTumblingWindowAssigner(WindowAssigner[T, CountWindow]):
+    """
+    A WindowAssigner that windows elements into fixed-size windows based on the number of
+    elements. Windows cannot overlap.
+    """
+
+    def __init__(self, window_size: int):
+        """
+        Windows this KeyedStream into tumbling count windows.
+        :param window_size: The size of the windows in number of elements.
+        """
+        self._window_size = window_size
+        self._count_descriptor = ValueStateDescriptor('count-assigner', Types.LONG())
+
+    @staticmethod
+    def of(window_size: int):
+        return CountTumblingWindowAssigner(window_size)
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[CountWindow]:
+        count = context.get_runtime_context().get_state(self._count_descriptor)
+        count_value = count.value()
+        if count_value is None:
+            current_count = 0
+        else:
+            current_count = count_value
+        count.update(current_count + 1)
+        result = [CountWindow(current_count // self._window_size)]
+        return result
+
+    def get_default_trigger(self, env) -> Trigger[T, CountWindow]:
+        return CountTrigger(self._window_size)
+
+    def get_window_serializer(self) -> TypeSerializer[CountWindow]:
+        return CountWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self) -> str:
+        return "CountTumblingWindowAssigner(%s)" % self._window_size
+
+
+class CountSlidingWindowAssigner(WindowAssigner[T, CountWindow]):
+    """
+    A WindowAssigner that windows elements into sliding windows based on the number of
+    elements. Windows can overlap.
+    """
+
+    def __init__(self, window_size: int, window_slide: int):
+        """
+        Windows this KeyedStream into sliding count windows.
+        :param window_size: The size of the windows in number of elements.
+        :param window_slide: The slide interval in number of elements.
+        """
+        self._window_size = window_size
+        self._window_slide = window_slide
+        self._count_descriptor = ValueStateDescriptor('slide-count-assigner', Types.LONG())
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[CountWindow]:
+        count = context.get_runtime_context().get_state(self._count_descriptor)  # type: ValueState
+        count_value = count.value()
+        if count_value is None:
+            current_count = 0
+        else:
+            current_count = count_value
+        count.update(current_count + 1)
+        last_id = current_count // self._window_slide
+        last_start = last_id * self._window_slide
+        last_end = last_start + self._window_size - 1
+        windows = []
+        while last_id >= 0 and last_start <= current_count <= last_end:
+            windows.append(CountWindow(last_id))
+            last_id -= 1
+            last_start -= self._window_slide
+            last_end -= self._window_slide
+        return windows
+
+    def get_default_trigger(self, env) -> Trigger[T, CountWindow]:
+        return CountTrigger(self._window_size)
+
+    def get_window_serializer(self) -> TypeSerializer[CountWindow]:
+        return CountWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self):
+        return "CountSlidingWindowAssigner(%s, %s)" % (self._window_size, self._window_slide)
+
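To illustrate the loop in `CountSlidingWindowAssigner.assign_windows` above, a hypothetical standalone version of the same window-id arithmetic (not part of this PR):

```python
def sliding_count_window_ids(current_count: int, size: int, slide: int):
    # The element with zero-based index `current_count` belongs to every
    # window id whose element range [id * slide, id * slide + size - 1]
    # covers it; ids are emitted newest-first, as in the assigner.
    last_id = current_count // slide
    last_start = last_id * slide
    last_end = last_start + size - 1
    ids = []
    while last_id >= 0 and last_start <= current_count <= last_end:
        ids.append(last_id)
        last_id -= 1
        last_start -= slide
        last_end -= slide
    return ids

# size=4, slide=2: the 6th element (index 5) falls into windows 2 and 1.
print(sliding_count_window_ids(5, size=4, slide=2))
```

Each element lands in at most `size // slide` windows, which matches the usual sliding-window overlap factor.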
+
+class TumblingProcessingTimeWindows(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the current system time of
+    the machine the operation is running on. Windows cannot overlap.
+
+    For example, in order to window into tumbling windows of 1 minute, shifted by an
+    offset of 10 seconds:
+    ::
+    >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+    >>> .window(TumblingProcessingTimeWindows.of(Time.minutes(1), Time.seconds(10)))
+    """
+    def __init__(self, size: int, offset: int):
+        if abs(offset) >= size:
+            raise Exception("TumblingProcessingTimeWindows parameters must satisfy "
+                            "abs(offset) < size")
+
+        self._size = size
+        self._offset = offset
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        current_processing_time = context.get_current_processing_time()
+        start = TimeWindow.get_window_start_with_offset(current_processing_time, self._offset,
+                                                        self._size)
+        return [TimeWindow(start, start + self._size)]
+
+    def get_default_trigger(self, env) -> Trigger[T, TimeWindow]:
+        return ProcessingTimeTrigger()
+
+    @staticmethod
+    def of(size: Time, offset=None):
+        """
+        Creates a new TumblingProcessingTimeWindows WindowAssigner that assigns
+        elements to time windows based on the current processing time and offset.
+
+        For example, if you want to window a stream by hour, but the windows should begin at
+        the 15th minute of each hour, you can use of(Time.hours(1), Time.minutes(15)); you
+        will then get time windows that start at 0:15:00, 1:15:00, 2:15:00, etc.
+
+        If, instead, you live somewhere that does not use UTC±00:00 time, such as China,
+        which uses UTC+08:00, and you want a time window with a size of one day that begins
+        at every 00:00:00 of local time, you can use of(Time.days(1), Time.hours(-8)). The
+        offset parameter is Time.hours(-8) since UTC+08:00 is 8 hours ahead of UTC time.
+
+        :param size: The size of the generated windows.
+        :param offset: The offset by which the window start is shifted.
+        :return: The time policy.
+        """
+        if offset is None:
+            return TumblingProcessingTimeWindows(size.to_milliseconds(), 0)
+        else:
+            return TumblingProcessingTimeWindows(size.to_milliseconds(), offset.to_milliseconds())
+
+    def get_window_serializer(self) -> TypeSerializer[TimeWindow]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self):
+        return "TumblingProcessingTimeWindows(%s, %s)" % (self._size, self._offset)
+
+
+class TumblingEventTimeWindows(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the timestamp of the
+    elements. Windows cannot overlap.
+
+    For example, in order to window into windows of 1 minute:
+    ::
+    >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+    >>> .window(TumblingEventTimeWindows.of(Time.minutes(1)))
+    """
+
+    def __init__(self, size: int, offset: int):
+        if abs(offset) >= size:
+            raise Exception("TumblingEventTimeWindows parameters must satisfy abs(offset) < size")
+
+        self._size = size
+        self._offset = offset
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        if timestamp > MIN_LONG_VALUE:
+            start = TimeWindow.get_window_start_with_offset(timestamp, self._offset, self._size)
+            return [TimeWindow(start, start + self._size)]
+        else:
+            raise Exception("Record has Long.MIN_VALUE timestamp (= no timestamp marker). "
+                            + "Is the time characteristic set to 'ProcessingTime', "
+                            + "or did you forget to call "
+                            + "'data_stream.assign_timestamps_and_watermarks(...)'?")
+
+    def get_default_trigger(self, env) -> Trigger[T, TimeWindow]:
+        return EventTimeTrigger()
+
+    @staticmethod
+    def of(size: Time, offset=None):
+        """
+        Creates a new TumblingEventTimeWindows WindowAssigner that assigns elements
+        to time windows based on the element timestamp and offset.
+
+        :param size: The size of the generated windows.
+        :param offset: The offset by which the window start is shifted.
+        """
+        if offset is None:
+            return TumblingEventTimeWindows(size.to_milliseconds(), 0)
+        else:
+            return TumblingEventTimeWindows(size.to_milliseconds(), offset.to_milliseconds())
+
+    def get_window_serializer(self) -> TypeSerializer[TimeWindow]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return True
+
+    def __repr__(self):
+        return "TumblingEventTimeWindows(%s, %s)" % (self._size, self._offset)
+
+
+class SlidingProcessingTimeWindows(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into sliding windows based on the current system
+    time of the machine the operation is running on. Windows can possibly overlap.
+
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+    >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+    >>> .window(SlidingProcessingTimeWindows.of(Time.minutes(1), Time.seconds(10)))
+    """
+
+    def __init__(self, size: int, slide: int, offset: int):
+        if abs(offset) >= slide or size <= 0:
+            raise Exception("SlidingProcessingTimeWindows parameters must satisfy "
+                            + "abs(offset) < slide and size > 0")
+
+        self._size = size
+        self._slide = slide
+        self._offset = offset
+        self._pane_size = math.gcd(size, slide)
+        self._num_panes_per_window = size // self._pane_size
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[TimeWindow]:
+        current_processing_time = context.get_current_processing_time()
+        last_start = TimeWindow.get_window_start_with_offset(
+            current_processing_time, self._offset, self._slide)
+        windows = [TimeWindow(start, start + self._size)
+                   for start in range(last_start,
+                                      current_processing_time - self._size, -self._slide)]
+        return windows
+
+    def get_default_trigger(self, env) -> Trigger[T, TimeWindow]:
+        return ProcessingTimeTrigger()
+
+    @staticmethod
+    def of(size: Time, slide: Time, offset=None):
+        """
+        Creates a new SlidingProcessingTimeWindows WindowAssigner that assigns
+        elements to time windows based on the current processing time and offset.
+
+        For example, if you want to window a stream by hour, but the windows should begin at
+        the 15th minute of each hour, you can pass Time.minutes(15) as the offset; you will
+        then get time windows that start at 0:15:00, 1:15:00, 2:15:00, etc.
+
+        If, instead, you live somewhere that does not use UTC±00:00 time, such as China,
+        which uses UTC+08:00, and you want a time window with a size of one day that begins
+        at every 00:00:00 of local time, you can pass Time.hours(-8) as the offset, since
+        UTC+08:00 is 8 hours ahead of UTC time.
+
+        :param size: The size of the generated windows.
+        :param slide: The slide interval of the generated windows.
+        :param offset: The offset by which the window start is shifted.
+        :return: The time policy.
+        """
+        if offset is None:
+            return SlidingProcessingTimeWindows(size.to_milliseconds(), slide.to_milliseconds(), 0)
+        else:
+            return SlidingProcessingTimeWindows(size.to_milliseconds(), slide.to_milliseconds(),
+                                                offset.to_milliseconds())
+
+    def get_window_serializer(self) -> TypeSerializer[TimeWindow]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self) -> str:
+        return "SlidingProcessingTimeWindows(%s, %s, %s)" % (self._size, self._slide, self._offset)
+
+
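The `range()` expression in the two sliding time assigners above enumerates every window that contains the timestamp; a quick hypothetical check of that arithmetic (not part of this PR):

```python
def sliding_time_windows(timestamp: int, size: int, slide: int, offset: int = 0):
    # Mirrors assign_windows: start from the most recent aligned window
    # start and step back by `slide` while the window still contains
    # `timestamp` (i.e. while start > timestamp - size).
    last_start = timestamp - (timestamp - offset + slide) % slide
    return [(start, start + size)
            for start in range(last_start, timestamp - size, -slide)]

# size=60s, slide=10s: a record at t=125s is covered by 6 overlapping windows,
# with starts from 120s down to 70s.
windows = sliding_time_windows(125_000, size=60_000, slide=10_000)
```

The exclusive stop bound `timestamp - size` is what guarantees `start + size > timestamp` for every emitted window.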
+class SlidingEventTimeWindows(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into sliding windows based on the timestamp of the
+    elements. Windows can possibly overlap.
+
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+    >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+    >>> .window(SlidingEventTimeWindows.of(Time.minutes(1), Time.seconds(10)))
+    """
+
+    def __init__(self, size: int, slide: int, offset: int):
+        if abs(offset) >= slide or size <= 0:
+            raise Exception("SlidingEventTimeWindows parameters must satisfy "
+                            + "abs(offset) < slide and size > 0")
+
+        self._size = size
+        self._slide = slide
+        self._offset = offset
+        self._pane_size = math.gcd(size, slide)
+        self._num_panes_per_window = size // self._pane_size
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[TimeWindow]:
+        if timestamp > MIN_LONG_VALUE:
+            last_start = TimeWindow.get_window_start_with_offset(timestamp,
+                                                                 self._offset, self._slide)
+            windows = [TimeWindow(start, start + self._size)
+                       for start in range(last_start, timestamp - self._size, -self._slide)]
+            return windows
+        else:
+            raise Exception("Record has Long.MIN_VALUE timestamp (= no timestamp marker). "
+                            + "Is the time characteristic set to 'ProcessingTime', "
+                              "or did you forget to call "
+                            + "'data_stream.assign_timestamps_and_watermarks(...)'?")
+
+    def get_default_trigger(self, env) -> Trigger[T, TimeWindow]:
+        return EventTimeTrigger()
+
+    @staticmethod
+    def of(size: Time, slide: Time, offset=None):
+        """
+        Creates a new SlidingEventTimeWindows WindowAssigner that assigns elements
+        to time windows based on the element timestamp and offset.
+
+        For example, if you want to window a stream by hour, but the windows should begin at
+        the 15th minute of each hour, you can pass Time.minutes(15) as the offset; you will
+        then get time windows that start at 0:15:00, 1:15:00, 2:15:00, etc.
+
+        If, instead, you live somewhere that does not use UTC±00:00 time, such as China,
+        which uses UTC+08:00, and you want a time window with a size of one day that begins
+        at every 00:00:00 of local time, you can pass Time.hours(-8) as the offset, since
+        UTC+08:00 is 8 hours ahead of UTC time.
+
+        :param size: The size of the generated windows.
+        :param slide: The slide interval of the generated windows.
+        :param offset: The offset by which the window start is shifted.
+        :return: The time policy.
+        """
+        if offset is None:
+            return SlidingEventTimeWindows(size.to_milliseconds(), slide.to_milliseconds(), 0)
+        else:
+            return SlidingEventTimeWindows(size.to_milliseconds(), slide.to_milliseconds(),
+                                           offset.to_milliseconds())
+
+    def get_window_serializer(self) -> TypeSerializer[TimeWindow]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return True
+
+    def __repr__(self) -> str:
+        return "SlidingEventTimeWindows(%s, %s, %s)" % (self._size, self._slide, self._offset)
+
+
+class ProcessingTimeSessionWindows(MergingWindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into sessions based on the current processing
+    time. Windows cannot overlap.
+
+    For example, to create session windows with a gap of 1 minute:
+    ::
+    >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+    >>> .window(ProcessingTimeSessionWindows.with_gap(Time.minutes(1)))
+    """
+
+    def __init__(self, session_gap: int):
+        if session_gap <= 0:
+            raise Exception("SessionWindowAssigner parameters must satisfy 0 < size")

Review comment:
       ```suggestion
               raise Exception("ProcessingTimeSessionWindows parameters must satisfy 0 < size")
       ```
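As background for the `assign_windows` implementations quoted in these hunks, the start-alignment arithmetic can be sketched standalone. This is a hedged sketch mirroring what `TimeWindow.get_window_start_with_offset` is expected to do (the free function below is hypothetical, not the PR's code):

```python
def get_window_start_with_offset(timestamp: int, offset: int, window_size: int) -> int:
    # Align the timestamp down to the nearest window boundary, honoring the offset.
    # Python's % already yields a non-negative remainder for a positive modulus,
    # so no separate negative-remainder branch is needed here.
    return timestamp - (timestamp - offset) % window_size

minute = 60 * 1000
hour = 60 * minute

# 1-hour windows shifted by 15 minutes: an element at 2:20:00 lands in the
# window starting at 2:15:00, matching the docstring examples above.
start = get_window_start_with_offset(2 * hour + 20 * minute, 15 * minute, hour)
# start == 2 * hour + 15 * minute
```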

##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -517,3 +543,795 @@ def __init__(self,
         self.window_state_descriptor = window_state_descriptor
         self.internal_window_function = internal_window_function
         self.window_serializer = window_serializer
+
+
+class SessionWindowTimeGapExtractor(ABC):
+    """
+    A {@code SessionWindowTimeGapExtractor} extracts session time gaps for Dynamic Session
+    Window Assigners from the stream elements it is applied to.
+    """
+
+    @abstractmethod
+    def extract(self, element: Any) -> int:
+        """
+        Extracts the session time gap.
+        :param element: The input element.
+        :return: The session time gap in milliseconds.
+        """
+        pass
+
+
+class EventTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to
+    which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[T, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+
+    def __init__(self, window_size: int):
+        self._window_size = window_size
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.LONG())
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        reduce_state = ctx.get_partitioned_state(self._count_state_descriptor)  # type:  ReducingState
+        reduce_state.add(1)
+        if reduce_state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted. The window is
+            # not purged, though: all elements are retained.
+            reduce_state.clear()
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
+
+class CountTumblingWindowAssigner(WindowAssigner[T, CountWindow]):
+    """
+    A WindowAssigner that windows elements into fixed-size windows based on the count number
+    of the elements. Windows cannot overlap.
+    """
+
+    def __init__(self, window_size: int):
+        """
+        Windows this KeyedStream into tumbling count windows.
+        :param window_size: The size of the windows in number of elements.
+        """
+        self._window_size = window_size
+        self._count_descriptor = ValueStateDescriptor('count-assigner', Types.LONG())
+
+    @staticmethod
+    def of(window_size: int):
+        return CountTumblingWindowAssigner(window_size)
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[CountWindow]:
+        count = context.get_runtime_context().get_state(self._count_descriptor)
+        count_value = count.value()
+        current_count = count_value if count_value is not None else 0
+        count.update(current_count + 1)
+        result = [CountWindow(current_count // self._window_size)]
+        return result
+
+    def get_default_trigger(self, env) -> Trigger[T, CountWindow]:
+        return CountTrigger(self._window_size)
+
+    def get_window_serializer(self) -> TypeSerializer[CountWindow]:
+        return CountWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self) -> str:
+        return "CountTumblingWindowAssigner(%s)" % self._window_size
+
+
+class CountSlidingWindowAssigner(WindowAssigner[T, CountWindow]):
+    """
+    A WindowAssigner that windows elements into sliding windows based on the count number of
+    the elements. Windows can possibly overlap.
+    """
+
+    def __init__(self, window_size: int, window_slide: int):
+        """
+        Windows this KeyedStream into sliding count windows.
+        :param window_size: The size of the windows in number of elements.
+        :param window_slide: The slide interval in number of elements.
+        """
+        self._window_size = window_size
+        self._window_slide = window_slide
+        self._count_descriptor = ValueStateDescriptor('slide-count-assigner', Types.LONG())
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[CountWindow]:
+        count = context.get_runtime_context().get_state(self._count_descriptor)  # type: ValueState
+        count_value = count.value()
+        current_count = count_value if count_value is not None else 0
+        count.update(current_count + 1)
+        last_id = current_count // self._window_slide
+        last_start = last_id * self._window_slide
+        last_end = last_start + self._window_size - 1
+        windows = []
+        while last_id >= 0 and last_start <= current_count <= last_end:
+            # The loop condition already guarantees the element lies in this window.
+            windows.append(CountWindow(last_id))
+            last_id -= 1
+            last_start -= self._window_slide
+            last_end -= self._window_slide
+        return windows
+
+    def get_default_trigger(self, env) -> Trigger[T, CountWindow]:
+        return CountTrigger(self._window_size)
+
+    def get_window_serializer(self) -> TypeSerializer[CountWindow]:
+        return CountWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self):
+        return "CountSlidingWindowAssigner(%s, %s)" % (self._window_size, self._window_slide)
+
+
+class TumblingProcessingTimeWindows(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the current system time of
+    the machine the operation is running on. Windows cannot overlap.
+
+    For example, in order to window into windows of 1 minute, with an offset of 10 seconds:
+    ::
+    >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+    >>> .window(TumblingProcessingTimeWindows.of(Time.minutes(1), Time.seconds(10)))
+    """
+    def __init__(self, size: int, offset: int):
+        if abs(offset) >= size:
+            raise Exception("TumblingProcessingTimeWindows parameters must satisfy "
+                            "abs(offset) < size")
+
+        self._size = size
+        self._offset = offset
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        current_processing_time = context.get_current_processing_time()
+        start = TimeWindow.get_window_start_with_offset(current_processing_time, self._offset,
+                                                        self._size)
+        return [TimeWindow(start, start + self._size)]
+
+    def get_default_trigger(self, env) -> Trigger[T, TimeWindow]:
+        return ProcessingTimeTrigger()
+
+    @staticmethod
+    def of(size: Time, offset=None):
+        """
+        Creates a new {@code TumblingProcessingTimeWindows} {@link WindowAssigner} that assigns
+        elements to time windows based on the element timestamp and offset.
+
+        For example, if you want to window a stream by hour, but the window should begin at the
+        15th minute of each hour, you can use {@code of(Time.hours(1), Time.minutes(15))}; you
+        will then get time windows that start at 0:15:00, 1:15:00, 2:15:00, etc.
+
+        If, on the other hand, you live in a place that does not use UTC±00:00 time, such as
+        China, which uses UTC+08:00, and you want a time window with a size of one day that
+        begins at every 00:00:00 of local time, you may use {@code of(Time.days(1),
+        Time.hours(-8))}. The offset parameter is {@code Time.hours(-8)} since UTC+08:00 is
+        8 hours ahead of UTC time.
+
+        :param size: The size of the generated windows.
+        :param offset: The offset by which the window start would be shifted.
+        :return: The time policy.
+        """
+        if offset is None:
+            return TumblingProcessingTimeWindows(size.to_milliseconds(), 0)
+        else:
+            return TumblingProcessingTimeWindows(size.to_milliseconds(), offset.to_milliseconds())
+
+    def get_window_serializer(self) -> TypeSerializer[TimeWindow]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self):
+        return "TumblingProcessingTimeWindows(%s, %s)" % (self._size, self._offset)
+
+
+class TumblingEventTimeWindows(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the timestamp of the
+    elements. Windows cannot overlap.
+
+    For example, in order to window into windows of 1 minute:
+    ::
+    >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+    >>> .window(TumblingEventTimeWindows.of(Time.minutes(1)))
+    """
+
+    def __init__(self, size: int, offset: int):
+        if abs(offset) >= size:
+            raise Exception("TumblingEventTimeWindows parameters must satisfy abs(offset) < size")
+
+        self._size = size
+        self._offset = offset
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        if timestamp > MIN_LONG_VALUE:
+            start = TimeWindow.get_window_start_with_offset(timestamp, self._offset, self._size)
+            return [TimeWindow(start, start + self._size)]
+        else:
+            raise Exception("Record has jvm.java.lang.Long.MIN_VALUE timestamp (= no timestamp marker). "
+                            + "Is the time characteristic set to 'ProcessingTime', "
+                            + "or did you forget to call "
+                            + "'data_stream.assign_timestamps_and_watermarks(...)'?")
+
+    def get_default_trigger(self, env) -> Trigger[T, TimeWindow]:
+        return EventTimeTrigger()
+
+    @staticmethod
+    def of(size: Time, offset=None):
+        """
+        Creates a new TumblingEventTimeWindows WindowAssigner that assigns elements
+        to time windows based on the element timestamp and offset.
+
+        :param size: The size of the generated windows.
+        :param offset: The offset by which the window start would be shifted.
+        """
+        if offset is None:
+            return TumblingEventTimeWindows(size.to_milliseconds(), 0)
+        else:
+            return TumblingEventTimeWindows(size.to_milliseconds(), offset.to_milliseconds())
+
+    def get_window_serializer(self) -> TypeSerializer[TimeWindow]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return True
+
+    def __repr__(self):
+        return "TumblingEventTimeWindows(%s, %s)" % (self._size, self._offset)
+
+
+class SlidingProcessingTimeWindows(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into sliding windows based on the current system
+    time of the machine the operation is running on. Windows can possibly overlap.
+
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+    >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+    >>> .window(SlidingProcessingTimeWindows.of(Time.minutes(1), Time.seconds(10)))
+    """
+
+    def __init__(self, size: int, slide: int, offset: int):
+        if abs(offset) >= slide or size <= 0:
+            raise Exception("SlidingProcessingTimeWindows parameters must satisfy "
+                            + "abs(offset) < slide and size > 0")
+
+        self._size = size
+        self._slide = slide
+        self._offset = offset
+        self._pane_size = math.gcd(size, slide)
+        self._num_panes_per_window = size // self._pane_size
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[TimeWindow]:
+        current_processing_time = context.get_current_processing_time()
+        last_start = TimeWindow.get_window_start_with_offset(
+            current_processing_time, self._offset, self._slide)
+        windows = [TimeWindow(start, start + self._size)
+                   for start in range(last_start,
+                                      current_processing_time - self._size, -self._slide)]
+        return windows
+
+    def get_default_trigger(self, env) -> Trigger[T, TimeWindow]:
+        return ProcessingTimeTrigger()
+
+    @staticmethod
+    def of(size: Time, slide: Time, offset=None):

Review comment:
       ditto
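The backwards range walk used by both sliding assigners above can be exercised in isolation. This is a standalone sketch with a plain function (not the PR's classes), assuming a zero offset by default:

```python
def assign_sliding_windows(timestamp, size, slide, offset=0):
    # Find the most recent window start covering the timestamp, then walk
    # backwards one slide at a time while the window still contains it.
    last_start = timestamp - (timestamp - offset) % slide
    return [(start, start + size)
            for start in range(last_start, timestamp - size, -slide)]

# Windows of size 10 sliding by 5: each element belongs to size // slide = 2 windows.
# An element at t=12 falls into [10, 20) and [5, 15).
print(assign_sliding_windows(12, size=10, slide=5))  # [(10, 20), (5, 15)]
```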

##########
File path: flink-python/pyflink/datastream/tests/test_data_stream.py
##########
@@ -178,8 +177,8 @@ def flat_map2(self, value):
         ds_2 = ds_1.map(lambda x: x * 2)
 
         (ds_1.connect(ds_2).flat_map(MyCoFlatMapFunction(), output_type=Types.INT())
-             .connect(ds_2).map(MyCoMapFunction(), output_type=Types.INT())

Review comment:
       revert
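Returning to window.py: the id arithmetic in `CountSlidingWindowAssigner.assign_windows` can be checked with a small standalone sketch (the function name below is made up for illustration; `current_count` is the zero-based index of the element within its key, i.e. the state value before it is incremented):

```python
def count_sliding_window_ids(current_count, window_size, window_slide):
    # An element with index current_count belongs to every window whose
    # element range [start, start + window_size - 1] covers it, walking
    # backwards from the newest window id, as the assigner does.
    last_id = current_count // window_slide
    ids = []
    while last_id >= 0:
        start = last_id * window_slide
        if not (start <= current_count <= start + window_size - 1):
            break
        ids.append(last_id)
        last_id -= 1
    return ids

# Size 3, slide 2: the 5th element (index 4) belongs to windows 2 and 1.
print(count_sliding_window_ids(4, window_size=3, window_slide=2))  # [2, 1]
```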

##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -517,3 +543,795 @@ def __init__(self,
         self.window_state_descriptor = window_state_descriptor
         self.internal_window_function = internal_window_function
         self.window_serializer = window_serializer
+
+
+class SessionWindowTimeGapExtractor(ABC):
+    """
+    A {@code SessionWindowTimeGapExtractor} extracts session time gaps for Dynamic Session
+    Window Assigners from the stream elements it is applied to.
+    """
+
+    @abstractmethod
+    def extract(self, element: Any) -> int:
+        """
+        Extracts the session time gap.
+        :param element: The input element.
+        :return: The session time gap in milliseconds.
+        """
+        pass
+
+
+class EventTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to
+    which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[T, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+
+    def __init__(self, window_size: int):
+        self._window_size = window_size
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.LONG())
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        reduce_state = ctx.get_partitioned_state(self._count_state_descriptor)  # type:  ReducingState
+        reduce_state.add(1)
+        if reduce_state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted. The window is
+            # not purged, though: all elements are retained.
+            reduce_state.clear()
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
+
+class CountTumblingWindowAssigner(WindowAssigner[T, CountWindow]):
+    """
+    A WindowAssigner that windows elements into fixed-size windows based on the count number
+    of the elements. Windows cannot overlap.
+    """
+
+    def __init__(self, window_size: int):
+        """
+        Windows this KeyedStream into tumbling count windows.
+        :param window_size: The size of the windows in number of elements.
+        """
+        self._window_size = window_size
+        self._count_descriptor = ValueStateDescriptor('count-assigner', Types.LONG())
+
+    @staticmethod
+    def of(window_size: int):
+        return CountTumblingWindowAssigner(window_size)
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[CountWindow]:
+        count = context.get_runtime_context().get_state(self._count_descriptor)
+        count_value = count.value()
+        current_count = count_value if count_value is not None else 0
+        count.update(current_count + 1)
+        result = [CountWindow(current_count // self._window_size)]
+        return result
+
+    def get_default_trigger(self, env) -> Trigger[T, CountWindow]:
+        return CountTrigger(self._window_size)
+
+    def get_window_serializer(self) -> TypeSerializer[CountWindow]:
+        return CountWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self) -> str:
+        return "CountTumblingWindowAssigner(%s)" % self._window_size
+
+
+class CountSlidingWindowAssigner(WindowAssigner[T, CountWindow]):
+    """
+    A WindowAssigner that windows elements into sliding windows based on the count number of
+    the elements. Windows can possibly overlap.
+    """
+
+    def __init__(self, window_size: int, window_slide: int):
+        """
+        Windows this KeyedStream into sliding count windows.
+        :param window_size: The size of the windows in number of elements.
+        :param window_slide: The slide interval in number of elements.
+        """
+        self._window_size = window_size
+        self._window_slide = window_slide
+        self._count_descriptor = ValueStateDescriptor('slide-count-assigner', Types.LONG())
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[CountWindow]:
+        count = context.get_runtime_context().get_state(self._count_descriptor)  # type: ValueState
+        count_value = count.value()
+        current_count = count_value if count_value is not None else 0
+        count.update(current_count + 1)
+        last_id = current_count // self._window_slide
+        last_start = last_id * self._window_slide
+        last_end = last_start + self._window_size - 1
+        windows = []
+        while last_id >= 0 and last_start <= current_count <= last_end:
+            # The loop condition already guarantees the element lies in this window.
+            windows.append(CountWindow(last_id))
+            last_id -= 1
+            last_start -= self._window_slide
+            last_end -= self._window_slide
+        return windows
+
+    def get_default_trigger(self, env) -> Trigger[T, CountWindow]:
+        return CountTrigger(self._window_size)
+
+    def get_window_serializer(self) -> TypeSerializer[CountWindow]:
+        return CountWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self):
+        return "CountSlidingWindowAssigner(%s, %s)" % (self._window_size, self._window_slide)
+
+
+class TumblingProcessingTimeWindows(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the current system time of
+    the machine the operation is running on. Windows cannot overlap.
+
+    For example, in order to window into windows of 1 minute, with an offset of 10 seconds:
+    ::
+    >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+    >>> .window(TumblingProcessingTimeWindows.of(Time.minutes(1), Time.seconds(10)))
+    """
+    def __init__(self, size: int, offset: int):
+        if abs(offset) >= size:
+            raise Exception("TumblingProcessingTimeWindows parameters must satisfy "
+                            "abs(offset) < size")
+
+        self._size = size
+        self._offset = offset
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        current_processing_time = context.get_current_processing_time()
+        start = TimeWindow.get_window_start_with_offset(current_processing_time, self._offset,
+                                                        self._size)
+        return [TimeWindow(start, start + self._size)]
+
+    def get_default_trigger(self, env) -> Trigger[T, TimeWindow]:
+        return ProcessingTimeTrigger()
+
+    @staticmethod
+    def of(size: Time, offset=None):
+        """
+        Creates a new {@code TumblingProcessingTimeWindows} {@link WindowAssigner} that assigns
+        elements to time windows based on the element timestamp and offset.
+
+        For example, if you want to window a stream by hour, but the window should begin at the
+        15th minute of each hour, you can use {@code of(Time.hours(1), Time.minutes(15))}; you
+        will then get time windows that start at 0:15:00, 1:15:00, 2:15:00, etc.
+
+        If, on the other hand, you live in a place that does not use UTC±00:00 time, such as
+        China, which uses UTC+08:00, and you want a time window with a size of one day that
+        begins at every 00:00:00 of local time, you may use {@code of(Time.days(1),
+        Time.hours(-8))}. The offset parameter is {@code Time.hours(-8)} since UTC+08:00 is
+        8 hours ahead of UTC time.
+
+        :param size: The size of the generated windows.
+        :param offset: The offset by which the window start would be shifted.
+        :return: The time policy.
+        """
+        if offset is None:
+            return TumblingProcessingTimeWindows(size.to_milliseconds(), 0)
+        else:
+            return TumblingProcessingTimeWindows(size.to_milliseconds(), offset.to_milliseconds())
+
+    def get_window_serializer(self) -> TypeSerializer[TimeWindow]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self):
+        return "TumblingProcessingTimeWindows(%s, %s)" % (self._size, self._offset)
+
+
+class TumblingEventTimeWindows(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the timestamp of the
+    elements. Windows cannot overlap.
+
+    For example, in order to window into windows of 1 minute:
+    ::
+    >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+    >>> .window(TumblingEventTimeWindows.of(Time.minutes(1)))
+    """
+
+    def __init__(self, size: int, offset: int):
+        if abs(offset) >= size:
+            raise Exception("TumblingEventTimeWindows parameters must satisfy abs(offset) < size")
+
+        self._size = size
+        self._offset = offset
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        if timestamp > MIN_LONG_VALUE:
+            start = TimeWindow.get_window_start_with_offset(timestamp, self._offset, self._size)
+            return [TimeWindow(start, start + self._size)]
+        else:
+            raise Exception("Record has jvm.java.lang.Long.MIN_VALUE timestamp (= no timestamp marker). "
+                            + "Is the time characteristic set to 'ProcessingTime', "
+                            + "or did you forget to call "
+                            + "'data_stream.assign_timestamps_and_watermarks(...)'?")
+
+    def get_default_trigger(self, env) -> Trigger[T, TimeWindow]:
+        return EventTimeTrigger()
+
+    @staticmethod
+    def of(size: Time, offset=None):
+        """
+        Creates a new TumblingEventTimeWindows WindowAssigner that assigns elements
+        to time windows based on the element timestamp, offset and a staggering offset, depending on
+        the staggering policy.
+
+        :param size The size of the generated windows.
+        :param offset The globalOffset which window start would be shifted by.
+        """
+        if offset is None:
+            return TumblingEventTimeWindows(size.to_milliseconds(), 0)
+        else:
+            return TumblingEventTimeWindows(size.to_milliseconds(), offset.to_milliseconds())
+
+    def get_window_serializer(self) -> TypeSerializer[TimeWindow]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return True
+
+    def __repr__(self):
+        return "TumblingEventTimeWindows(%s, %s)" % (self._size, self._offset)
+
+
+class SlidingProcessingTimeWindows(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into sliding windows based on the current system
+    time of the machine the operation is running on. Windows can possibly overlap.
+
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+    >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+    >>> .window(SlidingProcessingTimeWindows.of(Time.minutes(1), Time.seconds(10)))
+    """
+
+    def __init__(self, size: int, slide: int, offset: int):
+        if abs(offset) >= slide or size <= 0:
+            raise Exception("SlidingProcessingTimeWindows parameters must satisfy "
+                            + "abs(offset) < slide and size > 0")
+
+        self._size = size
+        self._slide = slide
+        self._offset = offset
+        self._pane_size = math.gcd(size, slide)
+        self._num_panes_per_window = size // self._pane_size

Review comment:
       where it's used?
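For context on the fields being asked about: a sliding assigner puts each element into every window that covers its timestamp (with size 60s and slide 10s, that is 6 windows), and `_pane_size`/`_num_panes_per_window` describe the gcd-sized panes those overlapping windows share. A hypothetical standalone sketch of the enumeration (not the PR's implementation):

```python
import math


def sliding_window_starts(timestamp: int, size: int, slide: int,
                          offset: int = 0) -> list:
    # The newest window containing `timestamp` starts at the slide-aligned
    # point at or before it; earlier windows are one slide apart, stopping
    # once a window's end would no longer reach past `timestamp`.
    last_start = timestamp - (timestamp - offset) % slide
    return list(range(last_start, timestamp - size, -slide))


starts = sliding_window_starts(25_000, 60_000, 10_000)
assert starts == [20_000, 10_000, 0, -10_000, -20_000, -30_000]
# Here the window count equals size // gcd(size, slide), i.e. the
# `_num_panes_per_window` value computed in the constructor above.
assert len(starts) == 60_000 // math.gcd(60_000, 10_000)
```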

##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -15,28 +15,51 @@
 #  See the License for the specific language governing permissions and
 # limitations under the License.
 ################################################################################
-import sys
+import math
 from abc import ABC, abstractmethod
 from enum import Enum
 from io import BytesIO
-from typing import TypeVar, Generic, Iterable, Collection
+from typing import TypeVar, Generic, Iterable, Collection, Any
 
+from pyflink.common import Time, Types
 from pyflink.common.serializer import TypeSerializer
 from pyflink.datastream.functions import RuntimeContext, InternalWindowFunction
-from pyflink.datastream.state import StateDescriptor, State
+from pyflink.datastream.state import StateDescriptor, ReducingStateDescriptor, \
+    ValueStateDescriptor, ValueState, State, ReducingState
 from pyflink.metrics import MetricGroup
 
 __all__ = ['Window',
            'TimeWindow',
            'CountWindow',
+           'TumblingProcessingTimeWindows',
+           'TumblingEventTimeWindows',
+           'SlidingProcessingTimeWindows',
+           'SlidingEventTimeWindows',
+           'ProcessingTimeSessionWindows',
+           'EventTimeSessionWindows',
+           'DynamicProcessingTimeSessionWindows',
+           'DynamicEventTimeSessionWindows',
            'WindowAssigner',
            'MergingWindowAssigner',
+           'CountTumblingWindowAssigner',
+           'CountSlidingWindowAssigner',
            'TriggerResult',
            'Trigger',
+           'EventTimeTrigger',
+           'ProcessingTimeTrigger',
+           'CountTrigger',
            'TimeWindowSerializer',
-           'CountWindowSerializer']
+           'CountWindowSerializer',
+           'SessionWindowTimeGapExtractor']
 
-MAX_LONG_VALUE = sys.maxsize
+"""
+A constant holding the maximum value a long can have, 2^63 - 1.
+"""
+MAX_LONG_VALUE = 0x7fffffffffffffff

Review comment:
       Should we also fix the MAX_LONG_VALUE/MIN_LONG_VALUE in other places?
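One reason for the change away from `sys.maxsize`: it is a CPython implementation detail (the maximum container size) and is platform-dependent, whereas Java's `Long.MAX_VALUE`/`Long.MIN_VALUE` are fixed 64-bit bounds. A quick sketch of the relationship:

```python
import sys

# Explicit 64-bit bounds match the JVM side regardless of platform.
MAX_LONG_VALUE = 0x7fffffffffffffff   # 2**63 - 1, Java Long.MAX_VALUE
MIN_LONG_VALUE = -MAX_LONG_VALUE - 1  # -2**63, Java Long.MIN_VALUE

assert MAX_LONG_VALUE == 2 ** 63 - 1
assert MIN_LONG_VALUE == -(2 ** 63)
# sys.maxsize happens to equal 2**63 - 1 on 64-bit CPython builds, but is
# smaller on 32-bit builds, so it is not a reliable stand-in.
assert sys.maxsize <= MAX_LONG_VALUE
```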




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 678546804066c142c23cdcc36e88dc442210282a Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900) 
   * 4cf2279af74f89d0f9f8b446ccd8b72bf57495cb UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 4cf2279af74f89d0f9f8b446ccd8b72bf57495cb Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32908) 
   * d178dd422b702f51ee85b41e43ec4b306c60aa8c UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * d178dd422b702f51ee85b41e43ec4b306c60aa8c Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32914) 
   * ff0bcbd841960c3e524a62c3c45f426707fb5306 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 894eb800e4045264cec298908892659466e78b7b Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32964) 
   * 770b70d23b1f989d2de869f0d92c37615743513e UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 1240d4313817b4ae0bfca21090eb6f1cbd40f78e Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684) 
   * 4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 8e46affb05226eeed6b8eb20df971445139654a8 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535) 
   * 610f23f625a4559bef6f000916a2de37a5cb3b38 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 8e46affb05226eeed6b8eb20df971445139654a8 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535) 
   * 610f23f625a4559bef6f000916a2de37a5cb3b38 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 8e46affb05226eeed6b8eb20df971445139654a8 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>








[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32897",
       "triggerID" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "678546804066c142c23cdcc36e88dc442210282a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900",
       "triggerID" : "678546804066c142c23cdcc36e88dc442210282a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32908",
       "triggerID" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 4cf2279af74f89d0f9f8b446ccd8b72bf57495cb Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32908) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32897",
       "triggerID" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "678546804066c142c23cdcc36e88dc442210282a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900",
       "triggerID" : "678546804066c142c23cdcc36e88dc442210282a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32908",
       "triggerID" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32914",
       "triggerID" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32946",
       "triggerID" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "triggerType" : "PUSH"
     }, {
       "hash" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32947",
       "triggerID" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * ff0bcbd841960c3e524a62c3c45f426707fb5306 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32946) 
   * 84846361d6db21a5ac721494ebb5badf23b274a1 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32947) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32897",
       "triggerID" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "678546804066c142c23cdcc36e88dc442210282a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900",
       "triggerID" : "678546804066c142c23cdcc36e88dc442210282a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32908",
       "triggerID" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32914",
       "triggerID" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32946",
       "triggerID" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "triggerType" : "PUSH"
     }, {
       "hash" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * ff0bcbd841960c3e524a62c3c45f426707fb5306 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32946) 
   * 84846361d6db21a5ac721494ebb5badf23b274a1 UNKNOWN
   





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32897",
       "triggerID" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "678546804066c142c23cdcc36e88dc442210282a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900",
       "triggerID" : "678546804066c142c23cdcc36e88dc442210282a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32908",
       "triggerID" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32914",
       "triggerID" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32946",
       "triggerID" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * ff0bcbd841960c3e524a62c3c45f426707fb5306 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32946) 
   





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32897",
       "triggerID" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "678546804066c142c23cdcc36e88dc442210282a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900",
       "triggerID" : "678546804066c142c23cdcc36e88dc442210282a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32908",
       "triggerID" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32914",
       "triggerID" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32946",
       "triggerID" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "triggerType" : "PUSH"
     }, {
       "hash" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32947",
       "triggerID" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "195bf4369bf1eb0cdc1727847e63ce0d6fd31f66",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "195bf4369bf1eb0cdc1727847e63ce0d6fd31f66",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 84846361d6db21a5ac721494ebb5badf23b274a1 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32947) 
   * 195bf4369bf1eb0cdc1727847e63ce0d6fd31f66 UNKNOWN
   





[GitHub] [flink] Vancior commented on a change in pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
Vancior commented on a change in pull request #18957:
URL: https://github.com/apache/flink/pull/18957#discussion_r823294596



##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -517,3 +538,721 @@ def __init__(self,
         self.window_state_descriptor = window_state_descriptor
         self.internal_window_function = internal_window_function
         self.window_serializer = window_serializer
+
+
+class EventTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to
+    which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[T, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+
+    def __init__(self, window_size: int):
+        self._window_size = window_size
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.LONG())
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        state.add(1)
+        if state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted; the window is
+            # not purged, though, and all elements are retained.
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
+
+class TumblingWindowAssigner(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the current system time of the
+    machine the operation is running on. Windows cannot overlap.
+    For example, in order to window into tumbling windows of 1 minute with an offset of
+    10 seconds:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            ...     .window(TumblingWindowAssigner(60000, 10000, False))
+
+    A WindowAssigner that windows elements into windows based on the timestamp of the elements.
+    Windows cannot overlap.
+    For example, in order to window into windows of 1 minute:
+     ::
+            >>> data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            ...     .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            ...     .window(TumblingWindowAssigner(60000, 0, True))
+    """
+
+    def __init__(self, size: int, offset: int, is_event_time: bool):
+        if abs(offset) >= size:
+            raise Exception("TumblingWindowAssigner parameters must satisfy abs(offset) < size")
+
+        self._size = size
+        self._offset = offset
+        self._is_event_time = is_event_time
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        if self._is_event_time is False:
+            current_processing_time = context.get_current_processing_time()
+            start = TimeWindow.get_window_start_with_offset(current_processing_time, self._offset,
+                                                            self._size)
+            return [TimeWindow(start, start + self._size)]
+        else:
+            if timestamp > MIN_LONG_VALUE:
+                start = TimeWindow.get_window_start_with_offset(timestamp, self._offset, self._size)
+                return [TimeWindow(start, start + self._size)]
+            else:
+                raise Exception("Record has MIN_LONG_VALUE timestamp (= no timestamp marker). "
+                                + "Is the time characteristic set to 'ProcessingTime', or did you forget to call "
+                                + "'data_stream.assign_timestamps_and_watermarks(...)'?")
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        if self._is_event_time is True:
+            return EventTimeTrigger()
+        else:
+            return ProcessingTimeTrigger()
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return self._is_event_time
+
+    def __repr__(self):
+        return "TumblingWindowAssigner(%s, %s, %s)" % (
+            self._size, self._offset, self._is_event_time)
+
+
+class CountTumblingWindowAssigner(WindowAssigner[T, CountWindow]):
+    """
+    A WindowAssigner that windows elements into fixed-size windows based on the count number
+    of the elements. Windows cannot overlap.
+    """
+
+    def __init__(self, window_size: int):
+        """
+        Creates a CountTumblingWindowAssigner that assigns elements to tumbling count
+        windows.
+
+        :param window_size: The size of the windows in number of elements.
+        """
+        self._window_size = window_size
+        self._counter_state_descriptor = ReducingStateDescriptor(
+            "assigner_counter", lambda a, b: a + b, Types.LONG())
+
+    @staticmethod
+    def of(window_size: int):
+        return CountTumblingWindowAssigner(window_size)
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[W]:
+        counter = context.get_runtime_context().get_reducing_state(
+            self._counter_state_descriptor)
+        if counter.get() is None:
+            counter.add(0)
+        result = [CountWindow(counter.get() // self._window_size)]
+        counter.add(1)
+        return result
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        return CountTrigger(self._window_size)
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return CountWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self) -> str:
+        return "CountTumblingWindowAssigner(%s)" % (self._window_size)
+
+
+class SlidingWindowAssigner(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into sliding windows based either on the current
+    system time of the machine the operation is running on (processing time) or on the
+    timestamp of the elements (event time). Windows can possibly overlap.
+
+    For example, in order to window into windows of 1 minute, every 10 seconds, in
+    processing time:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            ...     .window(SlidingWindowAssigner(60000, 10000, 0, False))
+
+    Or, in event time:
+    ::
+            >>> data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            ...     .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            ...     .window(SlidingWindowAssigner(60000, 10000, 0, True))
+    """
+
+    def __init__(self, size: int, slide: int, offset: int, is_event_time: bool):
+        if abs(offset) >= slide or size <= 0:
+            raise Exception("SlidingWindowAssigner parameters must satisfy "
+                            + "abs(offset) < slide and size > 0")
+
+        self._size = size
+        self._slide = slide
+        self._offset = offset
+        self._is_event_time = is_event_time
+        self._pane_size = math.gcd(size, slide)
+        self._num_panes_per_window = size // self._pane_size
+
+    def assign_windows(
+        self,
+        element: T,
+        timestamp: int,
+        context: 'WindowAssigner.WindowAssignerContext') -> Collection[W]:
+        if not self._is_event_time:
+            current_processing_time = context.get_current_processing_time()
+            last_start = TimeWindow.get_window_start_with_offset(
+                current_processing_time, self._offset, self._slide)
+            windows = [TimeWindow(start, start + self._size)
+                       for start in range(last_start,
+                                          current_processing_time - self._size, -self._slide)]
+            return windows
+        else:
+            if timestamp > MIN_LONG_VALUE:
+                last_start = TimeWindow.get_window_start_with_offset(timestamp,
+                                                                     self._offset, self._slide)
+                windows = [TimeWindow(start, start + self._size)
+                           for start in range(last_start, timestamp - self._size, -self._slide)]
+                return windows
+            else:
+                raise Exception("Record has MIN_LONG_VALUE timestamp (= no timestamp marker). "
+                                + "Is the time characteristic set to 'ProcessingTime', "
+                                  "or did you forget to call "
+                                + "'data_stream.assign_timestamps_and_watermarks(...)'?")
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        if self._is_event_time:
+            return EventTimeTrigger()
+        else:
+            return ProcessingTimeTrigger()
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return self._is_event_time
+
+    def __repr__(self) -> str:
+        return "SlidingWindowAssigner(%s, %s, %s, %s)" % (
+            self._size, self._slide, self._offset, self._is_event_time)
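For reference, a standalone sketch (not part of the patch) of how the sliding assignment above enumerates windows. `get_window_start_with_offset` mirrors the `TimeWindow.get_window_start_with_offset` formula; for a timestamp the assigner walks backwards from the last aligned start in steps of `slide` while the window still covers the timestamp.

```python
def get_window_start_with_offset(timestamp: int, offset: int, window_size: int) -> int:
    # Aligns the timestamp down to the last window start at or before it.
    return timestamp - (timestamp - offset + window_size) % window_size

def sliding_windows(timestamp: int, size: int, slide: int, offset: int = 0):
    # Enumerate all (start, end) windows containing the timestamp,
    # from the latest start backwards in steps of `slide`.
    last_start = get_window_start_with_offset(timestamp, offset, slide)
    return [(start, start + size)
            for start in range(last_start, timestamp - size, -slide)]

# A timestamp of 65 000 ms with 60 s windows sliding every 10 s falls into
# size // slide = 6 overlapping windows.
print(sliding_windows(65_000, 60_000, 10_000))
```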
+
+
+class CountSlidingWindowAssigner(WindowAssigner[T, CountWindow]):
+    """
+    A WindowAssigner that windows elements into sliding windows based on the count number of
+    the elements. Windows can possibly overlap.
+    """
+
+    def __init__(self, window_size: int, window_slide: int):
+        """
+        Windows this KeyedStream into sliding count windows.
+        :param window_size: The size of the windows in number of elements.
+        :param window_slide: The slide interval in number of elements.
+        """
+        self._window_size = window_size
+        self._window_slide = window_slide
+        self._count = None  # type: ValueState
+        self._counter_state_descriptor = ReducingStateDescriptor(
+            "slide-count-assigner", lambda a, b: a + b, Types.LONG())
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[W]:
+        count_descriptor = ValueStateDescriptor('slide-count-assigner', Types.LONG())
+        self._count = context.get_runtime_context().get_state(count_descriptor)
+        count_value = self._count.value()
+        if count_value is None:
+            current_count = 0
+        else:
+            current_count = count_value
+        self._count.update(current_count + 1)
+        last_id = current_count // self._window_slide
+        last_start = last_id * self._window_slide
+        last_end = last_start + self._window_size - 1
+        windows = []
+        while last_id >= 0 and last_start <= current_count <= last_end:
+            windows.append(CountWindow(last_id))
+            last_id -= 1
+            last_start -= self._window_slide
+            last_end -= self._window_slide
+        return windows
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        return CountTrigger(self._window_size)
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return CountWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self):
+        return "CountSlidingWindowAssigner(%s, %s)" % (self._window_size, self._window_slide)
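For reference, a standalone sketch (not part of the patch) of the sliding-count assignment: an element at 0-based position `current_count` belongs to every window id whose range `[id * slide, id * slide + size - 1]` covers that position.

```python
def sliding_count_window_ids(current_count: int, window_size: int, window_slide: int):
    # Start from the latest possible window id and walk backwards while the
    # window's element range still contains the current position.
    last_id = current_count // window_slide
    windows = []
    while last_id >= 0 and \
            last_id * window_slide <= current_count <= last_id * window_slide + window_size - 1:
        windows.append(last_id)
        last_id -= 1
    return windows

# With size=4 and slide=2, the 6th element (position 5) belongs to windows 2 and 1.
print(sliding_count_window_ids(5, 4, 2))
```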
+
+
+class SessionWindowAssigner(MergingWindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into sessions based on the timestamp. Windows
+    cannot overlap.
+    """
+
+    def __init__(self, session_gap: int, is_event_time: bool):
+        if session_gap <= 0:
+            raise Exception("SessionWindowAssigner parameters must satisfy session_gap > 0")
+
+        self._session_gap = session_gap
+        self._is_event_time = is_event_time
+
+    def merge_windows(self,
+                      windows: Iterable[W],
+                      callback: 'MergingWindowAssigner.MergeCallback[W]') -> None:
+        window_list = [w for w in windows]
+        window_list.sort()
+        for i in range(1, len(window_list)):

Review comment:
       This logic is incorrect when receiving late records; check out the Java implementation.
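To illustrate the reviewer's point, here is a hedged sketch (names and exact semantics are assumptions, not Flink's actual implementation) of the interval-merging approach used on the Java side: sort the session windows by start and coalesce every run of intersecting intervals, so that a late record's window merges correctly with both of its neighbours.

```python
def merge_time_windows(windows):
    # windows: iterable of (start, end) tuples. Returns a list of
    # (merged_window, [member_windows]) pairs, coalescing intersecting runs.
    sorted_windows = sorted(windows)
    merged = []
    current, members = None, []
    for w in sorted_windows:
        if current is not None and w[0] <= current[1]:
            # w intersects (or touches) the running merge: extend it.
            current = (current[0], max(current[1], w[1]))
            members.append(w)
        else:
            if current is not None:
                merged.append((current, members))
            current, members = w, [w]
    if current is not None:
        merged.append((current, members))
    return merged

# Windows arriving out of order (e.g. from a late record) still coalesce
# into a single session because merging is done over the sorted list.
print(merge_time_windows([(10, 20), (0, 12), (25, 30)]))
```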




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 47cdf91d5086b292f20d40700c7bd9052e32a6e0 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32897",
       "triggerID" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "678546804066c142c23cdcc36e88dc442210282a",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900",
       "triggerID" : "678546804066c142c23cdcc36e88dc442210282a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 678546804066c142c23cdcc36e88dc442210282a Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900) 
   * 4cf2279af74f89d0f9f8b446ccd8b72bf57495cb UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32897",
       "triggerID" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "678546804066c142c23cdcc36e88dc442210282a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900",
       "triggerID" : "678546804066c142c23cdcc36e88dc442210282a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32908",
       "triggerID" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32914",
       "triggerID" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32946",
       "triggerID" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "triggerType" : "PUSH"
     }, {
       "hash" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32947",
       "triggerID" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "195bf4369bf1eb0cdc1727847e63ce0d6fd31f66",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32952",
       "triggerID" : "195bf4369bf1eb0cdc1727847e63ce0d6fd31f66",
       "triggerType" : "PUSH"
     }, {
       "hash" : "894eb800e4045264cec298908892659466e78b7b",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "894eb800e4045264cec298908892659466e78b7b",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 195bf4369bf1eb0cdc1727847e63ce0d6fd31f66 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32952) 
   * 894eb800e4045264cec298908892659466e78b7b UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32897",
       "triggerID" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "678546804066c142c23cdcc36e88dc442210282a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900",
       "triggerID" : "678546804066c142c23cdcc36e88dc442210282a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32908",
       "triggerID" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32914",
       "triggerID" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32946",
       "triggerID" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "triggerType" : "PUSH"
     }, {
       "hash" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32947",
       "triggerID" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "195bf4369bf1eb0cdc1727847e63ce0d6fd31f66",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32952",
       "triggerID" : "195bf4369bf1eb0cdc1727847e63ce0d6fd31f66",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 195bf4369bf1eb0cdc1727847e63ce0d6fd31f66 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32952) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 8e46affb05226eeed6b8eb20df971445139654a8 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * bb918383e68f2c6539ceab99853972a73e9fec0c Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423) 
   



[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * 8fea20f61b27f7b803c12798ad85dbe97e05dcdf Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405) 
   



[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * bb918383e68f2c6539ceab99853972a73e9fec0c Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423) 
   * 7cf23e6c62f998b19c138f9dd9b5e60787054f05 UNKNOWN
   



[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * bb918383e68f2c6539ceab99853972a73e9fec0c Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423) 
   



[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * ca91f03a2674f7041d3a66c618f833809c6e98aa Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675) 
   * 47cdf91d5086b292f20d40700c7bd9052e32a6e0 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680) 
   * 1240d4313817b4ae0bfca21090eb6f1cbd40f78e UNKNOWN
   



[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 1240d4313817b4ae0bfca21090eb6f1cbd40f78e Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684) 
   * 4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d UNKNOWN
   



[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707) 
   * 14ae6d15c25f454fe7675bf7b0e129834b99ae37 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708) 
   * a36182858952a61ef43ccbdc184440354bb86110 UNKNOWN
   



[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * ce5259932767032a24e15f5475f31ff336d40a48 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32897) 
   * 678546804066c142c23cdcc36e88dc442210282a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900) 
   



[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 4cf2279af74f89d0f9f8b446ccd8b72bf57495cb Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32908) 
   



[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32897",
       "triggerID" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "678546804066c142c23cdcc36e88dc442210282a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900",
       "triggerID" : "678546804066c142c23cdcc36e88dc442210282a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32908",
       "triggerID" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32914",
       "triggerID" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 4cf2279af74f89d0f9f8b446ccd8b72bf57495cb Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32908) 
   * d178dd422b702f51ee85b41e43ec4b306c60aa8c Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32914) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 1240d4313817b4ae0bfca21090eb6f1cbd40f78e Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 14ae6d15c25f454fe7675bf7b0e129834b99ae37 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708) 
   * a36182858952a61ef43ccbdc184440354bb86110 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * d9ffed1defa7514c27d3ba9a9e3008de8b84bc68 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707) 
   * 14ae6d15c25f454fe7675bf7b0e129834b99ae37 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "76fd3d06d35cd82a85bb6e48fa90338485cc0273",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "76fd3d06d35cd82a85bb6e48fa90338485cc0273",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * a36182858952a61ef43ccbdc184440354bb86110 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721) 
   * 76fd3d06d35cd82a85bb6e48fa90338485cc0273 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * d9ffed1defa7514c27d3ba9a9e3008de8b84bc68 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805) 
   * 0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * d9ffed1defa7514c27d3ba9a9e3008de8b84bc68 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805) 
   * 0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32897",
       "triggerID" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "678546804066c142c23cdcc36e88dc442210282a",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900",
       "triggerID" : "678546804066c142c23cdcc36e88dc442210282a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * ce5259932767032a24e15f5475f31ff336d40a48 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32897) 
   * 678546804066c142c23cdcc36e88dc442210282a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900) 
   * 4cf2279af74f89d0f9f8b446ccd8b72bf57495cb UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32897",
       "triggerID" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "678546804066c142c23cdcc36e88dc442210282a",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900",
       "triggerID" : "678546804066c142c23cdcc36e88dc442210282a",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 678546804066c142c23cdcc36e88dc442210282a Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32897",
       "triggerID" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "678546804066c142c23cdcc36e88dc442210282a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900",
       "triggerID" : "678546804066c142c23cdcc36e88dc442210282a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32908",
       "triggerID" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32914",
       "triggerID" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32946",
       "triggerID" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "triggerType" : "PUSH"
     }, {
       "hash" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32947",
       "triggerID" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "195bf4369bf1eb0cdc1727847e63ce0d6fd31f66",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "195bf4369bf1eb0cdc1727847e63ce0d6fd31f66",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * ff0bcbd841960c3e524a62c3c45f426707fb5306 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32946) 
   * 84846361d6db21a5ac721494ebb5badf23b274a1 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32947) 
   * 195bf4369bf1eb0cdc1727847e63ce0d6fd31f66 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 195bf4369bf1eb0cdc1727847e63ce0d6fd31f66 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32952) 
   * 894eb800e4045264cec298908892659466e78b7b UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] dianfu closed pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
dianfu closed pull request #18957:
URL: https://github.com/apache/flink/pull/18957


   





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 84846361d6db21a5ac721494ebb5badf23b274a1 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32947) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>








[GitHub] [flink] dianfu commented on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
dianfu commented on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1066221067


   The test failure is unrelated, and the build has passed: https://dev.azure.com/dianfu/Flink/_build/results?buildId=562&view=results





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 1240d4313817b4ae0bfca21090eb6f1cbd40f78e Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684) 
   * 4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707) 
   * 14ae6d15c25f454fe7675bf7b0e129834b99ae37 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 8e46affb05226eeed6b8eb20df971445139654a8 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535) 
   * 610f23f625a4559bef6f000916a2de37a5cb3b38 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "b6fd8d95b532e070c17d4cdbfc7ad85125d9df38",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "b6fd8d95b532e070c17d4cdbfc7ad85125d9df38",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 610f23f625a4559bef6f000916a2de37a5cb3b38 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666) 
   * ca91f03a2674f7041d3a66c618f833809c6e98aa Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675) 
   * b6fd8d95b532e070c17d4cdbfc7ad85125d9df38 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 47cdf91d5086b292f20d40700c7bd9052e32a6e0 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680) 
   * 1240d4313817b4ae0bfca21090eb6f1cbd40f78e UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 610f23f625a4559bef6f000916a2de37a5cb3b38 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666) 
   * ca91f03a2674f7041d3a66c618f833809c6e98aa UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 1240d4313817b4ae0bfca21090eb6f1cbd40f78e Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 8fea20f61b27f7b803c12798ad85dbe97e05dcdf Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * a4a5fc419dbf82c079a24327846d4a7baa500959 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a7cdfbf43f34e7bf4de81ad0b1a48edb6bdff7da",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "a7cdfbf43f34e7bf4de81ad0b1a48edb6bdff7da",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * a4a5fc419dbf82c079a24327846d4a7baa500959 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531) 
   * a7cdfbf43f34e7bf4de81ad0b1a48edb6bdff7da UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 14ae6d15c25f454fe7675bf7b0e129834b99ae37 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * d9ffed1defa7514c27d3ba9a9e3008de8b84bc68 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805) 
   * 0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] Vancior commented on a change in pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
Vancior commented on a change in pull request #18957:
URL: https://github.com/apache/flink/pull/18957#discussion_r823283875



##########
File path: flink-python/pyflink/datastream/window.py
##########
@@ -517,3 +538,721 @@ def __init__(self,
         self.window_state_descriptor = window_state_descriptor
         self.internal_window_function = internal_window_function
         self.window_serializer = window_serializer
+
+
+class EventTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[T, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to
+    which a pane belongs.
+    """
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[T, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+
+    def __init__(self, window_size: int):
+        self._window_size = window_size
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.LONG())
+
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        state.add(1)
+        if state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted.
+            # The window is not purged though, all elements are retained.
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def can_merge(self) -> bool:
+        return True
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
+
+class TumblingWindowAssigner(WindowAssigner[T, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into non-overlapping, fixed-size windows based
+    either on the current system time of the machine the operation is running on or on the
+    timestamp of the elements, depending on ``is_event_time``.
+    For example, in order to window into processing-time windows of 1 minute with an offset
+    of 10 seconds:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            ...     .window(TumblingWindowAssigner(Time.minutes(1), Time.seconds(10), False))
+
+    In order to window into event-time windows of 1 minute:
+    ::
+            >>> data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            ...     .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            ...     .window(TumblingWindowAssigner(60000, 0, True))
+    """
+
+    def __init__(self, size: int, offset: int, is_event_time: bool):
+        if abs(offset) >= size:
+            raise Exception("TumblingWindowAssigner parameters must satisfy abs(offset) < size")
+
+        self._size = size
+        self._offset = offset
+        self._is_event_time = is_event_time
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        if not self._is_event_time:
+            current_processing_time = context.get_current_processing_time()
+            start = TimeWindow.get_window_start_with_offset(current_processing_time, self._offset,
+                                                            self._size)
+            return [TimeWindow(start, start + self._size)]
+        else:
+            if timestamp > MIN_LONG_VALUE:
+                start = TimeWindow.get_window_start_with_offset(timestamp, self._offset, self._size)
+                return [TimeWindow(start, start + self._size)]
+            else:
+                raise Exception("Record has MIN_LONG_VALUE timestamp (= no timestamp marker). "

Review comment:
       maybe use `jvm.java.lang.Long.MIN_VALUE` instead?
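
       For reference, a minimal sketch of what the no-timestamp check boils down to. The names `MIN_LONG_VALUE` and `has_timestamp` are illustrative here, not necessarily how the PR defines them; the point is that a pure-Python constant can match the semantics of `jvm.java.lang.Long.MIN_VALUE` without a Py4J round trip to a running JVM:

       ```python
       # Java's Long.MIN_VALUE is -2**63; Flink uses it as the "no timestamp" marker
       # on records. A pure-Python equivalent avoids needing a JVM gateway lookup.
       MIN_LONG_VALUE = -(1 << 63)


       def has_timestamp(timestamp: int) -> bool:
           # Records without an assigned event-time timestamp carry the marker value.
           return timestamp > MIN_LONG_VALUE


       print(has_timestamp(1646193600000))   # prints True for a real epoch-millis value
       print(has_timestamp(MIN_LONG_VALUE))  # prints False for the marker
       ```

       The trade-off the comment raises is single-sourcing the constant from the JVM versus avoiding a gateway dependency in code that also runs in pure-Python worker processes.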







[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * a36182858952a61ef43ccbdc184440354bb86110 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721) 
   * a27dbdfdb1bb06a2748e3dd021c735312404e8a7 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * d9ffed1defa7514c27d3ba9a9e3008de8b84bc68 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805) 
   * 0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32897",
       "triggerID" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "678546804066c142c23cdcc36e88dc442210282a",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900",
       "triggerID" : "678546804066c142c23cdcc36e88dc442210282a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32908",
       "triggerID" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 678546804066c142c23cdcc36e88dc442210282a Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900) 
   * 4cf2279af74f89d0f9f8b446ccd8b72bf57495cb Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32908) 
   * d178dd422b702f51ee85b41e43ec4b306c60aa8c UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32897",
       "triggerID" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "678546804066c142c23cdcc36e88dc442210282a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900",
       "triggerID" : "678546804066c142c23cdcc36e88dc442210282a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32908",
       "triggerID" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32914",
       "triggerID" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32946",
       "triggerID" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "triggerType" : "PUSH"
     }, {
       "hash" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32947",
       "triggerID" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "195bf4369bf1eb0cdc1727847e63ce0d6fd31f66",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "195bf4369bf1eb0cdc1727847e63ce0d6fd31f66",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 84846361d6db21a5ac721494ebb5badf23b274a1 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32947) 
   * 195bf4369bf1eb0cdc1727847e63ce0d6fd31f66 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 84846361d6db21a5ac721494ebb5badf23b274a1 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32947) 
   





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810) 
   * 76e2fe5afddd2c95642a772eac434325ccd63fa4 UNKNOWN
   





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810) 
   * ce5259932767032a24e15f5475f31ff336d40a48 UNKNOWN
   





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * d178dd422b702f51ee85b41e43ec4b306c60aa8c Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32914) 
   





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32897",
       "triggerID" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "678546804066c142c23cdcc36e88dc442210282a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900",
       "triggerID" : "678546804066c142c23cdcc36e88dc442210282a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32908",
       "triggerID" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32914",
       "triggerID" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32946",
       "triggerID" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "triggerType" : "PUSH"
     }, {
       "hash" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32947",
       "triggerID" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "195bf4369bf1eb0cdc1727847e63ce0d6fd31f66",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32952",
       "triggerID" : "195bf4369bf1eb0cdc1727847e63ce0d6fd31f66",
       "triggerType" : "PUSH"
     }, {
       "hash" : "894eb800e4045264cec298908892659466e78b7b",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32964",
       "triggerID" : "894eb800e4045264cec298908892659466e78b7b",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 195bf4369bf1eb0cdc1727847e63ce0d6fd31f66 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32952) 
   * 894eb800e4045264cec298908892659466e78b7b Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32964) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org
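

The visible "CI report" list in each comment is derived from the hidden `Meta data` JSON above it: commits whose status is `DELETED` are dropped, `UNKNOWN` entries (URL still `TBD`) are shown as bare hashes, and everything else links to its Azure build. The sketch below reproduces that rendering; it is an assumption inferred from the observed bot output, not the actual flinkbot implementation, and the short hashes in the sample data are placeholders.

```python
import json

def render_ci_report(meta: dict) -> list:
    """Render CI-report lines from a flinkbot metadata dict.

    Assumed rules (inferred from the archived comments):
    - DELETED entries are superseded commits and are omitted.
    - UNKNOWN entries have no build yet, so only the hash is printed.
    - All other statuses link to the Azure build URL.
    """
    lines = []
    for entry in meta["metaDataEntries"]:
        status = entry["status"]
        if status == "DELETED":
            continue
        if status == "UNKNOWN":
            lines.append("* {} UNKNOWN".format(entry["hash"]))
        else:
            lines.append("* {} Azure: [{}]({})".format(
                entry["hash"], status, entry["url"]))
    return lines

# Sample metadata mirroring the structure of the hidden JSON comment;
# hashes and the build URL are shortened placeholders.
meta = json.loads("""
{
  "version": 1,
  "metaDataEntries": [
    {"hash": "ec1e0a4", "status": "UNKNOWN", "url": "TBD"},
    {"hash": "8484636", "status": "DELETED", "url": "https://example.invalid/b1"},
    {"hash": "894eb80", "status": "PENDING", "url": "https://example.invalid/b2"}
  ]
}
""")

for line in render_ci_report(meta):
    print(line)
```

This matches the pattern seen throughout the thread: once a new push arrives, the bot marks earlier builds `DELETED` and the report collapses to the still-relevant commits.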



[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 610f23f625a4559bef6f000916a2de37a5cb3b38 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666) 
   * ca91f03a2674f7041d3a66c618f833809c6e98aa Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32897",
       "triggerID" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "678546804066c142c23cdcc36e88dc442210282a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900",
       "triggerID" : "678546804066c142c23cdcc36e88dc442210282a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32908",
       "triggerID" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32914",
       "triggerID" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32946",
       "triggerID" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "triggerType" : "PUSH"
     }, {
       "hash" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * ff0bcbd841960c3e524a62c3c45f426707fb5306 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32946) 
   * 84846361d6db21a5ac721494ebb5badf23b274a1 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32897",
       "triggerID" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "678546804066c142c23cdcc36e88dc442210282a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900",
       "triggerID" : "678546804066c142c23cdcc36e88dc442210282a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32908",
       "triggerID" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32914",
       "triggerID" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32946",
       "triggerID" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "triggerType" : "PUSH"
     }, {
       "hash" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32947",
       "triggerID" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "195bf4369bf1eb0cdc1727847e63ce0d6fd31f66",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32952",
       "triggerID" : "195bf4369bf1eb0cdc1727847e63ce0d6fd31f66",
       "triggerType" : "PUSH"
     }, {
       "hash" : "894eb800e4045264cec298908892659466e78b7b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32964",
       "triggerID" : "894eb800e4045264cec298908892659466e78b7b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "770b70d23b1f989d2de869f0d92c37615743513e",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33018",
       "triggerID" : "770b70d23b1f989d2de869f0d92c37615743513e",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 770b70d23b1f989d2de869f0d92c37615743513e Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33018) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4289112d7f34337784d436380f7f906cd8b1da7e",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "4289112d7f34337784d436380f7f906cd8b1da7e",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * bb918383e68f2c6539ceab99853972a73e9fec0c Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423) 
   * 4289112d7f34337784d436380f7f906cd8b1da7e UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 8e46affb05226eeed6b8eb20df971445139654a8 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * a4a5fc419dbf82c079a24327846d4a7baa500959 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 47cdf91d5086b292f20d40700c7bd9052e32a6e0 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680) 
   * 1240d4313817b4ae0bfca21090eb6f1cbd40f78e UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 14ae6d15c25f454fe7675bf7b0e129834b99ae37 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708) 
   * a36182858952a61ef43ccbdc184440354bb86110 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 14ae6d15c25f454fe7675bf7b0e129834b99ae37 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32897",
       "triggerID" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810) 
   * ce5259932767032a24e15f5475f31ff336d40a48 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32897) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 1240d4313817b4ae0bfca21090eb6f1cbd40f78e Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684) 
   * 4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d UNKNOWN
   





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32897",
       "triggerID" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "678546804066c142c23cdcc36e88dc442210282a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900",
       "triggerID" : "678546804066c142c23cdcc36e88dc442210282a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32908",
       "triggerID" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32914",
       "triggerID" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32946",
       "triggerID" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "triggerType" : "PUSH"
     }, {
       "hash" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * d178dd422b702f51ee85b41e43ec4b306c60aa8c Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32914) 
   * ff0bcbd841960c3e524a62c3c45f426707fb5306 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32946) 
   * 84846361d6db21a5ac721494ebb5badf23b274a1 UNKNOWN
   





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 195bf4369bf1eb0cdc1727847e63ce0d6fd31f66 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32952) 
   * 894eb800e4045264cec298908892659466e78b7b UNKNOWN
   





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 195bf4369bf1eb0cdc1727847e63ce0d6fd31f66 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32952) 
   





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 4cf2279af74f89d0f9f8b446ccd8b72bf57495cb Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32908) 
   





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32897",
       "triggerID" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "678546804066c142c23cdcc36e88dc442210282a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900",
       "triggerID" : "678546804066c142c23cdcc36e88dc442210282a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32908",
       "triggerID" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32914",
       "triggerID" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32946",
       "triggerID" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "triggerType" : "PUSH"
     }, {
       "hash" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32947",
       "triggerID" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "195bf4369bf1eb0cdc1727847e63ce0d6fd31f66",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32952",
       "triggerID" : "195bf4369bf1eb0cdc1727847e63ce0d6fd31f66",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 195bf4369bf1eb0cdc1727847e63ce0d6fd31f66 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32952) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32897",
       "triggerID" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "678546804066c142c23cdcc36e88dc442210282a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900",
       "triggerID" : "678546804066c142c23cdcc36e88dc442210282a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32908",
       "triggerID" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32914",
       "triggerID" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32946",
       "triggerID" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * ff0bcbd841960c3e524a62c3c45f426707fb5306 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32946) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * bb918383e68f2c6539ceab99853972a73e9fec0c Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423) 
   * a4a5fc419dbf82c079a24327846d4a7baa500959 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * bb918383e68f2c6539ceab99853972a73e9fec0c Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] dianfu commented on a change in pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
dianfu commented on a change in pull request #18957:
URL: https://github.com/apache/flink/pull/18957#discussion_r818274208



##########
File path: flink-python/pyflink/fn_execution/datastream/window/window_assigner.py
##########
@@ -0,0 +1,376 @@
+import math
+from typing import Iterable, Collection
+
+from pyflink.common import TypeSerializer, Time
+from pyflink.common.typeinfo import Types
+from pyflink.datastream import Trigger
+from pyflink.datastream.state import ValueStateDescriptor, ValueState, ReducingStateDescriptor
+from pyflink.datastream.window import TimeWindow, CountWindow, WindowAssigner, T, TimeWindowSerializer, TriggerResult, \
+    CountWindowSerializer, MergingWindowAssigner
+from pyflink.fn_execution.table.window_context import W
+
+
+class EventTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: object,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+

Review comment:
       Need to override the `can_merge` method.
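
A minimal standalone sketch of such an override (no PyFlink dependency; the class and method names mirror the quoted code, but this is an illustrative reconstruction, not the actual PyFlink implementation):

```python
# Sketch: a trigger base class whose default can_merge() reports no merge
# support, and an event-time trigger that overrides it, as the reviewer asks.
class Trigger:
    def can_merge(self) -> bool:
        # By default, a trigger cannot be used with merging window assigners.
        return False


class EventTimeTrigger(Trigger):
    def can_merge(self) -> bool:
        # Event-time triggers work with merging window assigners
        # (e.g. session windows), so they must report merge support.
        return True
```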

##########
File path: flink-python/pyflink/fn_execution/datastream/window/window_assigner.py
##########
@@ -0,0 +1,376 @@
+import math

Review comment:
       Missing license header.

##########
File path: flink-python/pyflink/fn_execution/datastream/window/window_assigner.py
##########
@@ -0,0 +1,376 @@
+import math
+from typing import Iterable, Collection
+
+from pyflink.common import TypeSerializer, Time
+from pyflink.common.typeinfo import Types
+from pyflink.datastream import Trigger
+from pyflink.datastream.state import ValueStateDescriptor, ValueState, ReducingStateDescriptor
+from pyflink.datastream.window import TimeWindow, CountWindow, WindowAssigner, T, TimeWindowSerializer, TriggerResult, \
+    CountWindowSerializer, MergingWindowAssigner
+from pyflink.fn_execution.table.window_context import W
+
+
+class EventTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: object,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        windowMaxTimestamp = window.max_timestamp()
+        if windowMaxTimestamp >= ctx.get_current_watermark():

Review comment:
       ```suggestion
           if windowMaxTimestamp > ctx.get_current_watermark():
   ```
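
The boundary case behind this suggestion can be shown in isolation: a window whose max timestamp equals the current watermark has already been passed by the watermark, so only strictly later timestamps need a timer. A simplified sketch (plain Python, not PyFlink code):

```python
# Simplified sketch of the on_merge boundary check the reviewer suggests.
def should_register_timer(window_max_timestamp: int, current_watermark: int) -> bool:
    # A timer is only useful for timestamps the watermark has not yet passed;
    # with `>=` a timer would also be registered for an already-complete window.
    return window_max_timestamp > current_watermark
```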

##########
File path: flink-python/pyflink/fn_execution/datastream/window/window_assigner.py
##########
@@ -0,0 +1,376 @@
+import math
+from typing import Iterable, Collection
+
+from pyflink.common import TypeSerializer, Time
+from pyflink.common.typeinfo import Types
+from pyflink.datastream import Trigger
+from pyflink.datastream.state import ValueStateDescriptor, ValueState, ReducingStateDescriptor
+from pyflink.datastream.window import TimeWindow, CountWindow, WindowAssigner, T, TimeWindowSerializer, TriggerResult, \
+    CountWindowSerializer, MergingWindowAssigner
+from pyflink.fn_execution.table.window_context import W
+
+
+class EventTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: object,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        windowMaxTimestamp = window.max_timestamp()
+        if windowMaxTimestamp >= ctx.get_current_watermark():
+            ctx.register_event_time_timer(windowMaxTimestamp)
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp());
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,

Review comment:
       Need to override the `can_merge` method.

##########
File path: flink-python/pyflink/fn_execution/datastream/window/window_assigner.py
##########
@@ -0,0 +1,376 @@
+import math
+from typing import Iterable, Collection
+
+from pyflink.common import TypeSerializer, Time
+from pyflink.common.typeinfo import Types
+from pyflink.datastream import Trigger
+from pyflink.datastream.state import ValueStateDescriptor, ValueState, ReducingStateDescriptor
+from pyflink.datastream.window import TimeWindow, CountWindow, WindowAssigner, T, TimeWindowSerializer, TriggerResult, \
+    CountWindowSerializer, MergingWindowAssigner
+from pyflink.fn_execution.table.window_context import W
+
+
+class EventTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: object,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        windowMaxTimestamp = window.max_timestamp()
+        if windowMaxTimestamp >= ctx.get_current_watermark():
+            ctx.register_event_time_timer(windowMaxTimestamp)
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to which a pane belongs.

Review comment:
       Checkstyle issue: this line exceeds 100 characters.

##########
File path: flink-python/pyflink/fn_execution/datastream/window/window_assigner.py
##########
@@ -0,0 +1,376 @@
+import math
+from typing import Iterable, Collection
+
+from pyflink.common import TypeSerializer, Time
+from pyflink.common.typeinfo import Types
+from pyflink.datastream import Trigger
+from pyflink.datastream.state import ValueStateDescriptor, ValueState, ReducingStateDescriptor
+from pyflink.datastream.window import TimeWindow, CountWindow, WindowAssigner, T, TimeWindowSerializer, TriggerResult, \
+    CountWindowSerializer, MergingWindowAssigner
+from pyflink.fn_execution.table.window_context import W
+
+
+class EventTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: object,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp >= ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: W,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def clear(self,
+              window: W,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[object, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+    def __init__(self, window_size: int):
+        self._window_size = int(window_size)

Review comment:
       ```suggestion
           self._window_size = window_size
       ```

##########
File path: flink-python/pyflink/fn_execution/datastream/window/window_assigner.py
##########
@@ -0,0 +1,376 @@
+import math
+from typing import Iterable, Collection
+
+from pyflink.common import TypeSerializer, Time
+from pyflink.common.typeinfo import Types
+from pyflink.datastream import Trigger
+from pyflink.datastream.state import ValueStateDescriptor, ValueState, ReducingStateDescriptor
+from pyflink.datastream.window import TimeWindow, CountWindow, WindowAssigner, T, TimeWindowSerializer, TriggerResult, \
+    CountWindowSerializer, MergingWindowAssigner
+from pyflink.fn_execution.table.window_context import W
+
+
+class EventTimeTrigger(Trigger[object, TimeWindow]):

Review comment:
       ```suggestion
   class EventTimeTrigger(Trigger[T, TimeWindow]):
       ```

##########
File path: flink-python/pyflink/fn_execution/datastream/window/window_assigner.py
##########
@@ -0,0 +1,376 @@
+import math
+from typing import Iterable, Collection
+
+from pyflink.common import TypeSerializer, Time
+from pyflink.common.typeinfo import Types
+from pyflink.datastream import Trigger
+from pyflink.datastream.state import ValueStateDescriptor, ValueState, ReducingStateDescriptor
+from pyflink.datastream.window import TimeWindow, CountWindow, WindowAssigner, T, TimeWindowSerializer, TriggerResult, \
+    CountWindowSerializer, MergingWindowAssigner
+from pyflink.fn_execution.table.window_context import W
+
+
+class EventTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: object,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp >= ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: W,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def clear(self,
+              window: W,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[object, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+    def __init__(self, window_size: int):
+        self._window_size = int(window_size)
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.BIG_INT())
+
+    def on_element(self,
+                   element: object,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        state.add(1)
+        if state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted.
+            # The window is not purged, though; all elements are retained.
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
+
+class TumblingWindowAssigner(WindowAssigner[object, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the current system time of the machine the operation is running on. Windows cannot overlap.
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(Time.minutes(1), Time.seconds(10), False))
+
+    A WindowAssigner that windows elements into windows based on the timestamp of the elements. Windows cannot overlap.
+    For example, in order to window into windows of 1 minute:
+     ::
+            >>> data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            >>> .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(Time.minutes(1), Time.seconds(0), True))
+    """
+    def __init__(self, size: Time, offset: Time, is_event_time: bool):

Review comment:
       ```suggestion
       def __init__(self, size: int, offset: int, is_event_time: bool):
       ```
   Use int to keep it consistent with the Java API. Besides, it would be great to add a few static utility methods, just as is done in the Java API.
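
   To illustrate the reviewer's point, a minimal self-contained sketch of such static utility methods (mirroring `TumblingEventTimeWindows.of(...)` in the Java API) could look as follows. The `Time` stand-in and the factory method names here are assumptions for illustration, not the actual PyFlink API:

   ```python
   class Time:
       """Minimal stand-in for pyflink.common.Time, kept self-contained."""
       def __init__(self, milliseconds: int):
           self._ms = milliseconds

       @staticmethod
       def minutes(n: int) -> 'Time':
           return Time(n * 60 * 1000)

       @staticmethod
       def seconds(n: int) -> 'Time':
           return Time(n * 1000)

       @staticmethod
       def milliseconds(n: int) -> 'Time':
           return Time(n)

       def to_milliseconds(self) -> int:
           return self._ms


   class TumblingWindowAssigner:
       """Sketch: the constructor takes plain ints (milliseconds), while the
       hypothetical factory methods accept Time and convert once."""
       def __init__(self, size: int, offset: int, is_event_time: bool):
           self._size = size
           self._offset = offset
           self._is_event_time = is_event_time

       @staticmethod
       def of_event_time(size: Time, offset: Time = Time.milliseconds(0)) -> 'TumblingWindowAssigner':
           return TumblingWindowAssigner(size.to_milliseconds(), offset.to_milliseconds(), True)

       @staticmethod
       def of_processing_time(size: Time, offset: Time = Time.milliseconds(0)) -> 'TumblingWindowAssigner':
           return TumblingWindowAssigner(size.to_milliseconds(), offset.to_milliseconds(), False)


   assigner = TumblingWindowAssigner.of_event_time(Time.minutes(1))
   ```

   With this shape, callers never handle raw milliseconds directly, which matches how the Java builders hide the unit conversion.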

##########
File path: flink-python/pyflink/fn_execution/datastream/window/window_assigner.py
##########
@@ -0,0 +1,376 @@
+import math
+from typing import Iterable, Collection
+
+from pyflink.common import TypeSerializer, Time
+from pyflink.common.typeinfo import Types
+from pyflink.datastream import Trigger
+from pyflink.datastream.state import ValueStateDescriptor, ValueState, ReducingStateDescriptor
+from pyflink.datastream.window import TimeWindow, CountWindow, WindowAssigner, T, TimeWindowSerializer, TriggerResult, \
+    CountWindowSerializer, MergingWindowAssigner
+from pyflink.fn_execution.table.window_context import W
+
+
+class EventTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: object,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp >= ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: W,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def clear(self,
+              window: W,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[object, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+    def __init__(self, window_size: int):
+        self._window_size = int(window_size)
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.BIG_INT())
+
+    def on_element(self,
+                   element: object,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        state.add(1)
+        if state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted.
+            # The window is not purged, though; all elements are retained.
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
+
+class TumblingWindowAssigner(WindowAssigner[object, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the current system time of the machine the operation is running on. Windows cannot overlap.
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(Time.minutes(1), Time.seconds(10), False))
+
+    A WindowAssigner that windows elements into windows based on the timestamp of the elements. Windows cannot overlap.
+    For example, in order to window into windows of 1 minute:
+     ::
+            >>> data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            >>> .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(Time.minutes(1), Time.seconds(0), True))
+    """
+    def __init__(self, size: Time, offset: Time, is_event_time: bool):
+        self._size = size.to_milliseconds()
+        self._offset = offset.to_milliseconds()
+        self._is_event_time = is_event_time
+
+    def assign_windows(self,
+                       element: object,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        if self._is_event_time is False:
+            timestamp = context.get_current_processing_time()
+
+        start = TimeWindow.get_window_start_with_offset(timestamp, self._offset, self._size)
+        return [TimeWindow(start, start + self._size)]
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        if self._is_event_time is True:
+            return EventTimeTrigger()
+        else:
+            return ProcessingTimeTrigger()
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return self._is_event_time
+
+    def __repr__(self):
+        return "TumblingWindowAssigner(%s, %s, %s)" % (self._size, self._offset, self._is_event_time)
+
+'''
+A WindowAssigner that windows elements into windows based on the number of the elements. Windows cannot overlap.

Review comment:
       could be removed
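
   For reference, the window-start computation used by `assign_windows` above (`TimeWindow.get_window_start_with_offset`) boils down to the standard Flink alignment formula. A self-contained sketch, ignoring the negative-timestamp adjustment of the real implementation:

   ```python
   def get_window_start_with_offset(timestamp: int, offset: int, window_size: int) -> int:
       # Align the timestamp to the start of the window that contains it,
       # shifted by the configured offset (all values in milliseconds).
       return timestamp - (timestamp - offset) % window_size

   # A 60s tumbling window with no offset: t=37.5s falls into [0s, 60s).
   start = get_window_start_with_offset(37_500, 0, 60_000)
   ```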

##########
File path: flink-python/pyflink/fn_execution/datastream/window/window_assigner.py
##########
@@ -0,0 +1,376 @@
+import math
+from typing import Iterable, Collection
+
+from pyflink.common import TypeSerializer, Time
+from pyflink.common.typeinfo import Types
+from pyflink.datastream import Trigger
+from pyflink.datastream.state import ValueStateDescriptor, ValueState, ReducingStateDescriptor
+from pyflink.datastream.window import TimeWindow, CountWindow, WindowAssigner, T, TimeWindowSerializer, TriggerResult, \
+    CountWindowSerializer, MergingWindowAssigner
+from pyflink.fn_execution.table.window_context import W
+
+
+class EventTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: object,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp >= ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: W,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def clear(self,
+              window: W,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[object, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+    def __init__(self, window_size: int):
+        self._window_size = int(window_size)
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.BIG_INT())
+
+    def on_element(self,
+                   element: object,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        state.add(1)
+        if state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted.
+            # The window is not purged, though; all elements are retained.
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
+
+class TumblingWindowAssigner(WindowAssigner[object, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the current system time of the machine the operation is running on. Windows cannot overlap.
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(Time.minutes(1), Time.seconds(10), False))
+
+    A WindowAssigner that windows elements into windows based on the timestamp of the elements. Windows cannot overlap.
+    For example, in order to window into windows of 1 minute:
+     ::
+            >>> data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            >>> .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(Time.minutes(1), Time.seconds(0), True))
+    """
+    def __init__(self, size: Time, offset: Time, is_event_time: bool):
+        self._size = size.to_milliseconds()
+        self._offset = offset.to_milliseconds()
+        self._is_event_time = is_event_time
+
+    def assign_windows(self,
+                       element: object,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        if self._is_event_time is False:
+            timestamp = context.get_current_processing_time()
+
+        start = TimeWindow.get_window_start_with_offset(timestamp, self._offset, self._size)
+        return [TimeWindow(start, start + self._size)]
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        if self._is_event_time is True:
+            return EventTimeTrigger()
+        else:
+            return ProcessingTimeTrigger()
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return self._is_event_time
+
+    def __repr__(self):
+        return "TumblingWindowAssigner(%s, %s, %s)" % (self._size, self._offset, self._is_event_time)
+
+'''
+A WindowAssigner that windows elements into windows based on the number of the elements. Windows cannot overlap.
+'''
+class CountTumblingWindowAssigner(WindowAssigner[object, CountWindow]):
+    """
+    A WindowAssigner that windows elements into fixed-size windows based on the count number
+    of the elements. Windows cannot overlap.
+    """
+    def __init__(self, window_size: int):
+        """
+        Windows this KeyedStream into tumbling count windows.
+        :param window_size: The size of the windows in number of elements.
+        """
+        self._window_size = window_size
+        self._counter_state_descriptor = ReducingStateDescriptor(
+            "assigner_counter", lambda a, b: a + b, Types.BIG_INT())
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[W]:
+        counter = context.get_runtime_context().get_reducing_state(
+            self._counter_state_descriptor)
+        if counter.get() is None:
+            counter.add(0)
+        result = [CountWindow(counter.get() // self._window_size)]
+        counter.add(1)
+        return result
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        return CountTrigger(self._window_size)
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return CountWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self) -> str:
+        return "CountTumblingWindowAssigner(%s)" % (self._window_size)
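
The count-based assignment above can be simulated without Flink state: the n-th element (0-based) lands in count window `n // window_size`, since the counter is read before being incremented. A plain-Python sketch of that behaviour:

```python
def count_tumbling_window_ids(num_elements: int, window_size: int):
    # Mimics CountTumblingWindowAssigner: the counter starts at 0 and each
    # element is assigned to window (counter // window_size) before counting it.
    return [n // window_size for n in range(num_elements)]

ids = count_tumbling_window_ids(7, 3)
```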
+
+
+class SlidingWindowAssigner(WindowAssigner[object, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into sliding windows based on the current system time of the machine the operation is running on. Windows can possibly overlap.
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(SlidingWindowAssigner(Time.minutes(1), Time.seconds(10), Time.seconds(0), False))
+
+    A WindowAssigner that windows elements into sliding windows based on the timestamp of the elements. Windows can possibly overlap.
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+            >>> data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            >>> .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(SlidingWindowAssigner(Time.minutes(1), Time.seconds(10), Time.seconds(0), True))
+    """
+    def __init__(self, size: Time, slide: Time, offset: Time, is_event_time: bool):
+        self._size = size.to_milliseconds()
+        self._slide = slide.to_milliseconds()
+        self._offset = offset.to_milliseconds()
+        self._is_event_time = is_event_time
+        self._pane_size = math.gcd(self._size, self._slide)
+        self._num_panes_per_window = self._size // self._pane_size
+
+    def assign_windows(
+        self,
+        element: T,
+        timestamp: int,
+        context: 'WindowAssigner.WindowAssignerContext') -> Collection[W]:
+        if self._is_event_time is False:
+            timestamp = context.get_current_processing_time()
+
+        last_start = TimeWindow.get_window_start_with_offset(timestamp, self._offset, self._slide)
+        windows = [TimeWindow(start, start + self._size)
+                   for start in range(last_start, timestamp - self._size, -self._slide)]
+        return windows
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        if self._is_event_time is True:
+            return EventTimeTrigger()
+        else:
+            return ProcessingTimeTrigger()
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return self._is_event_time
+
+    def __repr__(self) -> str:
+        return "SlidingWindowAssigner(%s, %s, %s, %s)" % (self._size, self._slide, self._offset, self._is_event_time)
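
The sliding assignment above enumerates every window containing the timestamp by stepping backwards from the most recent slide boundary. A self-contained sketch of that enumeration (same formula, no Flink types):

```python
def assign_sliding_windows(timestamp: int, size: int, slide: int, offset: int = 0):
    # Enumerate all [start, start + size) windows containing the timestamp,
    # walking backwards one slide at a time from the latest slide boundary.
    last_start = timestamp - (timestamp - offset) % slide
    return [(start, start + size)
            for start in range(last_start, timestamp - size, -slide)]

# size=10, slide=5: each timestamp belongs to size // slide = 2 windows.
windows = assign_sliding_windows(timestamp=25, size=10, slide=5)
```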
+
+
+class CountSlidingWindowAssigner(WindowAssigner[object, CountWindow]):
+    """
+    A WindowAssigner that windows elements into sliding windows based on the count number of
+    the elements. Windows can possibly overlap.
+    """
+    def __init__(self, window_size: int, window_slide: int):
+        """
+        Windows this KeyedStream into sliding count windows.
+        :param window_size: The size of the windows in number of elements.
+        :param window_slide: The slide interval in number of elements.
+        """
+        self._window_size = window_size
+        self._window_slide = window_slide
+        self._count = None  # type: ValueState
+        self._counter_state_descriptor = ReducingStateDescriptor(
+            "slide-count-assigner", lambda a, b: a + b, Types.BIG_INT())
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[W]:
+        count_descriptor = ValueStateDescriptor('slide-count-assigner', Types.LONG())
+        self._count = context.get_runtime_context().get_state(count_descriptor)
+        count_value = self._count.value()
+        if count_value is None:
+            current_count = 0
+        else:
+            current_count = count_value
+        self._count.update(current_count + 1)
+        last_id = current_count // self._window_slide
+        last_start = last_id * self._window_slide
+        last_end = last_start + self._window_size - 1
+        windows = []
+        while last_id >= 0 and last_start <= current_count <= last_end:
+            if last_start <= current_count <= last_end:
+                windows.append(CountWindow(last_id))
+            last_id -= 1
+            last_start -= self._window_slide
+            last_end -= self._window_slide
+        return windows
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        return CountTrigger(self._window_size)
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return CountWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self):
+        return "CountSlidingWindowAssigner(%s, %s)" % (self._window_size, self._window_slide)
+
+
+class SessionWindowAssigner(MergingWindowAssigner[object, TimeWindow]):

Review comment:
       The Java API already supports dynamic session windows and it would be great to align with it. See DynamicEventTimeSessionWindows for more details.
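
   A rough sketch of what such an assigner could look like, mirroring Java's DynamicEventTimeSessionWindows. This is not the PyFlink API; the `session_gap_extractor` callable and the class name are hypothetical, and windows are represented as plain (start, end) tuples instead of TimeWindow:

   ```python
   class DynamicSessionWindowSketch:
       """Assigns each element its own session window whose gap is
       extracted from the element itself; overlapping windows would
       later be collapsed by session-window merging."""

       def __init__(self, session_gap_extractor):
           # session_gap_extractor: element -> gap in milliseconds (> 0)
           self._extract_gap = session_gap_extractor

       def assign_windows(self, element, timestamp):
           gap = self._extract_gap(element)
           if gap <= 0:
               raise ValueError("session gap must be positive, got %d" % gap)
           # Each element starts in its own [timestamp, timestamp + gap) window.
           return [(timestamp, timestamp + gap)]

   # An element carrying a 5000 ms gap arriving at t=1000 gets window [1000, 6000).
   assigner = DynamicSessionWindowSketch(lambda e: e["gap"])
   ```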

##########
File path: flink-python/pyflink/fn_execution/datastream/window/window_assigner.py
##########
@@ -0,0 +1,376 @@
+import math
+from typing import Iterable, Collection
+
+from pyflink.common import TypeSerializer, Time
+from pyflink.common.typeinfo import Types
+from pyflink.datastream import Trigger
+from pyflink.datastream.state import ValueStateDescriptor, ValueState, ReducingStateDescriptor
+from pyflink.datastream.window import TimeWindow, CountWindow, WindowAssigner, T, TimeWindowSerializer, TriggerResult, \
+    CountWindowSerializer, MergingWindowAssigner
+from pyflink.fn_execution.table.window_context import W
+
+
+class EventTimeTrigger(Trigger[object, TimeWindow]):

Review comment:
       Move all these classes to module pyflink/datastream/window.py

##########
File path: flink-python/pyflink/fn_execution/datastream/window/window_assigner.py
##########
@@ -0,0 +1,376 @@
+import math
+from typing import Iterable, Collection
+
+from pyflink.common import TypeSerializer, Time
+from pyflink.common.typeinfo import Types
+from pyflink.datastream import Trigger
+from pyflink.datastream.state import ValueStateDescriptor, ValueState, ReducingStateDescriptor
+from pyflink.datastream.window import TimeWindow, CountWindow, WindowAssigner, T, TimeWindowSerializer, TriggerResult, \
+    CountWindowSerializer, MergingWindowAssigner
+from pyflink.fn_execution.table.window_context import W
+
+
+class EventTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: object,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        windowMaxTimestamp = window.max_timestamp()
+        if windowMaxTimestamp >= ctx.get_current_watermark():
+            ctx.register_event_time_timer(windowMaxTimestamp)
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[object, TimeWindow]):

Review comment:
       ```suggestion
   class ProcessingTimeTrigger(Trigger[T, TimeWindow]):
   ```

##########
File path: flink-python/pyflink/fn_execution/datastream/window/window_assigner.py
##########
@@ -0,0 +1,376 @@
+import math
+from typing import Iterable, Collection
+
+from pyflink.common import TypeSerializer, Time
+from pyflink.common.typeinfo import Types
+from pyflink.datastream import Trigger
+from pyflink.datastream.state import ValueStateDescriptor, ValueState, ReducingStateDescriptor
+from pyflink.datastream.window import TimeWindow, CountWindow, WindowAssigner, T, TimeWindowSerializer, TriggerResult, \
+    CountWindowSerializer, MergingWindowAssigner
+from pyflink.fn_execution.table.window_context import W
+
+
+class EventTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: object,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        windowMaxTimestamp = window.max_timestamp()
+        if windowMaxTimestamp >= ctx.get_current_watermark():
+            ctx.register_event_time_timer(windowMaxTimestamp)
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp());

Review comment:
       ```suggestion
           ctx.register_processing_time_timer(window.max_timestamp())
   ```

##########
File path: flink-python/pyflink/fn_execution/datastream/window/window_assigner.py
##########
@@ -0,0 +1,376 @@
+import math
+from typing import Iterable, Collection
+
+from pyflink.common import TypeSerializer, Time
+from pyflink.common.typeinfo import Types
+from pyflink.datastream import Trigger
+from pyflink.datastream.state import ValueStateDescriptor, ValueState, ReducingStateDescriptor
+from pyflink.datastream.window import TimeWindow, CountWindow, WindowAssigner, T, TimeWindowSerializer, TriggerResult, \
+    CountWindowSerializer, MergingWindowAssigner
+from pyflink.fn_execution.table.window_context import W
+
+
+class EventTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: object,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        windowMaxTimestamp = window.max_timestamp()

Review comment:
       ```suggestion
           window_max_timestamp = window.max_timestamp()
   ```

##########
File path: flink-python/pyflink/fn_execution/datastream/window/window_assigner.py
##########
@@ -0,0 +1,376 @@
+import math
+from typing import Iterable, Collection
+
+from pyflink.common import TypeSerializer, Time
+from pyflink.common.typeinfo import Types
+from pyflink.datastream import Trigger
+from pyflink.datastream.state import ValueStateDescriptor, ValueState, ReducingStateDescriptor
+from pyflink.datastream.window import TimeWindow, CountWindow, WindowAssigner, T, TimeWindowSerializer, TriggerResult, \
+    CountWindowSerializer, MergingWindowAssigner
+from pyflink.fn_execution.table.window_context import W
+
+
+class EventTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: object,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        windowMaxTimestamp = window.max_timestamp()
+        if windowMaxTimestamp >= ctx.get_current_watermark():
+            ctx.register_event_time_timer(windowMaxTimestamp)
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp());
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: W,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        windowMaxTimestamp = window.max_timestamp();
+        if windowMaxTimestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(windowMaxTimestamp)
+
+    def clear(self,
+              window: W,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp());

Review comment:
       ```suggestion
           ctx.delete_processing_time_timer(window.max_timestamp())
   ```

##########
File path: flink-python/pyflink/fn_execution/datastream/window/window_assigner.py
##########
@@ -0,0 +1,376 @@
+import math
+from typing import Iterable, Collection
+
+from pyflink.common import TypeSerializer, Time
+from pyflink.common.typeinfo import Types
+from pyflink.datastream import Trigger
+from pyflink.datastream.state import ValueStateDescriptor, ValueState, ReducingStateDescriptor
+from pyflink.datastream.window import TimeWindow, CountWindow, WindowAssigner, T, TimeWindowSerializer, TriggerResult, \
+    CountWindowSerializer, MergingWindowAssigner
+from pyflink.fn_execution.table.window_context import W
+
+
+class EventTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: object,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        windowMaxTimestamp = window.max_timestamp()
+        if windowMaxTimestamp >= ctx.get_current_watermark():
+            ctx.register_event_time_timer(windowMaxTimestamp)
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp());
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: W,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        windowMaxTimestamp = window.max_timestamp();
+        if windowMaxTimestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(windowMaxTimestamp)
+
+    def clear(self,
+              window: W,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp());
+
+
+class CountTrigger(Trigger[object, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+    def __init__(self, window_size: int):
+        self._window_size = int(window_size)
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.BIG_INT())
+
+    def on_element(self,
+                   element: object,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        state.add(1)
+        if state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted. The window is not purged
+            #               though, all elements are retained.

Review comment:
       ```suggestion
               #               though, all elements are retained.
               state.clear()
   ```
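
   A minimal standalone illustration (plain Python, hypothetical names, the counter standing in for the ReducingState) of why the suggested `state.clear()` matters: without resetting the counter on FIRE, the count stays at or above `window_size` forever, so the trigger fires on every subsequent element instead of once per `window_size` elements:

   ```python
   class ToyCountTrigger:
       def __init__(self, window_size, clear_on_fire):
           self.window_size = window_size
           self.clear_on_fire = clear_on_fire
           self.count = 0  # stands in for the ReducingState counter

       def on_element(self):
           self.count += 1
           if self.count >= self.window_size:
               if self.clear_on_fire:
                   # Reset so the next firing needs window_size new elements.
                   self.count = 0
               return "FIRE"
           return "CONTINUE"

   buggy = ToyCountTrigger(3, clear_on_fire=False)
   fixed = ToyCountTrigger(3, clear_on_fire=True)
   buggy_results = [buggy.on_element() for _ in range(6)]
   fixed_results = [fixed.on_element() for _ in range(6)]
   # buggy fires on elements 3, 4, 5 and 6; fixed fires only on 3 and 6
   ```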

##########
File path: flink-python/pyflink/fn_execution/datastream/window/window_assigner.py
##########
@@ -0,0 +1,376 @@
+import math
+from typing import Iterable, Collection
+
+from pyflink.common import TypeSerializer, Time
+from pyflink.common.typeinfo import Types
+from pyflink.datastream import Trigger
+from pyflink.datastream.state import ValueStateDescriptor, ValueState, ReducingStateDescriptor
+from pyflink.datastream.window import TimeWindow, CountWindow, WindowAssigner, T, TimeWindowSerializer, TriggerResult, \
+    CountWindowSerializer, MergingWindowAssigner
+from pyflink.fn_execution.table.window_context import W
+
+
+class EventTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: object,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        windowMaxTimestamp = window.max_timestamp()
+        if windowMaxTimestamp >= ctx.get_current_watermark():
+            ctx.register_event_time_timer(windowMaxTimestamp)
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp());
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: W,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        windowMaxTimestamp = window.max_timestamp();
+        if windowMaxTimestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(windowMaxTimestamp)
+
+    def clear(self,
+              window: W,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp());
+
+
+class CountTrigger(Trigger[object, CountWindow]):

Review comment:
       ```suggestion
   class CountTrigger(Trigger[T, CountWindow]):
   ```

##########
File path: flink-python/pyflink/fn_execution/datastream/window/window_assigner.py
##########
@@ -0,0 +1,376 @@
+import math
+from typing import Iterable, Collection
+
+from pyflink.common import TypeSerializer, Time
+from pyflink.common.typeinfo import Types
+from pyflink.datastream import Trigger
+from pyflink.datastream.state import ValueStateDescriptor, ValueState, ReducingStateDescriptor
+from pyflink.datastream.window import TimeWindow, CountWindow, WindowAssigner, T, TimeWindowSerializer, TriggerResult, \
+    CountWindowSerializer, MergingWindowAssigner
+from pyflink.fn_execution.table.window_context import W
+
+
+class EventTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: object,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        windowMaxTimestamp = window.max_timestamp()
+        if windowMaxTimestamp >= ctx.get_current_watermark():
+            ctx.register_event_time_timer(windowMaxTimestamp)
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp());
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: W,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        windowMaxTimestamp = window.max_timestamp();
+        if windowMaxTimestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(windowMaxTimestamp)
+
+    def clear(self,
+              window: W,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp());
+
+
+class CountTrigger(Trigger[object, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+    def __init__(self, window_size: int):
+        self._window_size = int(window_size)
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.BIG_INT())
+
+    def on_element(self,
+                   element: object,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        state.add(1)
+        if state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted. The window is not purged
+            #               though, all elements are retained.
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
+
+class TumblingWindowAssigner(WindowAssigner[object, TimeWindow]):

Review comment:
       ```suggestion
   class TumblingWindowAssigner(WindowAssigner[T, TimeWindow]):
   ```

##########
File path: flink-python/pyflink/fn_execution/datastream/window/window_assigner.py
##########
@@ -0,0 +1,376 @@
+import math
+from typing import Iterable, Collection
+
+from pyflink.common import TypeSerializer, Time
+from pyflink.common.typeinfo import Types
+from pyflink.datastream import Trigger
+from pyflink.datastream.state import ValueStateDescriptor, ValueState, ReducingStateDescriptor
+from pyflink.datastream.window import TimeWindow, CountWindow, WindowAssigner, T, TimeWindowSerializer, TriggerResult, \
+    CountWindowSerializer, MergingWindowAssigner
+from pyflink.fn_execution.table.window_context import W
+
+
+class EventTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: object,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        windowMaxTimestamp = window.max_timestamp()
+        if windowMaxTimestamp >= ctx.get_current_watermark():
+            ctx.register_event_time_timer(windowMaxTimestamp)
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp());
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: W,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        windowMaxTimestamp = window.max_timestamp();
+        if windowMaxTimestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(windowMaxTimestamp)
+
+    def clear(self,
+              window: W,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp());
+
+
+class CountTrigger(Trigger[object, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+    def __init__(self, window_size: int):
+        self._window_size = int(window_size)
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.BIG_INT())
+
+    def on_element(self,
+                   element: object,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        state.add(1)
+        if state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted. The window is not purged
+            #               though, all elements are retained.
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:

Review comment:
       Need to override the `can_merge` method as well.
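
       For reference, a minimal sketch of what such an override looks like. The `Trigger` base class here is a stand-in with an assumed shape (mirroring the Java API, where `canMerge()` defaults to `false`), not the actual pyflink class:

       ```python
       class Trigger:
           """Minimal stand-in for the trigger base class (assumed shape)."""

           def can_merge(self) -> bool:
               # By default a trigger cannot be used with a merging
               # window assigner (e.g. session windows).
               return False


       class CountTrigger(Trigger):
           def can_merge(self) -> bool:
               # The reducing count state is mergeable, so this trigger
               # opts in to merging window assigners.
               return True
       ```

       Without the override, using the trigger with a `MergingWindowAssigner` would be rejected at runtime.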

##########
File path: flink-python/pyflink/fn_execution/datastream/window/window_assigner.py
##########
@@ -0,0 +1,376 @@
+import math
+from typing import Iterable, Collection
+
+from pyflink.common import TypeSerializer, Time
+from pyflink.common.typeinfo import Types
+from pyflink.datastream import Trigger
+from pyflink.datastream.state import ValueStateDescriptor, ValueState, ReducingStateDescriptor
+from pyflink.datastream.window import TimeWindow, CountWindow, WindowAssigner, T, TimeWindowSerializer, TriggerResult, \
+    CountWindowSerializer, MergingWindowAssigner
+from pyflink.fn_execution.table.window_context import W
+
+
+class EventTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: object,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        windowMaxTimestamp = window.max_timestamp()
+        if windowMaxTimestamp >= ctx.get_current_watermark():
+            ctx.register_event_time_timer(windowMaxTimestamp)
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp());
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: W,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        windowMaxTimestamp = window.max_timestamp();

Review comment:
    ```suggestion
        window_max_timestamp = window.max_timestamp()
    ```

##########
File path: flink-python/pyflink/fn_execution/datastream/window/window_assigner.py
##########
@@ -0,0 +1,376 @@
+import math
+from typing import Iterable, Collection
+
+from pyflink.common import TypeSerializer, Time
+from pyflink.common.typeinfo import Types
+from pyflink.datastream import Trigger
+from pyflink.datastream.state import ValueStateDescriptor, ValueState, ReducingStateDescriptor
+from pyflink.datastream.window import TimeWindow, CountWindow, WindowAssigner, T, TimeWindowSerializer, TriggerResult, \
+    CountWindowSerializer, MergingWindowAssigner
+from pyflink.fn_execution.table.window_context import W
+
+
+class EventTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: object,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        windowMaxTimestamp = window.max_timestamp()
+        if windowMaxTimestamp >= ctx.get_current_watermark():
+            ctx.register_event_time_timer(windowMaxTimestamp)
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp());
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: W,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        windowMaxTimestamp = window.max_timestamp();
+        if windowMaxTimestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(windowMaxTimestamp)
+
+    def clear(self,
+              window: W,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp());
+
+
+class CountTrigger(Trigger[object, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+    def __init__(self, window_size: int):
+        self._window_size = int(window_size)
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.BIG_INT())

Review comment:
    ```suggestion
            "count", lambda a, b: a + b, Types.LONG())
    ```
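
    Independent of the state name and type, the reduce function handed to the descriptor is a plain binary sum: every `state.add(1)` folds one more element into the running count. Its semantics can be checked standalone:

    ```python
    from functools import reduce

    def count(a: int, b: int) -> int:
        # The reduce function given to the ReducingStateDescriptor:
        # folds each state.add(1) into the running element count.
        return a + b

    # Simulate five elements arriving in the same pane.
    total = reduce(count, [1] * 5)
    ```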

##########
File path: flink-python/pyflink/fn_execution/datastream/window/window_assigner.py
##########
@@ -0,0 +1,376 @@
+import math
+from typing import Iterable, Collection
+
+from pyflink.common import TypeSerializer, Time
+from pyflink.common.typeinfo import Types
+from pyflink.datastream import Trigger
+from pyflink.datastream.state import ValueStateDescriptor, ValueState, ReducingStateDescriptor
+from pyflink.datastream.window import TimeWindow, CountWindow, WindowAssigner, T, TimeWindowSerializer, TriggerResult, \
+    CountWindowSerializer, MergingWindowAssigner
+from pyflink.fn_execution.table.window_context import W
+
+
+class EventTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: object,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        windowMaxTimestamp = window.max_timestamp()
+        if windowMaxTimestamp >= ctx.get_current_watermark():
+            ctx.register_event_time_timer(windowMaxTimestamp)
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp());
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: W,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        windowMaxTimestamp = window.max_timestamp();
+        if windowMaxTimestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(windowMaxTimestamp)
+
+    def clear(self,
+              window: W,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp());
+
+
+class CountTrigger(Trigger[object, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+    def __init__(self, window_size: int):
+        self._window_size = int(window_size)
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.BIG_INT())
+
+    def on_element(self,
+                   element: object,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        state.add(1)
+        if state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted. The window is not purged
+            #               though, all elements are retained.
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
+
+class TumblingWindowAssigner(WindowAssigner[object, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the current system time of the machine the operation is running on. Windows cannot overlap.

Review comment:
       Need to fix the checkstyle issue (the docstring line exceeds the maximum line length). Please also fix the other places.
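
       Assuming the flagged issue is the over-long docstring line, the fix is simply to wrap it; a quick self-check of the wrapped text (the 100-character limit is an assumption about the project's flake8 configuration):

       ```python
       import textwrap

       doc = ("A WindowAssigner that windows elements into windows based on the "
              "current system time of the machine the operation is running on. "
              "Windows cannot overlap.")

       # Re-wrap the docstring so every line fits the assumed
       # 100-character line-length limit.
       wrapped = textwrap.wrap(doc, width=100)
       ```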

##########
File path: flink-python/pyflink/fn_execution/datastream/window/window_assigner.py
##########
@@ -0,0 +1,376 @@
+import math
+from typing import Iterable, Collection
+
+from pyflink.common import TypeSerializer, Time
+from pyflink.common.typeinfo import Types
+from pyflink.datastream import Trigger
+from pyflink.datastream.state import ValueStateDescriptor, ValueState, ReducingStateDescriptor
+from pyflink.datastream.window import TimeWindow, CountWindow, WindowAssigner, T, TimeWindowSerializer, TriggerResult, \
+    CountWindowSerializer, MergingWindowAssigner
+from pyflink.fn_execution.table.window_context import W
+
+
+class EventTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: object,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        windowMaxTimestamp = window.max_timestamp()
+        if windowMaxTimestamp >= ctx.get_current_watermark():
+            ctx.register_event_time_timer(windowMaxTimestamp)
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp());
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: W,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        windowMaxTimestamp = window.max_timestamp();
+        if windowMaxTimestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(windowMaxTimestamp)
+
+    def clear(self,
+              window: W,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp());
+
+
+class CountTrigger(Trigger[object, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+    def __init__(self, window_size: int):
+        self._window_size = int(window_size)
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.BIG_INT())
+
+    def on_element(self,
+                   element: object,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        state.add(1)
+        if state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted. The window is not purged
+            #               though, all elements are retained.
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
+
+class TumblingWindowAssigner(WindowAssigner[object, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the current system time of the machine the operation is running on. Windows cannot overlap.
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(Time.minutes(1), Time.seconds(10), False))
+
+    A WindowAssigner that windows elements into windows based on the timestamp of the elements. Windows cannot overlap.
+    For example, in order to window into windows of 1 minute:
+     ::
+            >>> data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            >>> .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(TumblingWindowAssigner(Time.minutes(1), Time.seconds(0), True))
+    """
+    def __init__(self, size: Time, offset: Time, is_event_time: bool):
+        self._size = size.to_milliseconds()
+        self._offset = offset.to_milliseconds()
+        self._is_event_time = is_event_time
+
+    def assign_windows(self,
+                       element: object,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        if self._is_event_time is False:
+            timestamp = context.get_current_processing_time()
+
+        start = TimeWindow.get_window_start_with_offset(timestamp, self._offset, self._size)
+        return [TimeWindow(start, start + self._size)]
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        if self._is_event_time is True:
+            return EventTimeTrigger()
+        else:
+            return ProcessingTimeTrigger()
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return self._is_event_time
+
+    def __repr__(self):
+        return "TumblingWindowAssigner(%s,%s,%s)" % (self._size, self._offset, self.is_event_time)
+
+'''
+A WindowAssigner that windows elements into windows based on the number of the elements. Windows cannot overlap.
+'''
+class CountTumblingWindowAssigner(WindowAssigner[object, CountWindow]):
+    """
+    A WindowAssigner that windows elements into fixed-size windows based on the count number
+    of the elements. Windows cannot overlap.
+    """
+    def __init__(self, window_size: int):
+        """
+        Windows this KeyedStream into tumbling count windows.
+        :param window_size: The size of the windows in number of elements.
+        """
+        self._window_size = window_size
+        self._counter_state_descriptor = ReducingStateDescriptor(
+            "assigner_counter", lambda a, b: a + b, Types.BIG_INT())
+
+    def assign_windows(self,
+                       element: T,
+                       timestamp: int,
+                       context: 'WindowAssigner.WindowAssignerContext') -> Collection[W]:
+        counter = context.get_runtime_context().get_reducing_state(
+            self._counter_state_descriptor)
+        if counter.get() is None:
+            counter.add(0)
+        result = [CountWindow(counter.get() // self._window_size)]
+        counter.add(1)
+        return result
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        return CountTrigger(self._window_size)
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return CountWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return False
+
+    def __repr__(self) -> str:
+        return "CountTumblingWindowAssigner(%s)" % (self._window_size)
+
+
+class SlidingWindowAssigner(WindowAssigner[object, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into sliding windows based on the current system time of the machine the operation is running on. Windows can possibly overlap.
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(SlidingWindowAssigner(Time.minutes(1), Time.seconds(10), Time.seconds(0), False))
+
+    A WindowAssigner that windows elements into sliding windows based on the timestamp of the elements. Windows can possibly overlap.
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+            >>> data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            >>> .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            >>> .window(SlidingWindowAssigner(Time.minutes(1), Time.seconds(10), Time.seconds(0), True))
+    """
+    def __init__(self, size: Time, slide: Time, offset: Time, is_event_time: bool):

Review comment:
    ```suggestion
        def __init__(self, size: int, slide: int, offset: int, is_event_time: bool):
    ```
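
    Whichever parameter types end up being used, the assignment math is the same: an element belongs to every sliding window whose start lies within `size` of its timestamp, i.e. `size // slide` overlapping windows. A pure-Python sketch of that loop (the helper mirrors what `TimeWindow.get_window_start_with_offset` computes; all values in milliseconds):

    ```python
    def get_window_start_with_offset(timestamp: int, offset: int, window_size: int) -> int:
        # Start of the most recent window (or slide) containing the timestamp.
        return timestamp - (timestamp - offset) % window_size

    def assign_sliding_windows(timestamp: int, size: int, slide: int, offset: int = 0):
        # Walk the window starts backwards from the latest one that
        # still covers the element's timestamp.
        last_start = get_window_start_with_offset(timestamp, offset, slide)
        starts = []
        start = last_start
        while start > timestamp - size:
            starts.append(start)
            start -= slide
        return [(s, s + size) for s in starts]
    ```

    For a 1-minute window sliding every 10 seconds, each element lands in six windows.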

##########
File path: flink-python/pyflink/fn_execution/datastream/window/window_assigner.py
##########
@@ -0,0 +1,376 @@
+import math
+from typing import Iterable, Collection
+
+from pyflink.common import TypeSerializer, Time
+from pyflink.common.typeinfo import Types
+from pyflink.datastream import Trigger
+from pyflink.datastream.state import ValueStateDescriptor, ValueState, ReducingStateDescriptor
+from pyflink.datastream.window import TimeWindow, CountWindow, WindowAssigner, T, TimeWindowSerializer, TriggerResult, \
+    CountWindowSerializer, MergingWindowAssigner
+from pyflink.fn_execution.table.window_context import W
+
+
+class EventTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: object,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        windowMaxTimestamp = window.max_timestamp()
+        if windowMaxTimestamp >= ctx.get_current_watermark():
+            ctx.register_event_time_timer(windowMaxTimestamp)
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp());
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: W,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        windowMaxTimestamp = window.max_timestamp();
+        if windowMaxTimestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(windowMaxTimestamp)
+
+    def clear(self,
+              window: W,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp());
+
+
+class CountTrigger(Trigger[object, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+    def __init__(self, window_size: int):
+        self._window_size = int(window_size)
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.BIG_INT())
+
+    def on_element(self,
+                   element: object,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        state.add(1)
+        if state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted. The window is not purged
+            #               though, all elements are retained.
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
+
+class TumblingWindowAssigner(WindowAssigner[object, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the current system time of
+    the machine the operation is running on. Windows cannot overlap.
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            ...     .window(TumblingWindowAssigner(Time.minutes(1), Time.seconds(10), False))
+
+    A WindowAssigner that windows elements into windows based on the timestamp of the
+    elements. Windows cannot overlap.
+    For example, in order to window into windows of 1 minute:
+    ::
+            >>> data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            ...     .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            ...     .window(TumblingWindowAssigner(Time.minutes(1), Time.seconds(0), True))
+    """
+    def __init__(self, size: Time, offset: Time, is_event_time: bool):
+        self._size = size.to_milliseconds()
+        self._offset = offset.to_milliseconds()
+        self._is_event_time = is_event_time
+
+    def assign_windows(self,
+                       element: object,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        if not self._is_event_time:
+            timestamp = context.get_current_processing_time()
+
+        start = TimeWindow.get_window_start_with_offset(timestamp, self._offset, self._size)
+        return [TimeWindow(start, start + self._size)]
+
+    def get_default_trigger(self, env) -> Trigger[T, W]:
+        if self._is_event_time:
+            return EventTimeTrigger()
+        else:
+            return ProcessingTimeTrigger()
+
+    def get_window_serializer(self) -> TypeSerializer[W]:
+        return TimeWindowSerializer()
+
+    def is_event_time(self) -> bool:
+        return self._is_event_time
+
+    def __repr__(self):
+        return "TumblingWindowAssigner(%s, %s, %s)" % (self._size, self._offset, self._is_event_time)
+
+class CountTumblingWindowAssigner(WindowAssigner[object, CountWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the number of elements.
+    Windows cannot overlap.
+    """

Review comment:
       The Java API uses GlobalWindows for count windows. Is it possible to do the same in Python?
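For context, the Java `CountTumblingWindowAssigner` places every element into a single global window and lets a count trigger fire and purge it every N elements. The firing logic the reviewer alludes to can be sketched as plain Python, independent of pyflink (the class and method names below are illustrative stand-ins, not the actual pyflink API; the per-key dict stands in for Flink's partitioned `ReducingState`):

```python
# Minimal model of count-tumbling semantics on a single global window:
# a per-key counter fires (and resets) every `window_size` elements.
# Illustration of the GlobalWindows approach, not pyflink code.

class GlobalCountTriggerModel:
    def __init__(self, window_size: int):
        self.window_size = window_size
        self._counts = {}  # stand-in for Flink's partitioned ReducingState

    def on_element(self, key) -> str:
        count = self._counts.get(key, 0) + 1
        if count >= self.window_size:
            self._counts[key] = 0      # purge: reset the counter state
            return "FIRE_AND_PURGE"    # emit the window, drop its elements
        self._counts[key] = count
        return "CONTINUE"


trigger = GlobalCountTriggerModel(window_size=3)
results = [trigger.on_element("a") for _ in range(4)]
# results == ["CONTINUE", "CONTINUE", "FIRE_AND_PURGE", "CONTINUE"]
```

Because the window never closes on its own, the count trigger must return FIRE_AND_PURGE (not just FIRE) so that state does not grow unboundedly.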




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] Vancior commented on a change in pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
Vancior commented on a change in pull request #18957:
URL: https://github.com/apache/flink/pull/18957#discussion_r819330802



##########
File path: flink-python/pyflink/fn_execution/datastream/window/window_assigner.py
##########
@@ -0,0 +1,376 @@
+import math
+from typing import Iterable, Collection
+
+from pyflink.common import TypeSerializer, Time
+from pyflink.common.typeinfo import Types
+from pyflink.datastream import Trigger
+from pyflink.datastream.state import ValueStateDescriptor, ValueState, ReducingStateDescriptor
+from pyflink.datastream.window import TimeWindow, CountWindow, WindowAssigner, T, TimeWindowSerializer, TriggerResult, \
+    CountWindowSerializer, MergingWindowAssigner
+from pyflink.fn_execution.table.window_context import W
+
+
+class EventTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the watermark passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: object,
+                   timestamp: int,
+                   window: TimeWindow,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if window.max_timestamp() <= ctx.get_current_watermark():
+            return TriggerResult.FIRE
+        else:
+            ctx.register_event_time_timer(window.max_timestamp())
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: TimeWindow,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: TimeWindow,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        if time == window.max_timestamp():
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: TimeWindow,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp >= ctx.get_current_watermark():
+            ctx.register_event_time_timer(window_max_timestamp)
+
+    def clear(self,
+              window: TimeWindow,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_event_time_timer(window.max_timestamp())
+
+
+class ProcessingTimeTrigger(Trigger[object, TimeWindow]):
+    """
+    A Trigger that fires once the current system time passes the end of the window to which a pane belongs.
+    """
+    def on_element(self,
+                   element: T,
+                   timestamp: int,
+                   window: W,
+                   ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        ctx.register_processing_time_timer(window.max_timestamp())
+        return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: W,
+                           ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.FIRE
+
+    def on_event_time(self,
+                      time: int,
+                      window: W,
+                      ctx: 'Trigger.TriggerContext') -> TriggerResult:
+        return TriggerResult.CONTINUE
+
+    def on_merge(self,
+                 window: W,
+                 ctx: 'Trigger.OnMergeContext') -> None:
+        window_max_timestamp = window.max_timestamp()
+        if window_max_timestamp > ctx.get_current_processing_time():
+            ctx.register_processing_time_timer(window_max_timestamp)
+
+    def clear(self,
+              window: W,
+              ctx: 'Trigger.TriggerContext') -> None:
+        ctx.delete_processing_time_timer(window.max_timestamp())
+
+
+class CountTrigger(Trigger[object, CountWindow]):
+    """
+    A Trigger that fires once the count of elements in a pane reaches the given count.
+    """
+    def __init__(self, window_size: int):
+        self._window_size = int(window_size)
+        self._count_state_descriptor = ReducingStateDescriptor(
+            "trigger_counter", lambda a, b: a + b, Types.BIG_INT())
+
+    def on_element(self,
+                   element: object,
+                   timestamp: int,
+                   window: CountWindow,
+                   ctx: Trigger.TriggerContext) -> TriggerResult:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)  # type: ReducingState
+        state.add(1)
+        if state.get() >= self._window_size:
+            # On FIRE, the window is evaluated and results are emitted. The window is not
+            # purged, though; all elements are retained.
+            return TriggerResult.FIRE
+        else:
+            # No action is taken on the window.
+            return TriggerResult.CONTINUE
+
+    def on_processing_time(self,
+                           time: int,
+                           window: CountWindow,
+                           ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_event_time(self,
+                      time: int,
+                      window: CountWindow,
+                      ctx: Trigger.TriggerContext) -> TriggerResult:
+        # No action is taken on the window.
+        return TriggerResult.CONTINUE
+
+    def on_merge(self, window: CountWindow, ctx: Trigger.OnMergeContext) -> None:
+        ctx.merge_partitioned_state(self._count_state_descriptor)
+
+    def clear(self, window: CountWindow, ctx: Trigger.TriggerContext) -> None:
+        state = ctx.get_partitioned_state(self._count_state_descriptor)
+        state.clear()
+
+
+class TumblingWindowAssigner(WindowAssigner[object, TimeWindow]):
+    """
+    A WindowAssigner that windows elements into windows based on the current system time of
+    the machine the operation is running on. Windows cannot overlap.
+    For example, in order to window into windows of 1 minute, every 10 seconds:
+    ::
+            >>> data_stream.key_by(lambda x: x[0], key_type=Types.STRING()) \
+            ...     .window(TumblingWindowAssigner(Time.minutes(1), Time.seconds(10), False))
+
+    A WindowAssigner that windows elements into windows based on the timestamp of the
+    elements. Windows cannot overlap.
+    For example, in order to window into windows of 1 minute:
+    ::
+            >>> data_stream.assign_timestamps_and_watermarks(watermark_strategy) \
+            ...     .key_by(lambda x: x[0], key_type=Types.STRING()) \
+            ...     .window(TumblingWindowAssigner(Time.minutes(1), Time.seconds(0), True))
+    """
+    def __init__(self, size: Time, offset: Time, is_event_time: bool):
+        self._size = size.to_milliseconds()
+        self._offset = offset.to_milliseconds()
+        self._is_event_time = is_event_time
+
+    def assign_windows(self,
+                       element: object,
+                       timestamp: int,
+                       context: WindowAssigner.WindowAssignerContext) -> Collection[TimeWindow]:
+        if not self._is_event_time:
+            timestamp = context.get_current_processing_time()
+

Review comment:
       Check `timestamp > Long.MIN_VALUE` in event-time mode, as the Java implementation does; otherwise there will be a struct packing error when no watermark strategy is set on the data stream.
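The suggested guard can be sketched in plain Python (Java's `Long.MIN_VALUE` is `-2**63`, used by Flink as the "no timestamp" marker; the function names and error message here are illustrative, not the actual pyflink API, and the start formula mirrors Flink's documented tumbling-window arithmetic):

```python
# Illustrative sketch of the suggested event-time guard plus the
# tumbling-window start arithmetic; names are placeholders.

MIN_LONG = -(2 ** 63)  # Java Long.MIN_VALUE, the "no timestamp" marker


def get_window_start_with_offset(timestamp: int, offset: int, size: int) -> int:
    # Flink's tumbling-window start formula (handles offsets).
    return timestamp - (timestamp - offset) % size


def assign_tumbling_window(timestamp: int, offset: int, size: int,
                           is_event_time: bool) -> tuple:
    if is_event_time and timestamp <= MIN_LONG:
        # Mirrors the Java check: without a watermark strategy the record
        # carries Long.MIN_VALUE and must be rejected explicitly.
        raise ValueError(
            "Record has Long.MIN_VALUE timestamp; "
            "is a TimestampAssigner/WatermarkStrategy set?")
    start = get_window_start_with_offset(timestamp, offset, size)
    return (start, start + size)


# A 60s window, no offset: timestamp 125_000 ms falls in [120_000, 180_000).
print(assign_tumbling_window(125_000, 0, 60_000, True))
```

Raising eagerly here turns a confusing downstream serialization failure into a clear error at assignment time.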







[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * bb918383e68f2c6539ceab99853972a73e9fec0c Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a7cdfbf43f34e7bf4de81ad0b1a48edb6bdff7da",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "a7cdfbf43f34e7bf4de81ad0b1a48edb6bdff7da",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * a4a5fc419dbf82c079a24327846d4a7baa500959 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531) 
   * a7cdfbf43f34e7bf4de81ad0b1a48edb6bdff7da UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 610f23f625a4559bef6f000916a2de37a5cb3b38 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666) 
   * ca91f03a2674f7041d3a66c618f833809c6e98aa UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 14ae6d15c25f454fe7675bf7b0e129834b99ae37 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708) 
   * a36182858952a61ef43ccbdc184440354bb86110 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a35548c5dfffa717fd95c7a129208680b05b1d39",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "a35548c5dfffa717fd95c7a129208680b05b1d39",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * a36182858952a61ef43ccbdc184440354bb86110 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721) 
   * a35548c5dfffa717fd95c7a129208680b05b1d39 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32897",
       "triggerID" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "678546804066c142c23cdcc36e88dc442210282a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900",
       "triggerID" : "678546804066c142c23cdcc36e88dc442210282a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32908",
       "triggerID" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 4cf2279af74f89d0f9f8b446ccd8b72bf57495cb Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32908) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "09f86dead211bf78b463611eef76e61349dd505f",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "09f86dead211bf78b463611eef76e61349dd505f",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810) 
   * 09f86dead211bf78b463611eef76e61349dd505f UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32897",
       "triggerID" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "678546804066c142c23cdcc36e88dc442210282a",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "678546804066c142c23cdcc36e88dc442210282a",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810) 
   * ce5259932767032a24e15f5475f31ff336d40a48 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32897) 
   * 678546804066c142c23cdcc36e88dc442210282a UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32897",
       "triggerID" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "678546804066c142c23cdcc36e88dc442210282a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900",
       "triggerID" : "678546804066c142c23cdcc36e88dc442210282a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32908",
       "triggerID" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32914",
       "triggerID" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32946",
       "triggerID" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * ff0bcbd841960c3e524a62c3c45f426707fb5306 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32946) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32897",
       "triggerID" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "678546804066c142c23cdcc36e88dc442210282a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900",
       "triggerID" : "678546804066c142c23cdcc36e88dc442210282a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32908",
       "triggerID" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32914",
       "triggerID" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32946",
       "triggerID" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "triggerType" : "PUSH"
     }, {
       "hash" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32947",
       "triggerID" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "195bf4369bf1eb0cdc1727847e63ce0d6fd31f66",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "195bf4369bf1eb0cdc1727847e63ce0d6fd31f66",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 84846361d6db21a5ac721494ebb5badf23b274a1 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32947) 
   * 195bf4369bf1eb0cdc1727847e63ce0d6fd31f66 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32897",
       "triggerID" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "678546804066c142c23cdcc36e88dc442210282a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900",
       "triggerID" : "678546804066c142c23cdcc36e88dc442210282a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32908",
       "triggerID" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32914",
       "triggerID" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32946",
       "triggerID" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "triggerType" : "PUSH"
     }, {
       "hash" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * ff0bcbd841960c3e524a62c3c45f426707fb5306 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32946) 
   * 84846361d6db21a5ac721494ebb5badf23b274a1 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>








[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32897",
       "triggerID" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "678546804066c142c23cdcc36e88dc442210282a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900",
       "triggerID" : "678546804066c142c23cdcc36e88dc442210282a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32908",
       "triggerID" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32914",
       "triggerID" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32946",
       "triggerID" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "triggerType" : "PUSH"
     }, {
       "hash" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32947",
       "triggerID" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "195bf4369bf1eb0cdc1727847e63ce0d6fd31f66",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "195bf4369bf1eb0cdc1727847e63ce0d6fd31f66",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 84846361d6db21a5ac721494ebb5badf23b274a1 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32947) 
   * 195bf4369bf1eb0cdc1727847e63ce0d6fd31f66 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 8e46affb05226eeed6b8eb20df971445139654a8 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535) 
   * 610f23f625a4559bef6f000916a2de37a5cb3b38 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * ca91f03a2674f7041d3a66c618f833809c6e98aa Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675) 
   * 47cdf91d5086b292f20d40700c7bd9052e32a6e0 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 47cdf91d5086b292f20d40700c7bd9052e32a6e0 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680) 
   





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 8e46affb05226eeed6b8eb20df971445139654a8 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535) 
   





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * a36182858952a61ef43ccbdc184440354bb86110 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721) 
   * d9ffed1defa7514c27d3ba9a9e3008de8b84bc68 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805) 
   





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 8e46affb05226eeed6b8eb20df971445139654a8 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535) 
   * 610f23f625a4559bef6f000916a2de37a5cb3b38 UNKNOWN
   





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 4cf2279af74f89d0f9f8b446ccd8b72bf57495cb Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32908) 
   * d178dd422b702f51ee85b41e43ec4b306c60aa8c UNKNOWN
   





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810) 
   * 8aeb53f79b28bbaf6009c6683f4b9856f1e406f9 UNKNOWN
   





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 4cf2279af74f89d0f9f8b446ccd8b72bf57495cb Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32908) 
   * d178dd422b702f51ee85b41e43ec4b306c60aa8c UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * d178dd422b702f51ee85b41e43ec4b306c60aa8c Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32914) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * ff0bcbd841960c3e524a62c3c45f426707fb5306 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32946) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 84846361d6db21a5ac721494ebb5badf23b274a1 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32947) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * 8fea20f61b27f7b803c12798ad85dbe97e05dcdf Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405) 
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * bb918383e68f2c6539ceab99853972a73e9fec0c UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * a4a5fc419dbf82c079a24327846d4a7baa500959 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531) 
   * 8e46affb05226eeed6b8eb20df971445139654a8 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * bb918383e68f2c6539ceab99853972a73e9fec0c Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * bb918383e68f2c6539ceab99853972a73e9fec0c Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423) 
   * a4a5fc419dbf82c079a24327846d4a7baa500959 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * a4a5fc419dbf82c079a24327846d4a7baa500959 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>








[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "04fa43dece0bf52af32a6d4252d7ca7fcecfd785",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "04fa43dece0bf52af32a6d4252d7ca7fcecfd785",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * a36182858952a61ef43ccbdc184440354bb86110 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721) 
   * 04fa43dece0bf52af32a6d4252d7ca7fcecfd785 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "252376e8359e4807ed1408b257d4c7b14647ee0a",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "252376e8359e4807ed1408b257d4c7b14647ee0a",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * a36182858952a61ef43ccbdc184440354bb86110 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721) 
   * 252376e8359e4807ed1408b257d4c7b14647ee0a UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * d9ffed1defa7514c27d3ba9a9e3008de8b84bc68 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 8e46affb05226eeed6b8eb20df971445139654a8 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535) 
   * 610f23f625a4559bef6f000916a2de37a5cb3b38 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666) 
   * ca91f03a2674f7041d3a66c618f833809c6e98aa UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 1240d4313817b4ae0bfca21090eb6f1cbd40f78e Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684) 
   * 4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 14ae6d15c25f454fe7675bf7b0e129834b99ae37 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708) 
   * a36182858952a61ef43ccbdc184440354bb86110 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 47cdf91d5086b292f20d40700c7bd9052e32a6e0 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680) 
   * 1240d4313817b4ae0bfca21090eb6f1cbd40f78e UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 47cdf91d5086b292f20d40700c7bd9052e32a6e0 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 1240d4313817b4ae0bfca21090eb6f1cbd40f78e Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 8e46affb05226eeed6b8eb20df971445139654a8 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535) 
   * 610f23f625a4559bef6f000916a2de37a5cb3b38 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 610f23f625a4559bef6f000916a2de37a5cb3b38 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 1240d4313817b4ae0bfca21090eb6f1cbd40f78e Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * a36182858952a61ef43ccbdc184440354bb86110 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32897",
       "triggerID" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "678546804066c142c23cdcc36e88dc442210282a",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900",
       "triggerID" : "678546804066c142c23cdcc36e88dc442210282a",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 678546804066c142c23cdcc36e88dc442210282a Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * d178dd422b702f51ee85b41e43ec4b306c60aa8c Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32914) 
   * ff0bcbd841960c3e524a62c3c45f426707fb5306 UNKNOWN
   





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * d178dd422b702f51ee85b41e43ec4b306c60aa8c Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32914) 
   





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 84846361d6db21a5ac721494ebb5badf23b274a1 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32947) 
   





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 195bf4369bf1eb0cdc1727847e63ce0d6fd31f66 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32952) 
   





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * a36182858952a61ef43ccbdc184440354bb86110 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721) 
   * d9ffed1defa7514c27d3ba9a9e3008de8b84bc68 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805) 
   * 0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * d178dd422b702f51ee85b41e43ec4b306c60aa8c Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32914) 
   * ff0bcbd841960c3e524a62c3c45f426707fb5306 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 84846361d6db21a5ac721494ebb5badf23b274a1 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32947) 
   * 195bf4369bf1eb0cdc1727847e63ce0d6fd31f66 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 84846361d6db21a5ac721494ebb5badf23b274a1 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32947) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>








[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810) 
   * ce5259932767032a24e15f5475f31ff336d40a48 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32897",
       "triggerID" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "678546804066c142c23cdcc36e88dc442210282a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900",
       "triggerID" : "678546804066c142c23cdcc36e88dc442210282a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32908",
       "triggerID" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32914",
       "triggerID" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32946",
       "triggerID" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "triggerType" : "PUSH"
     }, {
       "hash" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32947",
       "triggerID" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 84846361d6db21a5ac721494ebb5badf23b274a1 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32947) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32897",
       "triggerID" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "678546804066c142c23cdcc36e88dc442210282a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900",
       "triggerID" : "678546804066c142c23cdcc36e88dc442210282a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32908",
       "triggerID" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32914",
       "triggerID" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32946",
       "triggerID" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "triggerType" : "PUSH"
     }, {
       "hash" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32947",
       "triggerID" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "195bf4369bf1eb0cdc1727847e63ce0d6fd31f66",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32952",
       "triggerID" : "195bf4369bf1eb0cdc1727847e63ce0d6fd31f66",
       "triggerType" : "PUSH"
     }, {
       "hash" : "894eb800e4045264cec298908892659466e78b7b",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "894eb800e4045264cec298908892659466e78b7b",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 195bf4369bf1eb0cdc1727847e63ce0d6fd31f66 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32952) 
   * 894eb800e4045264cec298908892659466e78b7b UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * a4a5fc419dbf82c079a24327846d4a7baa500959 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * a4a5fc419dbf82c079a24327846d4a7baa500959 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531) 
   * 8e46affb05226eeed6b8eb20df971445139654a8 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>








[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 8fea20f61b27f7b803c12798ad85dbe97e05dcdf Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405) 
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32897",
       "triggerID" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "678546804066c142c23cdcc36e88dc442210282a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900",
       "triggerID" : "678546804066c142c23cdcc36e88dc442210282a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32908",
       "triggerID" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32914",
       "triggerID" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32946",
       "triggerID" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "triggerType" : "PUSH"
     }, {
       "hash" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32947",
       "triggerID" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "195bf4369bf1eb0cdc1727847e63ce0d6fd31f66",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "195bf4369bf1eb0cdc1727847e63ce0d6fd31f66",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 84846361d6db21a5ac721494ebb5badf23b274a1 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32947) 
   * 195bf4369bf1eb0cdc1727847e63ce0d6fd31f66 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32810",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32897",
       "triggerID" : "ce5259932767032a24e15f5475f31ff336d40a48",
       "triggerType" : "PUSH"
     }, {
       "hash" : "678546804066c142c23cdcc36e88dc442210282a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32900",
       "triggerID" : "678546804066c142c23cdcc36e88dc442210282a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32908",
       "triggerID" : "4cf2279af74f89d0f9f8b446ccd8b72bf57495cb",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32914",
       "triggerID" : "d178dd422b702f51ee85b41e43ec4b306c60aa8c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32946",
       "triggerID" : "ff0bcbd841960c3e524a62c3c45f426707fb5306",
       "triggerType" : "PUSH"
     }, {
       "hash" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32947",
       "triggerID" : "84846361d6db21a5ac721494ebb5badf23b274a1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "195bf4369bf1eb0cdc1727847e63ce0d6fd31f66",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32952",
       "triggerID" : "195bf4369bf1eb0cdc1727847e63ce0d6fd31f66",
       "triggerType" : "PUSH"
     }, {
       "hash" : "894eb800e4045264cec298908892659466e78b7b",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32964",
       "triggerID" : "894eb800e4045264cec298908892659466e78b7b",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 894eb800e4045264cec298908892659466e78b7b Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32964) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] javacaoyu commented on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
javacaoyu commented on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056802273


   hennice





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4289112d7f34337784d436380f7f906cd8b1da7e",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "4289112d7f34337784d436380f7f906cd8b1da7e",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * bb918383e68f2c6539ceab99853972a73e9fec0c Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423) 
   * 4289112d7f34337784d436380f7f906cd8b1da7e UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a7cdfbf43f34e7bf4de81ad0b1a48edb6bdff7da",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "a7cdfbf43f34e7bf4de81ad0b1a48edb6bdff7da",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * a4a5fc419dbf82c079a24327846d4a7baa500959 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531) 
   * a7cdfbf43f34e7bf4de81ad0b1a48edb6bdff7da UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>








[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * a4a5fc419dbf82c079a24327846d4a7baa500959 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>











[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 8e46affb05226eeed6b8eb20df971445139654a8 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535) 
   * 610f23f625a4559bef6f000916a2de37a5cb3b38 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 8e46affb05226eeed6b8eb20df971445139654a8 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 1240d4313817b4ae0bfca21090eb6f1cbd40f78e Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684) 
   * 4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 47cdf91d5086b292f20d40700c7bd9052e32a6e0 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680) 
   * 1240d4313817b4ae0bfca21090eb6f1cbd40f78e Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 1240d4313817b4ae0bfca21090eb6f1cbd40f78e Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * 14ae6d15c25f454fe7675bf7b0e129834b99ae37 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     }, {
       "hash" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * d9ffed1defa7514c27d3ba9a9e3008de8b84bc68 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805) 
   * 0ee22778504c1bf11c25b5e9df0a8b5fc88a83cc UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * d9ffed1defa7514c27d3ba9a9e3008de8b84bc68 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805",
       "triggerID" : "d9ffed1defa7514c27d3ba9a9e3008de8b84bc68",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * d9ffed1defa7514c27d3ba9a9e3008de8b84bc68 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32805) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * a36182858952a61ef43ccbdc184440354bb86110 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18957: [FLINK-26444][python]Window allocator supporting pyflink datastream API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18957:
URL: https://github.com/apache/flink/pull/18957#issuecomment-1056202149


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32405",
       "triggerID" : "8fea20f61b27f7b803c12798ad85dbe97e05dcdf",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ec1e0a435186082e5ac1481bc093f9bdd9d94d70",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32423",
       "triggerID" : "bb918383e68f2c6539ceab99853972a73e9fec0c",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32531",
       "triggerID" : "a4a5fc419dbf82c079a24327846d4a7baa500959",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32535",
       "triggerID" : "8e46affb05226eeed6b8eb20df971445139654a8",
       "triggerType" : "PUSH"
     }, {
       "hash" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32666",
       "triggerID" : "610f23f625a4559bef6f000916a2de37a5cb3b38",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32675",
       "triggerID" : "ca91f03a2674f7041d3a66c618f833809c6e98aa",
       "triggerType" : "PUSH"
     }, {
       "hash" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32680",
       "triggerID" : "47cdf91d5086b292f20d40700c7bd9052e32a6e0",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32684",
       "triggerID" : "1240d4313817b4ae0bfca21090eb6f1cbd40f78e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32707",
       "triggerID" : "4a7b1aabd62ed4165f39ce499a2d9696fb4bef0d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32708",
       "triggerID" : "14ae6d15c25f454fe7675bf7b0e129834b99ae37",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a36182858952a61ef43ccbdc184440354bb86110",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721",
       "triggerID" : "a36182858952a61ef43ccbdc184440354bb86110",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d18a0ac3e74011eddb3b544b49f48b1cd5bf621f",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "d18a0ac3e74011eddb3b544b49f48b1cd5bf621f",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ec1e0a435186082e5ac1481bc093f9bdd9d94d70 UNKNOWN
   * a36182858952a61ef43ccbdc184440354bb86110 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=32721) 
   * d18a0ac3e74011eddb3b544b49f48b1cd5bf621f UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>

