Posted to issues@flink.apache.org by GitBox <gi...@apache.org> on 2022/03/17 06:53:29 UTC

[GitHub] [flink] javacaoyu opened a new pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

javacaoyu opened a new pull request #19126:
URL: https://github.com/apache/flink/pull/19126


   <!--
   *Thank you very much for contributing to Apache Flink - we are happy that you want to help us improve Flink. To help the community review your contribution in the best possible way, please go through the checklist below, which will get the contribution into a shape in which it can be best reviewed.*
   
   *Please understand that we do not do this to make contributions to Flink a hassle. In order to uphold a high standard of quality for code contributions, while at the same time managing a large number of contributions, we need contributors to prepare the contributions well, and give reviewers enough contextual information for the review. Please also understand that contributions that do not follow this guide will take longer to review and thus typically be picked up with lower priority by the community.*
   
   ## Contribution Checklist
   
     - Make sure that the pull request corresponds to a [JIRA issue](https://issues.apache.org/jira/projects/FLINK/issues). Exceptions are made for typos in JavaDoc or documentation files, which need no JIRA issue.
     
     - Name the pull request in the form "[FLINK-XXXX] [component] Title of the pull request", where *FLINK-XXXX* should be replaced by the actual issue number. Skip *component* if you are unsure about which is the best component.
      Typo fixes that have no associated JIRA issue should be named following this pattern: `[hotfix] [docs] Fix typo in event time introduction` or `[hotfix] [javadocs] Expand JavaDoc for PunctuatedWatermarkGenerator`.
   
     - Fill out the template below to describe the changes contributed by the pull request. That will give reviewers the context they need to do the review.
     
     - Make sure that the change passes the automated tests, i.e., `mvn clean verify` passes. You can set up Azure Pipelines CI to do that following [this guide](https://cwiki.apache.org/confluence/display/FLINK/Azure+Pipelines#AzurePipelines-Tutorial:SettingupAzurePipelinesforaforkoftheFlinkrepository).
   
     - Each pull request should address only one issue, not mix up code from multiple issues.
     
     - Each commit in the pull request has a meaningful commit message (including the JIRA id)
   
     - Once all items of the checklist are addressed, remove the above text and this checklist, leaving only the filled out template below.
   
   
   **(The sections below can be removed for hotfixes of typos)**
   -->
   
   ## What is the purpose of the change
   
   *(For example: This pull request makes task deployment go through the blob server, rather than through RPC. That way we avoid re-transferring them on each deployment (during recovery).)*
   
   
   ## Brief change log
   
   *(for example:)*
     - *The TaskInfo is stored in the blob store on job creation time as a persistent artifact*
     - *Deployments RPC transmits only the blob storage reference*
     - *TaskManagers retrieve the TaskInfo from the blob cache*
   
   
   ## Verifying this change
   
    Please make sure both new and modified tests in this PR follow the conventions defined in our code quality guide: https://flink.apache.org/contributing/code-style-and-quality-common.html#testing
   
   *(Please pick either of the following options)*
   
   This change is a trivial rework / code cleanup without any test coverage.
   
   *(or)*
   
   This change is already covered by existing tests, such as *(please describe tests)*.
   
   *(or)*
   
   This change added tests and can be verified as follows:
   
   *(example:)*
     - *Added integration tests for end-to-end deployment with large payloads (100MB)*
     - *Extended integration test for recovery after master (JobManager) failure*
     - *Added test that validates that TaskInfo is transferred only once across recoveries*
     - *Manually verified the change by running a 4 node cluster with 2 JobManagers and 4 TaskManagers, a stateful streaming program, and killing one JobManager and two TaskManagers during the execution, verifying that recovery happens correctly.*
   
   ## Does this pull request potentially affect one of the following parts:
   
     - Dependencies (does it add or upgrade a dependency): (yes / no)
     - The public API, i.e., is any changed class annotated with `@Public(Evolving)`: (yes / no)
     - The serializers: (yes / no / don't know)
     - The runtime per-record code paths (performance sensitive): (yes / no / don't know)
     - Anything that affects deployment or recovery: JobManager (and its components), Checkpointing, Kubernetes/Yarn, ZooKeeper: (yes / no / don't know)
     - The S3 file system connector: (yes / no / don't know)
   
   ## Documentation
   
     - Does this pull request introduce a new feature? (yes / no)
     - If yes, how is the feature documented? (not applicable / docs / JavaDocs / not documented)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] dianfu commented on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
dianfu commented on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1072215788


   > When this PR is finished, I can try to implement one of the methods first if you assign the JIRA task to me.

   Could you create the JIRA? Then I can assign it to you~
   





[GitHub] [flink] flinkbot edited a comment on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1070385440


   ## CI report:
   
   * a73169d7357a465b3d8b2a647c0ceaa8b435ee24 UNKNOWN
   * 5f30ad142a7c9a1edce730cad5a9abe424641eed Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33332) 
   * bdeb696bfd88b0a29ced843d0010016faef4d050 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33353) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1070385440


   ## CI report:
   
   * 494dcf7e47a1da57d157c0f1ed3305746b31f44e Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33305) 
   * bc228e4b0ec9e8454e36c98502d7ca740abb3a5d Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33315) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1070385440


   ## CI report:
   
   * a73169d7357a465b3d8b2a647c0ceaa8b435ee24 UNKNOWN
   * 8813e5db24962156396d853e770bb539b7c7e622 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33476) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1070385440


   ## CI report:
   
   * 2ffe8d696b5598078e5be621650cfedff3e31d2a Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33229) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] javacaoyu commented on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
javacaoyu commented on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1072352795


   Hi @dianfu,
   I have addressed the problem you raised.

   Please take a look when you have time and let me know your suggestions.

   -----------------------------

   The JIRA tasks have been created:
   [FLINK-26728](https://issues.apache.org/jira/browse/FLINK-26728) min
   [FLINK-26729](https://issues.apache.org/jira/browse/FLINK-26729) min_by
   [FLINK-26730](https://issues.apache.org/jira/browse/FLINK-26730) max
   [FLINK-26731](https://issues.apache.org/jira/browse/FLINK-26731) max_by
   
   Thanks.
   
   Best





[GitHub] [flink] flinkbot edited a comment on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1070385440


   ## CI report:
   
   * 2ffe8d696b5598078e5be621650cfedff3e31d2a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33229) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1070385440


   ## CI report:
   
   * bc228e4b0ec9e8454e36c98502d7ca740abb3a5d Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33315) 
   * a73169d7357a465b3d8b2a647c0ceaa8b435ee24 UNKNOWN
   * 5f30ad142a7c9a1edce730cad5a9abe424641eed Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33332) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1070385440


   ## CI report:
   
   * a73169d7357a465b3d8b2a647c0ceaa8b435ee24 UNKNOWN
   * bdeb696bfd88b0a29ced843d0010016faef4d050 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33353) 
   * bed8769aa0ba4af511384b6e0555db9514be37ec UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1070385440


   ## CI report:
   
   * a73169d7357a465b3d8b2a647c0ceaa8b435ee24 UNKNOWN
   * bdeb696bfd88b0a29ced843d0010016faef4d050 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33353) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1070385440


   ## CI report:
   
   * a73169d7357a465b3d8b2a647c0ceaa8b435ee24 UNKNOWN
   * bed8769aa0ba4af511384b6e0555db9514be37ec Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33363) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>








[GitHub] [flink] flinkbot edited a comment on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1070385440


   ## CI report:
   
   * a73169d7357a465b3d8b2a647c0ceaa8b435ee24 UNKNOWN
   * 495540809cc9a13e18b11151cdee5925cc0a12ae Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33629) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1070385440


   ## CI report:
   
   * a73169d7357a465b3d8b2a647c0ceaa8b435ee24 UNKNOWN
   * bdeb696bfd88b0a29ced843d0010016faef4d050 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33353) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1070385440


   ## CI report:
   
   * 2ffe8d696b5598078e5be621650cfedff3e31d2a Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33229) 
   * 494dcf7e47a1da57d157c0f1ed3305746b31f44e UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1070385440


   ## CI report:
   
   * 2ffe8d696b5598078e5be621650cfedff3e31d2a Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33229) 
   * 494dcf7e47a1da57d157c0f1ed3305746b31f44e Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33305) 
   * bc228e4b0ec9e8454e36c98502d7ca740abb3a5d UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1070385440


   ## CI report:
   
   * bc228e4b0ec9e8454e36c98502d7ca740abb3a5d Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33315) 
   * a73169d7357a465b3d8b2a647c0ceaa8b435ee24 UNKNOWN
   * 5f30ad142a7c9a1edce730cad5a9abe424641eed UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] dianfu commented on a change in pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
dianfu commented on a change in pull request #19126:
URL: https://github.com/apache/flink/pull/19126#discussion_r829715015



##########
File path: flink-python/pyflink/datastream/data_stream.py
##########
@@ -1168,6 +1168,96 @@ def process_element(self, value, ctx: 'KeyedProcessFunction.Context'):
         return self.process(FilterKeyedProcessFunctionAdapter(func), self._original_data_type_info)\
             .name("Filter")
 
+    def sum(self, position_to_sum: Union[int, str]) -> 'DataStream':
+        """
+        Applies an aggregation that gives a rolling sum of the data stream at the
+        given position grouped by the given key. An independent aggregate is kept
+        per key.
+
+        Example(Tuple data to sum):
+        ::
+
+            >>> ds = env.from_collection([('a', 1), ('a', 2), ('b', 1), ('b', 5)])
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data to sum):
+        ::
+
+            >>> ds = self.env.from_collection([('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],

Review comment:
       ```suggestion
               >>> ds = env.from_collection([('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
   ```

##########
File path: flink-python/pyflink/datastream/data_stream.py
##########
@@ -1168,6 +1168,96 @@ def process_element(self, value, ctx: 'KeyedProcessFunction.Context'):
         return self.process(FilterKeyedProcessFunctionAdapter(func), self._original_data_type_info)\
             .name("Filter")
 
+    def sum(self, position_to_sum: Union[int, str]) -> 'DataStream':
+        """
+        Applies an aggregation that gives a rolling sum of the data stream at the
+        given position grouped by the given key. An independent aggregate is kept
+        per key.
+
+        Example(Tuple data to sum):
+        ::
+
+            >>> ds = env.from_collection([('a', 1), ('a', 2), ('b', 1), ('b', 5)])
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data to sum):
+        ::
+
+            >>> ds = self.env.from_collection([('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...                                type_info=Types.ROW([Types.STRING(), Types.INT()]))
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data with fields name to sum):
+        ::
+
+            >>> ds = self.env.from_collection(
+            ...     [('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...     type_info=Types.ROW_NAMED(["key", "value"], [Types.STRING(), Types.INT()])
+            ... )
+            >>> ds.key_by(lambda x: x[0]).sum("value")
+
+        :param position_to_sum:
+            The field position in the data points to sum, type can be int or str.
+            This is applicable to Tuple types, and {pyflink.common.types.Row} types.

Review comment:
       ```suggestion
               This is applicable to Tuple types, and :class:`pyflink.common.Row` types.
   ```

##########
File path: flink-python/pyflink/datastream/data_stream.py
##########
@@ -1168,6 +1168,96 @@ def process_element(self, value, ctx: 'KeyedProcessFunction.Context'):
         return self.process(FilterKeyedProcessFunctionAdapter(func), self._original_data_type_info)\
             .name("Filter")
 
+    def sum(self, position_to_sum: Union[int, str]) -> 'DataStream':
+        """
+        Applies an aggregation that gives a rolling sum of the data stream at the
+        given position grouped by the given key. An independent aggregate is kept
+        per key.
+
+        Example(Tuple data to sum):
+        ::
+
+            >>> ds = env.from_collection([('a', 1), ('a', 2), ('b', 1), ('b', 5)])
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data to sum):
+        ::
+
+            >>> ds = self.env.from_collection([('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...                                type_info=Types.ROW([Types.STRING(), Types.INT()]))
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data with fields name to sum):
+        ::
+
+            >>> ds = self.env.from_collection(
+            ...     [('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...     type_info=Types.ROW_NAMED(["key", "value"], [Types.STRING(), Types.INT()])
+            ... )
+            >>> ds.key_by(lambda x: x[0]).sum("value")
+
+        :param position_to_sum:
+            The field position in the data points to sum, type can be int or str.
+            This is applicable to Tuple types, and {pyflink.common.types.Row} types.
+        :return: The transformed DataStream.
+        """
+        if not isinstance(position_to_sum, int) and not isinstance(position_to_sum, str):
+            raise TypeError("The input must be a int or str type for locate the value to sum")
+
+        output_type = _from_java_type(self._original_data_type_info.get_java_type_info())
+
+        class SumKeyedProcessFunctionAdapter(KeyedProcessFunction):

Review comment:
       What about creating a ReduceFunction and then calling `self.reduce`? It could simplify the implementation.
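
       A minimal sketch of that idea (the class name, the tuple-only reduce body and the final wiring are illustrative assumptions rather than the exact code adopted in this PR):
   ```
   from pyflink.datastream.functions import ReduceFunction


   class SumReduceFunction(ReduceFunction):
       """Keeps the other fields of the first value and rolls up the field at `pos`."""

       def __init__(self, pos):
           self._pos = pos

       def reduce(self, value1, value2):
           # Tuples are immutable, so rebuild the tuple with the summed field.
           fields = list(value1)
           fields[self._pos] = value1[self._pos] + value2[self._pos]
           return tuple(fields)


   # sum() could then delegate to the existing reduce() operator, e.g.:
   #     return self.reduce(SumReduceFunction(position_to_sum)).name("Sum")
   ```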

##########
File path: flink-python/pyflink/datastream/tests/test_data_stream.py
##########
@@ -455,6 +455,56 @@ def filter(self, value):
         expected = ['+I[c, 1]', '+I[e, 2]']
         self.assert_equals_sorted(expected, results)
 
+    def test_keyed_sum_with_tuple_type(self):
+        ds = self.env.from_collection([('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+                                      type_info=Types.TUPLE([Types.STRING(), Types.INT()]))
+        keyed_stream = ds.key_by(lambda x: x[0], key_type=Types.STRING())
+
+        keyed_stream.sum(1)\
+            .add_sink(self.test_sink)
+        self.env.execute('key_by_sum_test_with_tuple_type')
+        results = self.test_sink.get_results(False)
+        if self.__class__ == StreamingModeDataStreamTests:

Review comment:
       What about splitting the test cases into StreamingModeDataStreamTests and BatchModeDataStreamTests? Then we could avoid this check; a rough sketch of the split is shown below.
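
       For illustration only, a self-contained sketch of that split using plain unittest, with the PyFlink pipeline stubbed out in pure Python (the class names, helpers and expected values are assumptions, not the actual test harness):
   ```
   import unittest

   DATA = [('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)]


   def rolling_sums(data):
       """Emulates streaming mode: an updated running sum is emitted for every element."""
       totals, out = {}, []
       for key, value in data:
           totals[key] = totals.get(key, 0) + value
           out.append((key, totals[key]))
       return out


   def final_sums(data):
       """Emulates batch mode: only the final sum per key is emitted."""
       totals = {}
       for key, value in data:
           totals[key] = totals.get(key, 0) + value
       return sorted(totals.items())


   class StreamingModeKeyedSumTest(unittest.TestCase):
       def test_keyed_sum_with_tuple_type(self):
           expected = [('a', 1), ('a', 3), ('a', 6), ('b', 1), ('b', 3)]
           self.assertEqual(sorted(expected), sorted(rolling_sums(DATA)))


   class BatchModeKeyedSumTest(unittest.TestCase):
       def test_keyed_sum_with_tuple_type(self):
           expected = [('a', 6), ('b', 3)]
           self.assertEqual(expected, final_sums(DATA))


   if __name__ == '__main__':
       unittest.main()
   ```
       Each mode gets its own expected output, so no `if self.__class__ == ...` branch is needed inside the test body.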

##########
File path: flink-python/pyflink/datastream/data_stream.py
##########
@@ -1168,6 +1168,96 @@ def process_element(self, value, ctx: 'KeyedProcessFunction.Context'):
         return self.process(FilterKeyedProcessFunctionAdapter(func), self._original_data_type_info)\
             .name("Filter")
 
+    def sum(self, position_to_sum: Union[int, str]) -> 'DataStream':
+        """
+        Applies an aggregation that gives a rolling sum of the data stream at the
+        given position grouped by the given key. An independent aggregate is kept
+        per key.
+
+        Example(Tuple data to sum):
+        ::
+
+            >>> ds = env.from_collection([('a', 1), ('a', 2), ('b', 1), ('b', 5)])
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data to sum):
+        ::
+
+            >>> ds = self.env.from_collection([('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...                                type_info=Types.ROW([Types.STRING(), Types.INT()]))
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data with fields name to sum):
+        ::
+
+            >>> ds = self.env.from_collection(
+            ...     [('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...     type_info=Types.ROW_NAMED(["key", "value"], [Types.STRING(), Types.INT()])
+            ... )
+            >>> ds.key_by(lambda x: x[0]).sum("value")
+
+        :param position_to_sum:
+            The field position in the data points to sum, type can be int or str.
+            This is applicable to Tuple types, and {pyflink.common.types.Row} types.
+        :return: The transformed DataStream.
+        """
+        if not isinstance(position_to_sum, int) and not isinstance(position_to_sum, str):
+            raise TypeError("The input must be a int or str type for locate the value to sum")

Review comment:
       ```suggestion
               raise TypeError("The input must be of of int or str type to locate the value to sum")
   ```

##########
File path: flink-python/pyflink/datastream/data_stream.py
##########
@@ -1168,6 +1168,96 @@ def process_element(self, value, ctx: 'KeyedProcessFunction.Context'):
         return self.process(FilterKeyedProcessFunctionAdapter(func), self._original_data_type_info)\
             .name("Filter")
 
+    def sum(self, position_to_sum: Union[int, str]) -> 'DataStream':
+        """
+        Applies an aggregation that gives a rolling sum of the data stream at the
+        given position grouped by the given key. An independent aggregate is kept
+        per key.
+
+        Example(Tuple data to sum):
+        ::
+
+            >>> ds = env.from_collection([('a', 1), ('a', 2), ('b', 1), ('b', 5)])
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data to sum):
+        ::
+
+            >>> ds = self.env.from_collection([('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...                                type_info=Types.ROW([Types.STRING(), Types.INT()]))
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data with fields name to sum):
+        ::
+
+            >>> ds = self.env.from_collection(

Review comment:
       ```suggestion
               >>> ds = env.from_collection(
   ```







[GitHub] [flink] dianfu commented on a change in pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
dianfu commented on a change in pull request #19126:
URL: https://github.com/apache/flink/pull/19126#discussion_r830716015



##########
File path: flink-python/pyflink/datastream/data_stream.py
##########
@@ -1174,6 +1174,66 @@ def process_element(self, value, ctx: 'KeyedProcessFunction.Context'):
         return self.process(FilterKeyedProcessFunctionAdapter(func), self._original_data_type_info)\
             .name("Filter")
 
+    def sum(self, position_to_sum: Union[int, str]) -> 'DataStream':
+        """
+        Applies an aggregation that gives a rolling sum of the data stream at the
+        given position grouped by the given key. An independent aggregate is kept
+        per key.
+
+        Example(Tuple data to sum):
+        ::
+
+            >>> ds = env.from_collection([('a', 1), ('a', 2), ('b', 1), ('b', 5)])
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data to sum):
+        ::
+
+            >>> ds = env.from_collection([('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...                                type_info=Types.ROW([Types.STRING(), Types.INT()]))
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data with fields name to sum):
+        ::
+
+            >>> ds = env.from_collection(
+            ...     [('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...     type_info=Types.ROW_NAMED(["key", "value"], [Types.STRING(), Types.INT()])
+            ... )
+            >>> ds.key_by(lambda x: x[0]).sum("value")
+
+        :param position_to_sum:
+            The field position in the data points to sum, type can be int or str.
+            This is applicable to Tuple types, and :class:`pyflink.common.Row` types.
+        :return: The transformed DataStream.
+        """
+        if not isinstance(position_to_sum, int) and not isinstance(position_to_sum, str):
+            raise TypeError("The input must be of int or str type to locate the value to sum")
+
+        class SumReduceFunction(ReduceFunction):
+
+            def __init__(self, position_to_sum):
+                self._pos = position_to_sum
+
+            def reduce(self, value1, value2):

Review comment:
       What about refactoring it as follows?
   ```
                   def init_reduce_func(value1):
                       if isinstance(value1, tuple):
                           def reduce_func(v1, v2):
                               v1_list = list(v1)
                               v1_list[self._pos] = v1[self._pos] + v2[self._pos]
                               return tuple(v1_list)
                           self._reduce_func = reduce_func
                       elif isinstance(value1, (list, Row)):
                           def reduce_func(v1, v2):
                                v1[self._pos] = v1[self._pos] + v2[self._pos]
                                return v1
                            self._reduce_func = reduce_func
                       else:
                           raise TypeError("Sum operator only process the data of "
                                           "Tuple type and {pyflink.common.types.Row} type. "
                                           f"Actual type: {type(value1)}")
   
                   from numbers import Number
                   if not isinstance(value1[self._pos], Number):
                       raise TypeError("The value to sum by given position must be of numeric type; "
                                       f"actual {type(value1[self._pos])}, expected Number")
   
                   if not self._reduce_func:
                       init_reduce_func(value1)
                   return self._reduce_func(value1, value2)
   ```
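   
   For completeness, a self-contained sketch of how the lazily initialized reduce function could look as a whole `ReduceFunction`; the class and variable names here are only illustrative, not necessarily the final code:
   ```python
   from numbers import Number

   from pyflink.common import Row
   from pyflink.datastream.functions import ReduceFunction


   class SumReduceFunction(ReduceFunction):
       """Illustrative sketch: rolling sum of the field at a given position."""

       def __init__(self, position_to_sum):
           self._pos = position_to_sum
           self._reduce_func = None  # chosen lazily from the first pair of values

       def reduce(self, value1, value2):
           if (not isinstance(value1[self._pos], Number)
                   or not isinstance(value2[self._pos], Number)):
               raise TypeError("The value to sum at the given position must be numeric; "
                               f"actual {type(value1[self._pos])} and {type(value2[self._pos])}")

           if self._reduce_func is None:
               if isinstance(value1, tuple):
                   def reduce_func(v1, v2):
                       # tuples are immutable, so rebuild the element
                       v1_list = list(v1)
                       v1_list[self._pos] = v1[self._pos] + v2[self._pos]
                       return tuple(v1_list)
               elif isinstance(value1, (list, Row)):
                   def reduce_func(v1, v2):
                       # lists and Rows can be updated in place
                       v1[self._pos] = v1[self._pos] + v2[self._pos]
                       return v1
               else:
                   raise TypeError("Sum operator only supports Tuple, list and Row data; "
                                   f"actual type: {type(value1)}")
               self._reduce_func = reduce_func
           return self._reduce_func(value1, value2)
   ```
   Initializing `self._reduce_func` to `None` in `__init__` avoids an `AttributeError` on the first call.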




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] javacaoyu commented on a change in pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
javacaoyu commented on a change in pull request #19126:
URL: https://github.com/apache/flink/pull/19126#discussion_r829826142



##########
File path: flink-python/pyflink/datastream/data_stream.py
##########
@@ -1168,6 +1168,96 @@ def process_element(self, value, ctx: 'KeyedProcessFunction.Context'):
         return self.process(FilterKeyedProcessFunctionAdapter(func), self._original_data_type_info)\
             .name("Filter")
 
+    def sum(self, position_to_sum: Union[int, str]) -> 'DataStream':
+        """
+        Applies an aggregation that gives a rolling sum of the data stream at the
+        given position grouped by the given key. An independent aggregate is kept
+        per key.
+
+        Example(Tuple data to sum):
+        ::
+
+            >>> ds = env.from_collection([('a', 1), ('a', 2), ('b', 1), ('b', 5)])
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data to sum):
+        ::
+
+            >>> ds = self.env.from_collection([('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...                                type_info=Types.ROW([Types.STRING(), Types.INT()]))
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data with fields name to sum):
+        ::
+
+            >>> ds = self.env.from_collection(
+            ...     [('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...     type_info=Types.ROW_NAMED(["key", "value"], [Types.STRING(), Types.INT()])
+            ... )
+            >>> ds.key_by(lambda x: x[0]).sum("value")
+
+        :param position_to_sum:
+            The field position in the data points to sum, type can be int or str.
+            This is applicable to Tuple types, and {pyflink.common.types.Row} types.
+        :return: The transformed DataStream.
+        """
+        if not isinstance(position_to_sum, int) and not isinstance(position_to_sum, str):
+            raise TypeError("The input must be a int or str type for locate the value to sum")
+
+        output_type = _from_java_type(self._original_data_type_info.get_java_type_info())
+
+        class SumKeyedProcessFunctionAdapter(KeyedProcessFunction):

Review comment:
       > > When finish this PR, I can try to implement one of the methods first if you assign the jira task to me.
   > > Could you create the JIRA? Then I can assign it to you~
   
   Yes, I have already created 4 JIRA tasks:
   - FLINK-26728 min
   - FLINK-26729 min_by
   - FLINK-26730 max
   - FLINK-26731 max_by
   
   Thanks.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1070385440


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33229",
       "triggerID" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33305",
       "triggerID" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bc228e4b0ec9e8454e36c98502d7ca740abb3a5d",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33315",
       "triggerID" : "bc228e4b0ec9e8454e36c98502d7ca740abb3a5d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a73169d7357a465b3d8b2a647c0ceaa8b435ee24",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "a73169d7357a465b3d8b2a647c0ceaa8b435ee24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "5f30ad142a7c9a1edce730cad5a9abe424641eed",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33332",
       "triggerID" : "5f30ad142a7c9a1edce730cad5a9abe424641eed",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bdeb696bfd88b0a29ced843d0010016faef4d050",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "bdeb696bfd88b0a29ced843d0010016faef4d050",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * bc228e4b0ec9e8454e36c98502d7ca740abb3a5d Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33315) 
   * a73169d7357a465b3d8b2a647c0ceaa8b435ee24 UNKNOWN
   * 5f30ad142a7c9a1edce730cad5a9abe424641eed Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33332) 
   * bdeb696bfd88b0a29ced843d0010016faef4d050 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1070385440


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33229",
       "triggerID" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33305",
       "triggerID" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bc228e4b0ec9e8454e36c98502d7ca740abb3a5d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33315",
       "triggerID" : "bc228e4b0ec9e8454e36c98502d7ca740abb3a5d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a73169d7357a465b3d8b2a647c0ceaa8b435ee24",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "a73169d7357a465b3d8b2a647c0ceaa8b435ee24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "5f30ad142a7c9a1edce730cad5a9abe424641eed",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33332",
       "triggerID" : "5f30ad142a7c9a1edce730cad5a9abe424641eed",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bdeb696bfd88b0a29ced843d0010016faef4d050",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33353",
       "triggerID" : "bdeb696bfd88b0a29ced843d0010016faef4d050",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bed8769aa0ba4af511384b6e0555db9514be37ec",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33363",
       "triggerID" : "bed8769aa0ba4af511384b6e0555db9514be37ec",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8813e5db24962156396d853e770bb539b7c7e622",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33476",
       "triggerID" : "8813e5db24962156396d853e770bb539b7c7e622",
       "triggerType" : "PUSH"
     }, {
       "hash" : "495540809cc9a13e18b11151cdee5925cc0a12ae",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33629",
       "triggerID" : "495540809cc9a13e18b11151cdee5925cc0a12ae",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * a73169d7357a465b3d8b2a647c0ceaa8b435ee24 UNKNOWN
   * 8813e5db24962156396d853e770bb539b7c7e622 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33476) 
   * 495540809cc9a13e18b11151cdee5925cc0a12ae Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33629) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1070385440


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33229",
       "triggerID" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33305",
       "triggerID" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bc228e4b0ec9e8454e36c98502d7ca740abb3a5d",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33315",
       "triggerID" : "bc228e4b0ec9e8454e36c98502d7ca740abb3a5d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a73169d7357a465b3d8b2a647c0ceaa8b435ee24",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "a73169d7357a465b3d8b2a647c0ceaa8b435ee24",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * bc228e4b0ec9e8454e36c98502d7ca740abb3a5d Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33315) 
   * a73169d7357a465b3d8b2a647c0ceaa8b435ee24 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1070385440


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33229",
       "triggerID" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33305",
       "triggerID" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bc228e4b0ec9e8454e36c98502d7ca740abb3a5d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33315",
       "triggerID" : "bc228e4b0ec9e8454e36c98502d7ca740abb3a5d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a73169d7357a465b3d8b2a647c0ceaa8b435ee24",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "a73169d7357a465b3d8b2a647c0ceaa8b435ee24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "5f30ad142a7c9a1edce730cad5a9abe424641eed",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33332",
       "triggerID" : "5f30ad142a7c9a1edce730cad5a9abe424641eed",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bdeb696bfd88b0a29ced843d0010016faef4d050",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33353",
       "triggerID" : "bdeb696bfd88b0a29ced843d0010016faef4d050",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * a73169d7357a465b3d8b2a647c0ceaa8b435ee24 UNKNOWN
   * bdeb696bfd88b0a29ced843d0010016faef4d050 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33353) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] javacaoyu commented on a change in pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
javacaoyu commented on a change in pull request #19126:
URL: https://github.com/apache/flink/pull/19126#discussion_r829785324



##########
File path: flink-python/pyflink/datastream/data_stream.py
##########
@@ -1168,6 +1168,96 @@ def process_element(self, value, ctx: 'KeyedProcessFunction.Context'):
         return self.process(FilterKeyedProcessFunctionAdapter(func), self._original_data_type_info)\
             .name("Filter")
 
+    def sum(self, position_to_sum: Union[int, str]) -> 'DataStream':
+        """
+        Applies an aggregation that gives a rolling sum of the data stream at the
+        given position grouped by the given key. An independent aggregate is kept
+        per key.
+
+        Example(Tuple data to sum):
+        ::
+
+            >>> ds = env.from_collection([('a', 1), ('a', 2), ('b', 1), ('b', 5)])
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data to sum):
+        ::
+
+            >>> ds = self.env.from_collection([('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...                                type_info=Types.ROW([Types.STRING(), Types.INT()]))
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data with fields name to sum):
+        ::
+
+            >>> ds = self.env.from_collection(
+            ...     [('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...     type_info=Types.ROW_NAMED(["key", "value"], [Types.STRING(), Types.INT()])
+            ... )
+            >>> ds.key_by(lambda x: x[0]).sum("value")
+
+        :param position_to_sum:
+            The field position in the data points to sum, type can be int or str.
+            This is applicable to Tuple types, and {pyflink.common.types.Row} types.
+        :return: The transformed DataStream.
+        """
+        if not isinstance(position_to_sum, int) and not isinstance(position_to_sum, str):
+            raise TypeError("The input must be a int or str type for locate the value to sum")
+
+        output_type = _from_java_type(self._original_data_type_info.get_java_type_info())
+
+        class SumKeyedProcessFunctionAdapter(KeyedProcessFunction):

Review comment:
       It's a good idea.
   Logically, using reduce as the underlying implementation of the sum operation is a good design, I think.
   I'll try to rework the implementation around ReduceFunction and see whether that makes it simpler.
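   
   As a rough sketch of that direction (a simple tuple stream; the element values and expected rolling sums below are illustrative only):
   ```python
   from pyflink.common.typeinfo import Types
   from pyflink.datastream import StreamExecutionEnvironment

   env = StreamExecutionEnvironment.get_execution_environment()
   env.set_parallelism(1)

   ds = env.from_collection(
       [('a', 1), ('a', 2), ('b', 1), ('b', 5)],
       type_info=Types.TUPLE([Types.STRING(), Types.INT()]))

   # a rolling sum of field 1 expressed directly as a keyed reduce
   summed = ds.key_by(lambda x: x[0]) \
       .reduce(lambda v1, v2: (v1[0], v1[1] + v2[1]))

   with summed.execute_and_collect() as results:
       for r in results:
           print(r)  # rolling per-key sums: 1 then 3 for 'a', 1 then 6 for 'b'
   ```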
   
   

##########
File path: flink-python/pyflink/datastream/data_stream.py
##########
@@ -1168,6 +1168,96 @@ def process_element(self, value, ctx: 'KeyedProcessFunction.Context'):
         return self.process(FilterKeyedProcessFunctionAdapter(func), self._original_data_type_info)\
             .name("Filter")
 
+    def sum(self, position_to_sum: Union[int, str]) -> 'DataStream':
+        """
+        Applies an aggregation that gives a rolling sum of the data stream at the
+        given position grouped by the given key. An independent aggregate is kept
+        per key.
+
+        Example(Tuple data to sum):
+        ::
+
+            >>> ds = env.from_collection([('a', 1), ('a', 2), ('b', 1), ('b', 5)])
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data to sum):
+        ::
+
+            >>> ds = self.env.from_collection([('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...                                type_info=Types.ROW([Types.STRING(), Types.INT()]))
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data with fields name to sum):
+        ::
+
+            >>> ds = self.env.from_collection(
+            ...     [('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...     type_info=Types.ROW_NAMED(["key", "value"], [Types.STRING(), Types.INT()])
+            ... )
+            >>> ds.key_by(lambda x: x[0]).sum("value")
+
+        :param position_to_sum:
+            The field position in the data points to sum, type can be int or str.
+            This is applicable to Tuple types, and {pyflink.common.types.Row} types.
+        :return: The transformed DataStream.
+        """
+        if not isinstance(position_to_sum, int) and not isinstance(position_to_sum, str):
+            raise TypeError("The input must be a int or str type for locate the value to sum")
+
+        output_type = _from_java_type(self._original_data_type_info.get_java_type_info())
+
+        class SumKeyedProcessFunctionAdapter(KeyedProcessFunction):

Review comment:
       > > When finish this PR, I can try to implement one of the methods first if you assign the jira task to me.
   > > Could you create the JIRA? Then I can assign it to you~
   
   Yes, I have already created 4 JIRA tasks:
   - FLINK-26728 min
   - FLINK-26729 min_by
   - FLINK-26730 max
   - FLINK-26731 max_by
   
   Thanks.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] javacaoyu commented on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
javacaoyu commented on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1072148169






-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] dianfu closed pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
dianfu closed pull request #19126:
URL: https://github.com/apache/flink/pull/19126


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] javacaoyu commented on a change in pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
javacaoyu commented on a change in pull request #19126:
URL: https://github.com/apache/flink/pull/19126#discussion_r830864455



##########
File path: flink-python/pyflink/datastream/tests/test_data_stream.py
##########
@@ -1072,6 +1072,44 @@ def test_reduce_with_state(self):
         expected = ['+I[a, 0]', '+I[ab, 0]', '+I[c, 1]', '+I[cd, 1]', '+I[cde, 1]']
         self.assert_equals_sorted(expected, results)
 
+    def test_keyed_sum_with_tuple_type(self):

Review comment:
       Yes, I didn't take that into account and I'm making improvements now.
   
   Thanks.
   




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1070385440


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33229",
       "triggerID" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33305",
       "triggerID" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bc228e4b0ec9e8454e36c98502d7ca740abb3a5d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33315",
       "triggerID" : "bc228e4b0ec9e8454e36c98502d7ca740abb3a5d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a73169d7357a465b3d8b2a647c0ceaa8b435ee24",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "a73169d7357a465b3d8b2a647c0ceaa8b435ee24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "5f30ad142a7c9a1edce730cad5a9abe424641eed",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33332",
       "triggerID" : "5f30ad142a7c9a1edce730cad5a9abe424641eed",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bdeb696bfd88b0a29ced843d0010016faef4d050",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33353",
       "triggerID" : "bdeb696bfd88b0a29ced843d0010016faef4d050",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bed8769aa0ba4af511384b6e0555db9514be37ec",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33363",
       "triggerID" : "bed8769aa0ba4af511384b6e0555db9514be37ec",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8813e5db24962156396d853e770bb539b7c7e622",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33476",
       "triggerID" : "8813e5db24962156396d853e770bb539b7c7e622",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * a73169d7357a465b3d8b2a647c0ceaa8b435ee24 UNKNOWN
   * bed8769aa0ba4af511384b6e0555db9514be37ec Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33363) 
   * 8813e5db24962156396d853e770bb539b7c7e622 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33476) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] javacaoyu commented on a change in pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
javacaoyu commented on a change in pull request #19126:
URL: https://github.com/apache/flink/pull/19126#discussion_r830849153



##########
File path: flink-python/pyflink/datastream/data_stream.py
##########
@@ -1174,6 +1174,66 @@ def process_element(self, value, ctx: 'KeyedProcessFunction.Context'):
         return self.process(FilterKeyedProcessFunctionAdapter(func), self._original_data_type_info)\
             .name("Filter")
 
+    def sum(self, position_to_sum: Union[int, str]) -> 'DataStream':
+        """
+        Applies an aggregation that gives a rolling sum of the data stream at the
+        given position grouped by the given key. An independent aggregate is kept
+        per key.
+
+        Example(Tuple data to sum):
+        ::
+
+            >>> ds = env.from_collection([('a', 1), ('a', 2), ('b', 1), ('b', 5)])
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data to sum):
+        ::
+
+            >>> ds = env.from_collection([('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...                                type_info=Types.ROW([Types.STRING(), Types.INT()]))
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data with fields name to sum):
+        ::
+
+            >>> ds = env.from_collection(
+            ...     [('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...     type_info=Types.ROW_NAMED(["key", "value"], [Types.STRING(), Types.INT()])
+            ... )
+            >>> ds.key_by(lambda x: x[0]).sum("value")
+
+        :param position_to_sum:
+            The field position in the data points to sum, type can be int or str.
+            This is applicable to Tuple types, and :class:`pyflink.common.Row` types.
+        :return: The transformed DataStream.
+        """
+        if not isinstance(position_to_sum, int) and not isinstance(position_to_sum, str):
+            raise TypeError("The input must be of int or str type to locate the value to sum")
+
+        class SumReduceFunction(ReduceFunction):
+
+            def __init__(self, position_to_sum):
+                self._pos = position_to_sum
+
+            def reduce(self, value1, value2):
+                from numbers import Number
+                if not isinstance(value1[self._pos], Number):
+                    raise TypeError("The value to sum by given position must be of numeric type; "
+                                    f"actual {type(value1[self._pos])}, expected Number")
+                if isinstance(value1, tuple):

Review comment:
       Yes, the list type needs to be supported.
   
   The list type was supported in the earliest version of the code. I later removed it because the elements of a typical list are all of the same type: for data like ['key', 1], a list is not really the natural container. Although we can key by data[0] and sum by data[1], a tuple seems better suited to such data, so I dropped the list support.
   
   But, as you point out, that choice should be left to the user. Whether the list they pass in is homogeneous or not is their business; we just need to provide list type support.
   
   So I will add the list type support back; see the small illustration below. Thanks for the reminder.
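   
   A small illustration of the difference between the two branches: tuples are immutable, so the summed element has to be rebuilt into a new tuple, while lists (and pyflink's Row, which supports item assignment in the same in-place style discussed in this PR) can simply be updated. This is only a behavioural sketch, not the PR code:
   ```python
   pos = 1

   # tuples are immutable: rebuild the element with the summed field
   t = ('a', 1)
   fields = list(t)
   fields[pos] = t[pos] + 2
   t = tuple(fields)          # ('a', 3)

   # lists can be mutated in place, so no copy is needed
   l = ['a', 1]
   l[pos] = l[pos] + 2        # ['a', 3]
   ```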




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] javacaoyu commented on a change in pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
javacaoyu commented on a change in pull request #19126:
URL: https://github.com/apache/flink/pull/19126#discussion_r830861205



##########
File path: flink-python/pyflink/datastream/data_stream.py
##########
@@ -1174,6 +1174,66 @@ def process_element(self, value, ctx: 'KeyedProcessFunction.Context'):
         return self.process(FilterKeyedProcessFunctionAdapter(func), self._original_data_type_info)\
             .name("Filter")
 
+    def sum(self, position_to_sum: Union[int, str]) -> 'DataStream':
+        """
+        Applies an aggregation that gives a rolling sum of the data stream at the
+        given position grouped by the given key. An independent aggregate is kept
+        per key.
+
+        Example(Tuple data to sum):
+        ::
+
+            >>> ds = env.from_collection([('a', 1), ('a', 2), ('b', 1), ('b', 5)])
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data to sum):
+        ::
+
+            >>> ds = env.from_collection([('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...                                type_info=Types.ROW([Types.STRING(), Types.INT()]))
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data with fields name to sum):
+        ::
+
+            >>> ds = env.from_collection(
+            ...     [('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...     type_info=Types.ROW_NAMED(["key", "value"], [Types.STRING(), Types.INT()])
+            ... )
+            >>> ds.key_by(lambda x: x[0]).sum("value")
+
+        :param position_to_sum:
+            The field position in the data points to sum, type can be int or str.
+            This is applicable to Tuple types, and :class:`pyflink.common.Row` types.
+        :return: The transformed DataStream.
+        """
+        if not isinstance(position_to_sum, int) and not isinstance(position_to_sum, str):
+            raise TypeError("The input must be of int or str type to locate the value to sum")
+
+        class SumReduceFunction(ReduceFunction):
+
+            def __init__(self, position_to_sum):
+                self._pos = position_to_sum
+
+            def reduce(self, value1, value2):
+                from numbers import Number
+                if not isinstance(value1[self._pos], Number):

Review comment:
       Thanks for your advice.
   
   I think there is no real difference between checking value1 or value2.
   
   When only a single element has arrived (the first input value), the reduce method is not called; that first value is simply stored as the reduce state.
   
   The reduce method only executes once two or more elements have been received, i.e. from the moment the second element arrives (see the small sketch below).
   
   But I am not sure what your concern is here; based on your considerations, could you give me some advice?
   
   Thanks
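   
   A minimal sketch of those semantics (the stream contents and function name are illustrative only):
   ```python
   from pyflink.common.typeinfo import Types
   from pyflink.datastream import StreamExecutionEnvironment

   env = StreamExecutionEnvironment.get_execution_environment()
   env.set_parallelism(1)

   ds = env.from_collection(
       [('a', 1), ('a', 2), ('b', 7)],
       type_info=Types.TUPLE([Types.STRING(), Types.INT()]))


   def summing_reduce(v1, v2):
       # called only from the second element of a key onwards; the very first
       # element of each key is emitted unchanged and stored as the reduce state
       return v1[0], v1[1] + v2[1]


   with ds.key_by(lambda x: x[0]).reduce(summing_reduce).execute_and_collect() as results:
       print(list(results))
   # key 'a': ('a', 1) is emitted as-is, then summing_reduce produces ('a', 3);
   # key 'b': only one element arrives, so summing_reduce is never called for it
   ```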




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1070385440


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33229",
       "triggerID" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33305",
       "triggerID" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bc228e4b0ec9e8454e36c98502d7ca740abb3a5d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33315",
       "triggerID" : "bc228e4b0ec9e8454e36c98502d7ca740abb3a5d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a73169d7357a465b3d8b2a647c0ceaa8b435ee24",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "a73169d7357a465b3d8b2a647c0ceaa8b435ee24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "5f30ad142a7c9a1edce730cad5a9abe424641eed",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33332",
       "triggerID" : "5f30ad142a7c9a1edce730cad5a9abe424641eed",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bdeb696bfd88b0a29ced843d0010016faef4d050",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33353",
       "triggerID" : "bdeb696bfd88b0a29ced843d0010016faef4d050",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bed8769aa0ba4af511384b6e0555db9514be37ec",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33363",
       "triggerID" : "bed8769aa0ba4af511384b6e0555db9514be37ec",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * a73169d7357a465b3d8b2a647c0ceaa8b435ee24 UNKNOWN
   * bed8769aa0ba4af511384b6e0555db9514be37ec Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33363) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1070385440


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33229",
       "triggerID" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33305",
       "triggerID" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bc228e4b0ec9e8454e36c98502d7ca740abb3a5d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33315",
       "triggerID" : "bc228e4b0ec9e8454e36c98502d7ca740abb3a5d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a73169d7357a465b3d8b2a647c0ceaa8b435ee24",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "a73169d7357a465b3d8b2a647c0ceaa8b435ee24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "5f30ad142a7c9a1edce730cad5a9abe424641eed",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33332",
       "triggerID" : "5f30ad142a7c9a1edce730cad5a9abe424641eed",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bdeb696bfd88b0a29ced843d0010016faef4d050",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33353",
       "triggerID" : "bdeb696bfd88b0a29ced843d0010016faef4d050",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bed8769aa0ba4af511384b6e0555db9514be37ec",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33363",
       "triggerID" : "bed8769aa0ba4af511384b6e0555db9514be37ec",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8813e5db24962156396d853e770bb539b7c7e622",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33476",
       "triggerID" : "8813e5db24962156396d853e770bb539b7c7e622",
       "triggerType" : "PUSH"
     }, {
       "hash" : "495540809cc9a13e18b11151cdee5925cc0a12ae",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "495540809cc9a13e18b11151cdee5925cc0a12ae",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * a73169d7357a465b3d8b2a647c0ceaa8b435ee24 UNKNOWN
   * 8813e5db24962156396d853e770bb539b7c7e622 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33476) 
   * 495540809cc9a13e18b11151cdee5925cc0a12ae UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1070385440


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33229",
       "triggerID" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33305",
       "triggerID" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bc228e4b0ec9e8454e36c98502d7ca740abb3a5d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33315",
       "triggerID" : "bc228e4b0ec9e8454e36c98502d7ca740abb3a5d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a73169d7357a465b3d8b2a647c0ceaa8b435ee24",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "a73169d7357a465b3d8b2a647c0ceaa8b435ee24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "5f30ad142a7c9a1edce730cad5a9abe424641eed",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33332",
       "triggerID" : "5f30ad142a7c9a1edce730cad5a9abe424641eed",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bdeb696bfd88b0a29ced843d0010016faef4d050",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33353",
       "triggerID" : "bdeb696bfd88b0a29ced843d0010016faef4d050",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bed8769aa0ba4af511384b6e0555db9514be37ec",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33363",
       "triggerID" : "bed8769aa0ba4af511384b6e0555db9514be37ec",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8813e5db24962156396d853e770bb539b7c7e622",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33476",
       "triggerID" : "8813e5db24962156396d853e770bb539b7c7e622",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * a73169d7357a465b3d8b2a647c0ceaa8b435ee24 UNKNOWN
   * 8813e5db24962156396d853e770bb539b7c7e622 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33476) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] dianfu commented on a change in pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
dianfu commented on a change in pull request #19126:
URL: https://github.com/apache/flink/pull/19126#discussion_r831004269



##########
File path: flink-python/pyflink/datastream/data_stream.py
##########
@@ -1174,6 +1174,66 @@ def process_element(self, value, ctx: 'KeyedProcessFunction.Context'):
         return self.process(FilterKeyedProcessFunctionAdapter(func), self._original_data_type_info)\
             .name("Filter")
 
+    def sum(self, position_to_sum: Union[int, str]) -> 'DataStream':
+        """
+        Applies an aggregation that gives a rolling sum of the data stream at the
+        given position grouped by the given key. An independent aggregate is kept
+        per key.
+
+        Example(Tuple data to sum):
+        ::
+
+            >>> ds = env.from_collection([('a', 1), ('a', 2), ('b', 1), ('b', 5)])
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data to sum):
+        ::
+
+            >>> ds = env.from_collection([('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...                                type_info=Types.ROW([Types.STRING(), Types.INT()]))
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data with fields name to sum):
+        ::
+
+            >>> ds = env.from_collection(
+            ...     [('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...     type_info=Types.ROW_NAMED(["key", "value"], [Types.STRING(), Types.INT()])
+            ... )
+            >>> ds.key_by(lambda x: x[0]).sum("value")
+
+        :param position_to_sum:
+            The field position in the data points to sum, type can be int or str.
+            This is applicable to Tuple types, and :class:`pyflink.common.Row` types.
+        :return: The transformed DataStream.
+        """
+        if not isinstance(position_to_sum, int) and not isinstance(position_to_sum, str):
+            raise TypeError("The input must be of int or str type to locate the value to sum")
+
+        class SumReduceFunction(ReduceFunction):
+
+            def __init__(self, position_to_sum):
+                self._pos = position_to_sum
+
+            def reduce(self, value1, value2):
+                from numbers import Number
+                if not isinstance(value1[self._pos], Number):

Review comment:
       What happens when the first element is a number and the second element is not a number?
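   
   One way to guard against that case would be to validate both operands before summing, e.g. (a sketch of the reduce method only, not the code in this PR):
   ```python
   from numbers import Number

   def reduce(self, value1, value2):
       # check both elements, so that a non-numeric field in the *second* value
       # also fails with a clear TypeError rather than an error from '+' below
       for value in (value1, value2):
           if not isinstance(value[self._pos], Number):
               raise TypeError("The value to sum at the given position must be numeric; "
                               f"actual {type(value[self._pos])}, expected Number")
       ...
   ```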




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1070385440


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33229",
       "triggerID" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33305",
       "triggerID" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 2ffe8d696b5598078e5be621650cfedff3e31d2a Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33229) 
   * 494dcf7e47a1da57d157c0f1ed3305746b31f44e Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33305) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1070385440


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33229",
       "triggerID" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33305",
       "triggerID" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bc228e4b0ec9e8454e36c98502d7ca740abb3a5d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33315",
       "triggerID" : "bc228e4b0ec9e8454e36c98502d7ca740abb3a5d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a73169d7357a465b3d8b2a647c0ceaa8b435ee24",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "a73169d7357a465b3d8b2a647c0ceaa8b435ee24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "5f30ad142a7c9a1edce730cad5a9abe424641eed",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33332",
       "triggerID" : "5f30ad142a7c9a1edce730cad5a9abe424641eed",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bdeb696bfd88b0a29ced843d0010016faef4d050",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33353",
       "triggerID" : "bdeb696bfd88b0a29ced843d0010016faef4d050",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bed8769aa0ba4af511384b6e0555db9514be37ec",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "bed8769aa0ba4af511384b6e0555db9514be37ec",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * a73169d7357a465b3d8b2a647c0ceaa8b435ee24 UNKNOWN
   * 5f30ad142a7c9a1edce730cad5a9abe424641eed Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33332) 
   * bdeb696bfd88b0a29ced843d0010016faef4d050 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33353) 
   * bed8769aa0ba4af511384b6e0555db9514be37ec UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1070385440


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33229",
       "triggerID" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33305",
       "triggerID" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bc228e4b0ec9e8454e36c98502d7ca740abb3a5d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33315",
       "triggerID" : "bc228e4b0ec9e8454e36c98502d7ca740abb3a5d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a73169d7357a465b3d8b2a647c0ceaa8b435ee24",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "a73169d7357a465b3d8b2a647c0ceaa8b435ee24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "5f30ad142a7c9a1edce730cad5a9abe424641eed",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33332",
       "triggerID" : "5f30ad142a7c9a1edce730cad5a9abe424641eed",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bdeb696bfd88b0a29ced843d0010016faef4d050",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33353",
       "triggerID" : "bdeb696bfd88b0a29ced843d0010016faef4d050",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bed8769aa0ba4af511384b6e0555db9514be37ec",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "bed8769aa0ba4af511384b6e0555db9514be37ec",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * a73169d7357a465b3d8b2a647c0ceaa8b435ee24 UNKNOWN
   * bdeb696bfd88b0a29ced843d0010016faef4d050 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33353) 
   * bed8769aa0ba4af511384b6e0555db9514be37ec UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1070385440


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33229",
       "triggerID" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33305",
       "triggerID" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bc228e4b0ec9e8454e36c98502d7ca740abb3a5d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33315",
       "triggerID" : "bc228e4b0ec9e8454e36c98502d7ca740abb3a5d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a73169d7357a465b3d8b2a647c0ceaa8b435ee24",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "a73169d7357a465b3d8b2a647c0ceaa8b435ee24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "5f30ad142a7c9a1edce730cad5a9abe424641eed",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33332",
       "triggerID" : "5f30ad142a7c9a1edce730cad5a9abe424641eed",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bdeb696bfd88b0a29ced843d0010016faef4d050",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33353",
       "triggerID" : "bdeb696bfd88b0a29ced843d0010016faef4d050",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bed8769aa0ba4af511384b6e0555db9514be37ec",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "bed8769aa0ba4af511384b6e0555db9514be37ec",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * a73169d7357a465b3d8b2a647c0ceaa8b435ee24 UNKNOWN
   * bdeb696bfd88b0a29ced843d0010016faef4d050 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33353) 
   * bed8769aa0ba4af511384b6e0555db9514be37ec UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1070385440


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33229",
       "triggerID" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33305",
       "triggerID" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bc228e4b0ec9e8454e36c98502d7ca740abb3a5d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33315",
       "triggerID" : "bc228e4b0ec9e8454e36c98502d7ca740abb3a5d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a73169d7357a465b3d8b2a647c0ceaa8b435ee24",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "a73169d7357a465b3d8b2a647c0ceaa8b435ee24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "5f30ad142a7c9a1edce730cad5a9abe424641eed",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33332",
       "triggerID" : "5f30ad142a7c9a1edce730cad5a9abe424641eed",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bdeb696bfd88b0a29ced843d0010016faef4d050",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33353",
       "triggerID" : "bdeb696bfd88b0a29ced843d0010016faef4d050",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bed8769aa0ba4af511384b6e0555db9514be37ec",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33363",
       "triggerID" : "bed8769aa0ba4af511384b6e0555db9514be37ec",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * a73169d7357a465b3d8b2a647c0ceaa8b435ee24 UNKNOWN
   * bdeb696bfd88b0a29ced843d0010016faef4d050 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33353) 
   * bed8769aa0ba4af511384b6e0555db9514be37ec Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33363) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1070385440


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33229",
       "triggerID" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33305",
       "triggerID" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bc228e4b0ec9e8454e36c98502d7ca740abb3a5d",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33315",
       "triggerID" : "bc228e4b0ec9e8454e36c98502d7ca740abb3a5d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a73169d7357a465b3d8b2a647c0ceaa8b435ee24",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "a73169d7357a465b3d8b2a647c0ceaa8b435ee24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "5f30ad142a7c9a1edce730cad5a9abe424641eed",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "5f30ad142a7c9a1edce730cad5a9abe424641eed",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * bc228e4b0ec9e8454e36c98502d7ca740abb3a5d Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33315) 
   * a73169d7357a465b3d8b2a647c0ceaa8b435ee24 UNKNOWN
   * 5f30ad142a7c9a1edce730cad5a9abe424641eed UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] javacaoyu commented on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
javacaoyu commented on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1075892181


   Hi @dianfu,
   I have completed the modifications for this round of review, and the tests pass with pytest, flake8, etc.
   
   The main modifications are:
   1. Fixed the syntax errors in the comments and exception messages.
   2. Changed the value-type check to inspect value2 (the new input element) instead of value1.
   3. Refactored the implementation of the SumReduceFunction class based on your suggestions.
   4. Added support for array (list) types and basic data types (numbers).
   5. Set a default value of 0 for the incoming parameter.
   6. Merged the unit test code, combining multiple test cases into a single env.execute.
   
   When you have time, please take a look and share your suggestions.
   
   Thanks a lot.
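   A minimal sketch of what the refactored, reduce-based sum() described above might look like. The class and parameter names (SumReduceFunction, position_to_sum) come from the quoted diff; the exact branching, error messages and the commented-out wiring into KeyedStream are illustrative assumptions, not the merged code:
   
   ```python
   from numbers import Number
   from typing import Union
   
   from pyflink.common import Row
   from pyflink.datastream.functions import ReduceFunction
   
   
   class SumReduceFunction(ReduceFunction):
       """Rolling-sum reducer covering plain numbers, tuples, lists and Row values."""
   
       def __init__(self, position_to_sum: Union[int, str]):
           self._pos = position_to_sum
   
       def reduce(self, value1, value2):
           # Basic numeric input: the position must be 0 and the value itself is summed.
           if isinstance(value2, Number):
               if self._pos != 0:
                   raise TypeError("The field position must be 0 for a basic numeric type")
               return value1 + value2
   
           # Validate the new input element (value2), as suggested in the review.
           if not isinstance(value2[self._pos], Number):
               raise TypeError("The value to sum at the given position must be numeric; "
                               f"actual {type(value2[self._pos])}")
   
           # For tuple/list inputs the position is expected to be an int;
           # str field names only apply to Row values.
           if isinstance(value1, tuple):
               merged = list(value1)
               merged[self._pos] = value1[self._pos] + value2[self._pos]
               return tuple(merged)
           elif isinstance(value1, list):
               value1[self._pos] = value1[self._pos] + value2[self._pos]
               return value1
           elif isinstance(value1, Row):
               # Row supports indexing by position (int) or by field name (str).
               value1[self._pos] = value1[self._pos] + value2[self._pos]
               return value1
           else:
               raise TypeError("Sum operator only processes numbers, tuples, lists and Row values")
   
   
   # Hypothetical wiring inside KeyedStream (item 5 above: position_to_sum defaults to 0):
   #
   #     def sum(self, position_to_sum: Union[int, str] = 0) -> 'DataStream':
   #         return self.reduce(SumReduceFunction(position_to_sum))
   ```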
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] javacaoyu commented on a change in pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
javacaoyu commented on a change in pull request #19126:
URL: https://github.com/apache/flink/pull/19126#discussion_r831017538



##########
File path: flink-python/pyflink/datastream/data_stream.py
##########
@@ -1174,6 +1174,66 @@ def process_element(self, value, ctx: 'KeyedProcessFunction.Context'):
         return self.process(FilterKeyedProcessFunctionAdapter(func), self._original_data_type_info)\
             .name("Filter")
 
+    def sum(self, position_to_sum: Union[int, str]) -> 'DataStream':
+        """
+        Applies an aggregation that gives a rolling sum of the data stream at the
+        given position grouped by the given key. An independent aggregate is kept
+        per key.
+
+        Example(Tuple data to sum):
+        ::
+
+            >>> ds = env.from_collection([('a', 1), ('a', 2), ('b', 1), ('b', 5)])
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data to sum):
+        ::
+
+            >>> ds = env.from_collection([('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...                                type_info=Types.ROW([Types.STRING(), Types.INT()]))
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data with fields name to sum):
+        ::
+
+            >>> ds = env.from_collection(
+            ...     [('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...     type_info=Types.ROW_NAMED(["key", "value"], [Types.STRING(), Types.INT()])
+            ... )
+            >>> ds.key_by(lambda x: x[0]).sum("value")
+
+        :param position_to_sum:
+            The field position in the data points to sum, type can be int or str.
+            This is applicable to Tuple types, and :class:`pyflink.common.Row` types.
+        :return: The transformed DataStream.
+        """
+        if not isinstance(position_to_sum, int) and not isinstance(position_to_sum, str):
+            raise TypeError("The input must be of int or str type to locate the value to sum")
+
+        class SumReduceFunction(ReduceFunction):
+
+            def __init__(self, position_to_sum):
+                self._pos = position_to_sum
+
+            def reduce(self, value1, value2):
+                from numbers import Number
+                if not isinstance(value1[self._pos], Number):

Review comment:
       Yes, many thanks for your advice; it is enlightening.
   
   You are right, I only considered the normal case and forgot that users may feed in all kinds of input.
   I also understand now that, in any situation, checking the new input is always better than checking the aggregated value, even though both give the same result in most cases.
   
   
   I will fix that bug.
   
   Thanks again, this really helps improve my programming skills.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] dianfu commented on a change in pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
dianfu commented on a change in pull request #19126:
URL: https://github.com/apache/flink/pull/19126#discussion_r830706322



##########
File path: flink-python/pyflink/datastream/data_stream.py
##########
@@ -1174,6 +1174,66 @@ def process_element(self, value, ctx: 'KeyedProcessFunction.Context'):
         return self.process(FilterKeyedProcessFunctionAdapter(func), self._original_data_type_info)\
             .name("Filter")
 
+    def sum(self, position_to_sum: Union[int, str]) -> 'DataStream':
+        """
+        Applies an aggregation that gives a rolling sum of the data stream at the
+        given position grouped by the given key. An independent aggregate is kept
+        per key.
+
+        Example(Tuple data to sum):
+        ::
+
+            >>> ds = env.from_collection([('a', 1), ('a', 2), ('b', 1), ('b', 5)])
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data to sum):
+        ::
+
+            >>> ds = env.from_collection([('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...                                type_info=Types.ROW([Types.STRING(), Types.INT()]))
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data with fields name to sum):
+        ::
+
+            >>> ds = env.from_collection(
+            ...     [('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...     type_info=Types.ROW_NAMED(["key", "value"], [Types.STRING(), Types.INT()])
+            ... )
+            >>> ds.key_by(lambda x: x[0]).sum("value")
+
+        :param position_to_sum:
+            The field position in the data points to sum, type can be int or str.
+            This is applicable to Tuple types, and :class:`pyflink.common.Row` types.
+        :return: The transformed DataStream.
+        """
+        if not isinstance(position_to_sum, int) and not isinstance(position_to_sum, str):
+            raise TypeError("The input must be of int or str type to locate the value to sum")
+
+        class SumReduceFunction(ReduceFunction):
+
+            def __init__(self, position_to_sum):
+                self._pos = position_to_sum
+
+            def reduce(self, value1, value2):
+                from numbers import Number
+                if not isinstance(value1[self._pos], Number):
+                    raise TypeError("The value to sum by given position must be of numeric type; "
+                                    f"actual {type(value1[self._pos])}, expected Number")
+                if isinstance(value1, tuple):
+                    value1_list = list(value1)
+                    value1_list[self._pos] = value1[self._pos] + value2[self._pos]
+                    value1 = tuple(value1_list)
+                elif isinstance(value1, Row):
+                    value1[self._pos] = value1[self._pos] + value2[self._pos]
+                else:
+                    raise TypeError("Sum operator only process the data of "
+                                    "Tuple type and {pyflink.common.types.Row} type. "

Review comment:
       ```suggestion
                                       "Tuple type and Row type. "
   ```

##########
File path: flink-python/pyflink/datastream/data_stream.py
##########
@@ -1174,6 +1174,66 @@ def process_element(self, value, ctx: 'KeyedProcessFunction.Context'):
         return self.process(FilterKeyedProcessFunctionAdapter(func), self._original_data_type_info)\
             .name("Filter")
 
+    def sum(self, position_to_sum: Union[int, str]) -> 'DataStream':
+        """
+        Applies an aggregation that gives a rolling sum of the data stream at the
+        given position grouped by the given key. An independent aggregate is kept
+        per key.
+
+        Example(Tuple data to sum):
+        ::
+
+            >>> ds = env.from_collection([('a', 1), ('a', 2), ('b', 1), ('b', 5)])
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data to sum):
+        ::
+
+            >>> ds = env.from_collection([('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...                                type_info=Types.ROW([Types.STRING(), Types.INT()]))
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data with fields name to sum):
+        ::
+
+            >>> ds = env.from_collection(
+            ...     [('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...     type_info=Types.ROW_NAMED(["key", "value"], [Types.STRING(), Types.INT()])
+            ... )
+            >>> ds.key_by(lambda x: x[0]).sum("value")
+
+        :param position_to_sum:
+            The field position in the data points to sum, type can be int or str.
+            This is applicable to Tuple types, and :class:`pyflink.common.Row` types.
+        :return: The transformed DataStream.
+        """
+        if not isinstance(position_to_sum, int) and not isinstance(position_to_sum, str):
+            raise TypeError("The input must be of int or str type to locate the value to sum")
+
+        class SumReduceFunction(ReduceFunction):
+
+            def __init__(self, position_to_sum):
+                self._pos = position_to_sum
+
+            def reduce(self, value1, value2):
+                from numbers import Number
+                if not isinstance(value1[self._pos], Number):
+                    raise TypeError("The value to sum by given position must be of numeric type; "
+                                    f"actual {type(value1[self._pos])}, expected Number")
+                if isinstance(value1, tuple):

Review comment:
       Is it possible that the input type is a list?

##########
File path: flink-python/pyflink/datastream/tests/test_data_stream.py
##########
@@ -1072,6 +1072,44 @@ def test_reduce_with_state(self):
         expected = ['+I[a, 0]', '+I[ab, 0]', '+I[c, 1]', '+I[cd, 1]', '+I[cde, 1]']
         self.assert_equals_sorted(expected, results)
 
+    def test_keyed_sum_with_tuple_type(self):

Review comment:
       What about merging the added test cases into one, e.g. sum(xxx).sum(yyy).sum(zzz)? IT cases are expensive and we should try to reduce them as much as possible.
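   One possible shape for such a merged IT case, building on the helpers visible in the quoted test file (self.env, self.test_sink, Types); the second key_by is needed because sum() returns a DataStream, and the expected values are omitted here since they differ between streaming and batch execution:
   
   ```python
       def test_keyed_sum(self):
           # A single job that exercises sum() twice, so only one env.execute() is needed.
           ds = self.env.from_collection(
               [('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
               type_info=Types.TUPLE([Types.STRING(), Types.INT()]))
   
           ds.key_by(lambda x: x[0], key_type=Types.STRING()) \
               .sum(1) \
               .key_by(lambda x: x[0], key_type=Types.STRING()) \
               .sum(1) \
               .add_sink(self.test_sink)
   
           self.env.execute('test_keyed_sum')
           results = self.test_sink.get_results(False)
           # Assertions over `results` would go here; the expected rolling sums differ
           # between StreamingModeDataStreamTests and BatchModeDataStreamTests.
   ```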

##########
File path: flink-python/pyflink/datastream/data_stream.py
##########
@@ -1174,6 +1174,66 @@ def process_element(self, value, ctx: 'KeyedProcessFunction.Context'):
         return self.process(FilterKeyedProcessFunctionAdapter(func), self._original_data_type_info)\
             .name("Filter")
 
+    def sum(self, position_to_sum: Union[int, str]) -> 'DataStream':
+        """
+        Applies an aggregation that gives a rolling sum of the data stream at the
+        given position grouped by the given key. An independent aggregate is kept
+        per key.
+
+        Example(Tuple data to sum):
+        ::
+
+            >>> ds = env.from_collection([('a', 1), ('a', 2), ('b', 1), ('b', 5)])
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data to sum):
+        ::
+
+            >>> ds = env.from_collection([('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...                                type_info=Types.ROW([Types.STRING(), Types.INT()]))
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data with fields name to sum):
+        ::
+
+            >>> ds = env.from_collection(
+            ...     [('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...     type_info=Types.ROW_NAMED(["key", "value"], [Types.STRING(), Types.INT()])
+            ... )
+            >>> ds.key_by(lambda x: x[0]).sum("value")
+
+        :param position_to_sum:
+            The field position in the data points to sum, type can be int or str.
+            This is applicable to Tuple types, and :class:`pyflink.common.Row` types.
+        :return: The transformed DataStream.
+        """
+        if not isinstance(position_to_sum, int) and not isinstance(position_to_sum, str):
+            raise TypeError("The input must be of int or str type to locate the value to sum")
+
+        class SumReduceFunction(ReduceFunction):
+
+            def __init__(self, position_to_sum):
+                self._pos = position_to_sum
+
+            def reduce(self, value1, value2):
+                from numbers import Number
+                if not isinstance(value1[self._pos], Number):

Review comment:
       I guess you should check value2, which is the input element?

##########
File path: flink-python/pyflink/datastream/data_stream.py
##########
@@ -1174,6 +1174,66 @@ def process_element(self, value, ctx: 'KeyedProcessFunction.Context'):
         return self.process(FilterKeyedProcessFunctionAdapter(func), self._original_data_type_info)\
             .name("Filter")
 
+    def sum(self, position_to_sum: Union[int, str]) -> 'DataStream':
+        """
+        Applies an aggregation that gives a rolling sum of the data stream at the
+        given position grouped by the given key. An independent aggregate is kept
+        per key.
+
+        Example(Tuple data to sum):
+        ::
+
+            >>> ds = env.from_collection([('a', 1), ('a', 2), ('b', 1), ('b', 5)])
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data to sum):
+        ::
+
+            >>> ds = env.from_collection([('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...                                type_info=Types.ROW([Types.STRING(), Types.INT()]))
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data with fields name to sum):
+        ::
+
+            >>> ds = env.from_collection(
+            ...     [('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...     type_info=Types.ROW_NAMED(["key", "value"], [Types.STRING(), Types.INT()])
+            ... )
+            >>> ds.key_by(lambda x: x[0]).sum("value")
+
+        :param position_to_sum:
+            The field position in the data points to sum, type can be int or str.
+            This is applicable to Tuple types, and :class:`pyflink.common.Row` types.
+        :return: The transformed DataStream.
+        """
+        if not isinstance(position_to_sum, int) and not isinstance(position_to_sum, str):
+            raise TypeError("The input must be of int or str type to locate the value to sum")

Review comment:
       ```suggestion
               raise TypeError("The field position must be of int or str type to locate the value to sum")
   ```

##########
File path: flink-python/pyflink/datastream/data_stream.py
##########
@@ -1174,6 +1174,66 @@ def process_element(self, value, ctx: 'KeyedProcessFunction.Context'):
         return self.process(FilterKeyedProcessFunctionAdapter(func), self._original_data_type_info)\
             .name("Filter")
 
+    def sum(self, position_to_sum: Union[int, str]) -> 'DataStream':
+        """
+        Applies an aggregation that gives a rolling sum of the data stream at the
+        given position grouped by the given key. An independent aggregate is kept
+        per key.
+
+        Example(Tuple data to sum):
+        ::
+
+            >>> ds = env.from_collection([('a', 1), ('a', 2), ('b', 1), ('b', 5)])
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data to sum):
+        ::
+
+            >>> ds = env.from_collection([('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...                                type_info=Types.ROW([Types.STRING(), Types.INT()]))
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data with fields name to sum):
+        ::
+
+            >>> ds = env.from_collection(
+            ...     [('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...     type_info=Types.ROW_NAMED(["key", "value"], [Types.STRING(), Types.INT()])
+            ... )
+            >>> ds.key_by(lambda x: x[0]).sum("value")
+
+        :param position_to_sum:
+            The field position in the data points to sum, type can be int or str.
+            This is applicable to Tuple types, and :class:`pyflink.common.Row` types.
+        :return: The transformed DataStream.
+        """
+        if not isinstance(position_to_sum, int) and not isinstance(position_to_sum, str):
+            raise TypeError("The input must be of int or str type to locate the value to sum")
+
+        class SumReduceFunction(ReduceFunction):
+
+            def __init__(self, position_to_sum):
+                self._pos = position_to_sum
+
+            def reduce(self, value1, value2):
+                from numbers import Number
+                if not isinstance(value1[self._pos], Number):
+                    raise TypeError("The value to sum by given position must be of numeric type; "
+                                    f"actual {type(value1[self._pos])}, expected Number")
+                if isinstance(value1, tuple):
+                    value1_list = list(value1)
+                    value1_list[self._pos] = value1[self._pos] + value2[self._pos]
+                    value1 = tuple(value1_list)
+                elif isinstance(value1, Row):
+                    value1[self._pos] = value1[self._pos] + value2[self._pos]
+                else:

Review comment:
       In the Java implementation, the following cases are also supported:
   1) the input type is an array
   2) the input type is a basic type and the position is 0




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] javacaoyu commented on a change in pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
javacaoyu commented on a change in pull request #19126:
URL: https://github.com/apache/flink/pull/19126#discussion_r829785324



##########
File path: flink-python/pyflink/datastream/data_stream.py
##########
@@ -1168,6 +1168,96 @@ def process_element(self, value, ctx: 'KeyedProcessFunction.Context'):
         return self.process(FilterKeyedProcessFunctionAdapter(func), self._original_data_type_info)\
             .name("Filter")
 
+    def sum(self, position_to_sum: Union[int, str]) -> 'DataStream':
+        """
+        Applies an aggregation that gives a rolling sum of the data stream at the
+        given position grouped by the given key. An independent aggregate is kept
+        per key.
+
+        Example(Tuple data to sum):
+        ::
+
+            >>> ds = env.from_collection([('a', 1), ('a', 2), ('b', 1), ('b', 5)])
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data to sum):
+        ::
+
+            >>> ds = self.env.from_collection([('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...                                type_info=Types.ROW([Types.STRING(), Types.INT()]))
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data with fields name to sum):
+        ::
+
+            >>> ds = self.env.from_collection(
+            ...     [('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...     type_info=Types.ROW_NAMED(["key", "value"], [Types.STRING(), Types.INT()])
+            ... )
+            >>> ds.key_by(lambda x: x[0]).sum("value")
+
+        :param position_to_sum:
+            The field position in the data points to sum, type can be int or str.
+            This is applicable to Tuple types, and {pyflink.common.types.Row} types.
+        :return: The transformed DataStream.
+        """
+        if not isinstance(position_to_sum, int) and not isinstance(position_to_sum, str):
+            raise TypeError("The input must be a int or str type for locate the value to sum")
+
+        output_type = _from_java_type(self._original_data_type_info.get_java_type_info())
+
+        class SumKeyedProcessFunctionAdapter(KeyedProcessFunction):

Review comment:
       It's a good idea.
   Logically, using reduce as the underlying implementation of the sum operation is a good design, I think.
   I'll try to use a ReduceFunction in the new design and see if it makes the implementation easier.
   
   




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] dianfu commented on a change in pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
dianfu commented on a change in pull request #19126:
URL: https://github.com/apache/flink/pull/19126#discussion_r829715015



##########
File path: flink-python/pyflink/datastream/data_stream.py
##########
@@ -1168,6 +1168,96 @@ def process_element(self, value, ctx: 'KeyedProcessFunction.Context'):
         return self.process(FilterKeyedProcessFunctionAdapter(func), self._original_data_type_info)\
             .name("Filter")
 
+    def sum(self, position_to_sum: Union[int, str]) -> 'DataStream':
+        """
+        Applies an aggregation that gives a rolling sum of the data stream at the
+        given position grouped by the given key. An independent aggregate is kept
+        per key.
+
+        Example(Tuple data to sum):
+        ::
+
+            >>> ds = env.from_collection([('a', 1), ('a', 2), ('b', 1), ('b', 5)])
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data to sum):
+        ::
+
+            >>> ds = self.env.from_collection([('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],

Review comment:
       ```suggestion
               >>> ds = env.from_collection([('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
   ```

##########
File path: flink-python/pyflink/datastream/data_stream.py
##########
@@ -1168,6 +1168,96 @@ def process_element(self, value, ctx: 'KeyedProcessFunction.Context'):
         return self.process(FilterKeyedProcessFunctionAdapter(func), self._original_data_type_info)\
             .name("Filter")
 
+    def sum(self, position_to_sum: Union[int, str]) -> 'DataStream':
+        """
+        Applies an aggregation that gives a rolling sum of the data stream at the
+        given position grouped by the given key. An independent aggregate is kept
+        per key.
+
+        Example(Tuple data to sum):
+        ::
+
+            >>> ds = env.from_collection([('a', 1), ('a', 2), ('b', 1), ('b', 5)])
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data to sum):
+        ::
+
+            >>> ds = self.env.from_collection([('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...                                type_info=Types.ROW([Types.STRING(), Types.INT()]))
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data with fields name to sum):
+        ::
+
+            >>> ds = self.env.from_collection(
+            ...     [('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...     type_info=Types.ROW_NAMED(["key", "value"], [Types.STRING(), Types.INT()])
+            ... )
+            >>> ds.key_by(lambda x: x[0]).sum("value")
+
+        :param position_to_sum:
+            The field position in the data points to sum, type can be int or str.
+            This is applicable to Tuple types, and {pyflink.common.types.Row} types.

Review comment:
       ```suggestion
               This is applicable to Tuple types, and :class:`pyflink.common.Row` types.
   ```

##########
File path: flink-python/pyflink/datastream/data_stream.py
##########
@@ -1168,6 +1168,96 @@ def process_element(self, value, ctx: 'KeyedProcessFunction.Context'):
         return self.process(FilterKeyedProcessFunctionAdapter(func), self._original_data_type_info)\
             .name("Filter")
 
+    def sum(self, position_to_sum: Union[int, str]) -> 'DataStream':
+        """
+        Applies an aggregation that gives a rolling sum of the data stream at the
+        given position grouped by the given key. An independent aggregate is kept
+        per key.
+
+        Example(Tuple data to sum):
+        ::
+
+            >>> ds = env.from_collection([('a', 1), ('a', 2), ('b', 1), ('b', 5)])
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data to sum):
+        ::
+
+            >>> ds = self.env.from_collection([('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...                                type_info=Types.ROW([Types.STRING(), Types.INT()]))
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data with fields name to sum):
+        ::
+
+            >>> ds = self.env.from_collection(
+            ...     [('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...     type_info=Types.ROW_NAMED(["key", "value"], [Types.STRING(), Types.INT()])
+            ... )
+            >>> ds.key_by(lambda x: x[0]).sum("value")
+
+        :param position_to_sum:
+            The field position in the data points to sum, type can be int or str.
+            This is applicable to Tuple types, and {pyflink.common.types.Row} types.
+        :return: The transformed DataStream.
+        """
+        if not isinstance(position_to_sum, int) and not isinstance(position_to_sum, str):
+            raise TypeError("The input must be a int or str type for locate the value to sum")
+
+        output_type = _from_java_type(self._original_data_type_info.get_java_type_info())
+
+        class SumKeyedProcessFunctionAdapter(KeyedProcessFunction):

Review comment:
       What about creating a ReduceFunction and then calling self.reduce? It could simplify the implementation.
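   A minimal sketch of the delegation suggested here, assuming a SumReduceFunction along the lines discussed elsewhere in this thread (its body is omitted):
   
   ```python
       def sum(self, position_to_sum: Union[int, str]) -> 'DataStream':
           # Rather than re-implementing per-key state in a KeyedProcessFunction,
           # wrap the summing logic in a ReduceFunction and reuse the existing
           # reduce() operator, which already keeps one aggregate per key.
           return self.reduce(SumReduceFunction(position_to_sum))
   ```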

##########
File path: flink-python/pyflink/datastream/tests/test_data_stream.py
##########
@@ -455,6 +455,56 @@ def filter(self, value):
         expected = ['+I[c, 1]', '+I[e, 2]']
         self.assert_equals_sorted(expected, results)
 
+    def test_keyed_sum_with_tuple_type(self):
+        ds = self.env.from_collection([('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+                                      type_info=Types.TUPLE([Types.STRING(), Types.INT()]))
+        keyed_stream = ds.key_by(lambda x: x[0], key_type=Types.STRING())
+
+        keyed_stream.sum(1)\
+            .add_sink(self.test_sink)
+        self.env.execute('key_by_sum_test_with_tuple_type')
+        results = self.test_sink.get_results(False)
+        if self.__class__ == StreamingModeDataStreamTests:

Review comment:
       What about splitting the test cases into StreamingModeDataStreamTests and BatchModeDataStreamTests separately, so that we could avoid this check?

##########
File path: flink-python/pyflink/datastream/data_stream.py
##########
@@ -1168,6 +1168,96 @@ def process_element(self, value, ctx: 'KeyedProcessFunction.Context'):
         return self.process(FilterKeyedProcessFunctionAdapter(func), self._original_data_type_info)\
             .name("Filter")
 
+    def sum(self, position_to_sum: Union[int, str]) -> 'DataStream':
+        """
+        Applies an aggregation that gives a rolling sum of the data stream at the
+        given position grouped by the given key. An independent aggregate is kept
+        per key.
+
+        Example(Tuple data to sum):
+        ::
+
+            >>> ds = env.from_collection([('a', 1), ('a', 2), ('b', 1), ('b', 5)])
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data to sum):
+        ::
+
+            >>> ds = self.env.from_collection([('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...                                type_info=Types.ROW([Types.STRING(), Types.INT()]))
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data with fields name to sum):
+        ::
+
+            >>> ds = self.env.from_collection(
+            ...     [('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...     type_info=Types.ROW_NAMED(["key", "value"], [Types.STRING(), Types.INT()])
+            ... )
+            >>> ds.key_by(lambda x: x[0]).sum("value")
+
+        :param position_to_sum:
+            The field position in the data points to sum, type can be int or str.
+            This is applicable to Tuple types, and {pyflink.common.types.Row} types.
+        :return: The transformed DataStream.
+        """
+        if not isinstance(position_to_sum, int) and not isinstance(position_to_sum, str):
+            raise TypeError("The input must be a int or str type for locate the value to sum")

Review comment:
       ```suggestion
               raise TypeError("The input must be of of int or str type to locate the value to sum")
   ```

##########
File path: flink-python/pyflink/datastream/data_stream.py
##########
@@ -1168,6 +1168,96 @@ def process_element(self, value, ctx: 'KeyedProcessFunction.Context'):
         return self.process(FilterKeyedProcessFunctionAdapter(func), self._original_data_type_info)\
             .name("Filter")
 
+    def sum(self, position_to_sum: Union[int, str]) -> 'DataStream':
+        """
+        Applies an aggregation that gives a rolling sum of the data stream at the
+        given position grouped by the given key. An independent aggregate is kept
+        per key.
+
+        Example(Tuple data to sum):
+        ::
+
+            >>> ds = env.from_collection([('a', 1), ('a', 2), ('b', 1), ('b', 5)])
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data to sum):
+        ::
+
+            >>> ds = self.env.from_collection([('a', 1), ('a', 2), ('a', 3), ('b', 1), ('b', 2)],
+            ...                                type_info=Types.ROW([Types.STRING(), Types.INT()]))
+            >>> ds.key_by(lambda x: x[0]).sum(1)
+
+        Example(Row data with fields name to sum):
+        ::
+
+            >>> ds = self.env.from_collection(

Review comment:
       ```suggestion
               >>> ds = env.from_collection(
   ```




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1070385440


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33229",
       "triggerID" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33305",
       "triggerID" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bc228e4b0ec9e8454e36c98502d7ca740abb3a5d",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33315",
       "triggerID" : "bc228e4b0ec9e8454e36c98502d7ca740abb3a5d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a73169d7357a465b3d8b2a647c0ceaa8b435ee24",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "a73169d7357a465b3d8b2a647c0ceaa8b435ee24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "5f30ad142a7c9a1edce730cad5a9abe424641eed",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "5f30ad142a7c9a1edce730cad5a9abe424641eed",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * bc228e4b0ec9e8454e36c98502d7ca740abb3a5d Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33315) 
   * a73169d7357a465b3d8b2a647c0ceaa8b435ee24 UNKNOWN
   * 5f30ad142a7c9a1edce730cad5a9abe424641eed UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] javacaoyu commented on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
javacaoyu commented on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1072148169


   @dianfu Thanks for your guidance. I will fix the issues you raised.
   
   
   By the way, I'm interested in implementing the other methods such as min/min_by/max/max_by.
   
   Once this PR is finished, I can try to implement one of those methods first if you assign the JIRA task to me.
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1070385440






-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1070385440


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33229",
       "triggerID" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33305",
       "triggerID" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bc228e4b0ec9e8454e36c98502d7ca740abb3a5d",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33315",
       "triggerID" : "bc228e4b0ec9e8454e36c98502d7ca740abb3a5d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a73169d7357a465b3d8b2a647c0ceaa8b435ee24",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "a73169d7357a465b3d8b2a647c0ceaa8b435ee24",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 494dcf7e47a1da57d157c0f1ed3305746b31f44e Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33305) 
   * bc228e4b0ec9e8454e36c98502d7ca740abb3a5d Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33315) 
   * a73169d7357a465b3d8b2a647c0ceaa8b435ee24 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot commented on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
flinkbot commented on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1070385440


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 2ffe8d696b5598078e5be621650cfedff3e31d2a UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1070385440


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33229",
       "triggerID" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33305",
       "triggerID" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bc228e4b0ec9e8454e36c98502d7ca740abb3a5d",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33315",
       "triggerID" : "bc228e4b0ec9e8454e36c98502d7ca740abb3a5d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a73169d7357a465b3d8b2a647c0ceaa8b435ee24",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "a73169d7357a465b3d8b2a647c0ceaa8b435ee24",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * bc228e4b0ec9e8454e36c98502d7ca740abb3a5d Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33315) 
   * a73169d7357a465b3d8b2a647c0ceaa8b435ee24 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1070385440


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33229",
       "triggerID" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33305",
       "triggerID" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bc228e4b0ec9e8454e36c98502d7ca740abb3a5d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33315",
       "triggerID" : "bc228e4b0ec9e8454e36c98502d7ca740abb3a5d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a73169d7357a465b3d8b2a647c0ceaa8b435ee24",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "a73169d7357a465b3d8b2a647c0ceaa8b435ee24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "5f30ad142a7c9a1edce730cad5a9abe424641eed",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33332",
       "triggerID" : "5f30ad142a7c9a1edce730cad5a9abe424641eed",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bdeb696bfd88b0a29ced843d0010016faef4d050",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33353",
       "triggerID" : "bdeb696bfd88b0a29ced843d0010016faef4d050",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bed8769aa0ba4af511384b6e0555db9514be37ec",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33363",
       "triggerID" : "bed8769aa0ba4af511384b6e0555db9514be37ec",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8813e5db24962156396d853e770bb539b7c7e622",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "8813e5db24962156396d853e770bb539b7c7e622",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * a73169d7357a465b3d8b2a647c0ceaa8b435ee24 UNKNOWN
   * bed8769aa0ba4af511384b6e0555db9514be37ec Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33363) 
   * 8813e5db24962156396d853e770bb539b7c7e622 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1070385440


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33229",
       "triggerID" : "2ffe8d696b5598078e5be621650cfedff3e31d2a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33305",
       "triggerID" : "494dcf7e47a1da57d157c0f1ed3305746b31f44e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bc228e4b0ec9e8454e36c98502d7ca740abb3a5d",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33315",
       "triggerID" : "bc228e4b0ec9e8454e36c98502d7ca740abb3a5d",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a73169d7357a465b3d8b2a647c0ceaa8b435ee24",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "a73169d7357a465b3d8b2a647c0ceaa8b435ee24",
       "triggerType" : "PUSH"
     }, {
       "hash" : "5f30ad142a7c9a1edce730cad5a9abe424641eed",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33332",
       "triggerID" : "5f30ad142a7c9a1edce730cad5a9abe424641eed",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bdeb696bfd88b0a29ced843d0010016faef4d050",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33353",
       "triggerID" : "bdeb696bfd88b0a29ced843d0010016faef4d050",
       "triggerType" : "PUSH"
     }, {
       "hash" : "bed8769aa0ba4af511384b6e0555db9514be37ec",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33363",
       "triggerID" : "bed8769aa0ba4af511384b6e0555db9514be37ec",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8813e5db24962156396d853e770bb539b7c7e622",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "8813e5db24962156396d853e770bb539b7c7e622",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * a73169d7357a465b3d8b2a647c0ceaa8b435ee24 UNKNOWN
   * bed8769aa0ba4af511384b6e0555db9514be37ec Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33363) 
   * 8813e5db24962156396d853e770bb539b7c7e622 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #19126: [FLINK-26609][python] Support sum operation in KeyedStream

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #19126:
URL: https://github.com/apache/flink/pull/19126#issuecomment-1070385440


   ## CI report:
   
   * a73169d7357a465b3d8b2a647c0ceaa8b435ee24 UNKNOWN
   * 8813e5db24962156396d853e770bb539b7c7e622 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=33476) 
   * 495540809cc9a13e18b11151cdee5925cc0a12ae UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>

