Posted to issues@flink.apache.org by GitBox <gi...@apache.org> on 2022/01/20 15:41:21 UTC

[GitHub] [flink] slinkydeveloper opened a new pull request #18427: [FLINK-25386][table] Harden table persisted plan

slinkydeveloper opened a new pull request #18427:
URL: https://github.com/apache/flink/pull/18427


   ## What is the purpose of the change
   
   Harden persisted plan ser/de for `DynamicTableSink` and `DynamicTableSource` specs
   
   ## Brief change log
   
   * Add `Parser` to `SerdeContext`
   * Use the serializable String representation for `ObjectIdentifier`
   * Unify the test utilities (`JsonSerdeMocks` and `JsonSerdeTestUtil`)
   * Modify structure of `DynamicTableSinkSpec` and `DynamicTableSourceSpec` specs to use `ContextResolvedTable` rather than `CatalogTableSpecBase`
   * Implement ser/de of `ResolvedCatalogTable`
   * Implement ser/de of `ContextResolvedTable`
   * Fix ser/de of `DynamicTableSourceSpec`, `DynamicTableSinkSpec` and `TemporalTableSinkSpec`
   
   Some additional hotfixes:
   
   * Better error reporting in `CatalogManager`
   * More Javadoc on `DynamicTableFactory.Context#getObjectIdentifier`
   * Add equals/hashCode to `ContextResolvedTable`, `DefaultCatalogTable`, `ResolvedCatalogTable` and `RexNodeExpression`, mostly for testing purposes
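   The "serializable String representation for `ObjectIdentifier`" mentioned above can be illustrated with a minimal, stdlib-only sketch. Note this `ObjectIdentifier` is a hypothetical stand-in with a plausible back-quoting scheme, not Flink's actual class:

   ```java
   // Hypothetical stand-in for a three-part table identifier.
   record ObjectIdentifier(String catalog, String database, String objectName) {

       // Serializable string form: each part back-quoted and dot-joined, so the
       // three parts remain unambiguous even when a part itself contains a dot.
       String asSerializableString() {
           return String.join(".", quote(catalog), quote(database), quote(objectName));
       }

       private static String quote(String part) {
           // Escape embedded backticks by doubling them, then wrap in backticks.
           return "`" + part.replace("`", "``") + "`";
       }
   }
   ```

   Under this scheme an identifier with database `my.db` round-trips safely, because the dot inside the quoted part cannot be confused with a separator.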
   
   ## Verifying this change
   
   This change includes a rework of the tests so that they check every case in the matrix of possible combinations.
   
   ## Does this pull request potentially affect one of the following parts:
   
     - Dependencies (does it add or upgrade a dependency): no
     - The public API, i.e., is any changed class annotated with `@Public(Evolving)`: no
   
   ## Documentation
   
     - Does this pull request introduce a new feature? no
     - If yes, how is the feature documented? not applicable
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18427:
URL: https://github.com/apache/flink/pull/18427#issuecomment-1017644617


   ## CI report:
   
   * a69e8836becd5bbdedd183376c67dae35afc2960 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


[GitHub] [flink] twalthr commented on a change in pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
twalthr commented on a change in pull request #18427:
URL: https://github.com/apache/flink/pull/18427#discussion_r794288516



##########
File path: flink-table/flink-table-planner/src/test/scala/org/apache/flink/table/planner/utils/TableTestBase.scala
##########
@@ -795,19 +797,20 @@ abstract class TableTestUtilBase(test: TableTestBase, isStreamingMode: Boolean)
       val plannerDirPath = clazz.getResource("/").getFile.replace("/target/test-classes/", "")
       new File(s"$plannerDirPath/src/test/resources$resourceTestFilePath")
     }
-    if (file.exists()) {
+    if (!file.exists() || "true".equalsIgnoreCase(System.getenv(PLAN_TEST_FORCE_OVERWRITE))) {
+      Files.deleteIfExists(file.toPath)
+      file.getParentFile.mkdirs()
+      assertTrue(file.createNewFile())
+      val prettyJson = TableTestUtil.getPrettyJson(jsonPlanWithoutFlinkVersion)
+      Files.write(Paths.get(file.toURI), prettyJson.getBytes)

Review comment:
       Use `Paths.get(file.toURI)` also for deletion? I think it has proven to work in all setups.
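       The suggestion above, resolving the path via `file.toURI()` for the deletion as well as the write, could look like the following hedged Java sketch (the `overwrite` helper is hypothetical, mirroring the Scala test utility in the diff):

       ```java
       import java.io.File;
       import java.nio.file.Files;
       import java.nio.file.Paths;

       public class OverwritePlanFile {
           // Delete any stale plan file and rewrite it, resolving the path via
           // file.toURI() for both the delete and the write, so the same path
           // resolution is used consistently across setups.
           static void overwrite(File file, String prettyJson) throws Exception {
               Files.deleteIfExists(Paths.get(file.toURI()));
               file.getParentFile().mkdirs();
               // Files.write creates the file if it does not exist.
               Files.write(Paths.get(file.toURI()), prettyJson.getBytes());
           }
       }
       ```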
   







   * 4b1fa7a01306c01a597b30f45c49d220fdc59dc7 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30311) 
   * dfd7ce9e194c47511c26ee1030e2db044f128a12 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30318) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] slinkydeveloper commented on a change in pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
slinkydeveloper commented on a change in pull request #18427:
URL: https://github.com/apache/flink/pull/18427#discussion_r791014794



##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/ContextResolvedTableJsonSerializer.java
##########
@@ -0,0 +1,98 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.table.api.ValidationException;
+import org.apache.flink.table.api.config.TableConfigOptions;
+import org.apache.flink.table.api.config.TableConfigOptions.CatalogPlanCompilation;
+import org.apache.flink.table.catalog.ContextResolvedTable;
+import org.apache.flink.table.catalog.ObjectIdentifier;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonGenerator;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.SerializerProvider;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ser.std.StdSerializer;
+
+import java.io.IOException;
+
+/** JSON serializer for {@link ContextResolvedTable}. */
+class ContextResolvedTableJsonSerializer extends StdSerializer<ContextResolvedTable> {
+    private static final long serialVersionUID = 1L;
+
+    public static final String FIELD_NAME_IDENTIFIER = "identifier";
+    public static final String FIELD_NAME_CATALOG_TABLE = "catalogTable";
+
+    public ContextResolvedTableJsonSerializer() {
+        super(ContextResolvedTable.class);
+    }
+
+    @Override
+    public void serialize(
+            ContextResolvedTable contextResolvedTable,
+            JsonGenerator jsonGenerator,
+            SerializerProvider serializerProvider)
+            throws IOException {
+        final CatalogPlanCompilation planCompilationOption =
+                SerdeContext.get(serializerProvider)
+                        .getConfiguration()
+                        .get(TableConfigOptions.PLAN_COMPILE_CATALOG_OBJECTS);
+
+        if (contextResolvedTable.isAnonymous()
+                && planCompilationOption == CatalogPlanCompilation.IDENTIFIER) {
+            throw cannotSerializeAnonymousTable(contextResolvedTable.getIdentifier());
+        }
+
+        jsonGenerator.writeStartObject();
+
+        if (!contextResolvedTable.isAnonymous()) {
+            // Serialize object identifier
+            jsonGenerator.writeObjectField(
+                    FIELD_NAME_IDENTIFIER, contextResolvedTable.getIdentifier());
+        }
+
+        if ((contextResolvedTable.isPermanent() || contextResolvedTable.isAnonymous())
+                && planCompilationOption != CatalogPlanCompilation.IDENTIFIER) {
+            // Tell the ResolvedCatalogTableJsonSerializer whether to serialize the
+            // identifier
+            serializerProvider.setAttribute(

Review comment:
       It is, I removed it







[GitHub] [flink] flinkbot edited a comment on pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18427:
URL: https://github.com/apache/flink/pull/18427#issuecomment-1017644617


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812",
       "triggerID" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "triggerType" : "PUSH"
     }, {
       "hash" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29876",
       "triggerID" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29885",
       "triggerID" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8c843d680e45ef8b14d3abf5d75f785c889c4a33",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "8c843d680e45ef8b14d3abf5d75f785c889c4a33",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * a69e8836becd5bbdedd183376c67dae35afc2960 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812) 
   * b83fd153ef3df414e6a7766e26ccd37f788d728e Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29876) 
   * 367e6ab469bb0c69923e73dc458b74650bef48b1 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29885) 
   * 8c843d680e45ef8b14d3abf5d75f785c889c4a33 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18427:
URL: https://github.com/apache/flink/pull/18427#issuecomment-1017644617


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812",
       "triggerID" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "triggerType" : "PUSH"
     }, {
       "hash" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29876",
       "triggerID" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29885",
       "triggerID" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * a69e8836becd5bbdedd183376c67dae35afc2960 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812) 
   * b83fd153ef3df414e6a7766e26ccd37f788d728e Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29876) 
   * 367e6ab469bb0c69923e73dc458b74650bef48b1 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29885) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18427:
URL: https://github.com/apache/flink/pull/18427#issuecomment-1017644617


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812",
       "triggerID" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "triggerType" : "PUSH"
     }, {
       "hash" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29876",
       "triggerID" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29885",
       "triggerID" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9857ee8f41ffb6220d5c4c0ad7cd4e84806f740d",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "9857ee8f41ffb6220d5c4c0ad7cd4e84806f740d",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * a69e8836becd5bbdedd183376c67dae35afc2960 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812) 
   * b83fd153ef3df414e6a7766e26ccd37f788d728e Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29876) 
   * 367e6ab469bb0c69923e73dc458b74650bef48b1 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29885) 
   * 9857ee8f41ffb6220d5c4c0ad7cd4e84806f740d UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] slinkydeveloper commented on a change in pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
slinkydeveloper commented on a change in pull request #18427:
URL: https://github.com/apache/flink/pull/18427#discussion_r791013866



##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/ResolvedCatalogTableJsonSerializer.java
##########
@@ -0,0 +1,86 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.table.api.ValidationException;
+import org.apache.flink.table.catalog.CatalogBaseTable;
+import org.apache.flink.table.catalog.ExternalCatalogTable;
+import org.apache.flink.table.catalog.ResolvedCatalogTable;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonGenerator;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.SerializerProvider;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ser.std.StdSerializer;
+
+import java.io.IOException;
+
+/**
+ * This serializer can be configured to either serialize or omit the table options and comment by
+ * setting the attribute {@link #SERIALIZE_OPTIONS} to {@code true} or {@code false}.
+ */
+class ResolvedCatalogTableJsonSerializer extends StdSerializer<ResolvedCatalogTable> {
+    private static final long serialVersionUID = 1L;
+
+    static final String SERIALIZE_OPTIONS = "serialize_options";
+
+    public static final String RESOLVED_SCHEMA = "resolvedSchema";
+    public static final String PARTITION_KEYS = "partitionKeys";
+    public static final String OPTIONS = "options";
+    public static final String COMMENT = "comment";
+
+    public ResolvedCatalogTableJsonSerializer() {
+        super(ResolvedCatalogTable.class);
+    }
+
+    @Override
+    public void serialize(
+            ResolvedCatalogTable resolvedCatalogTable,
+            JsonGenerator jsonGenerator,
+            SerializerProvider serializerProvider)
+            throws IOException {
+        // This should never happen anyway, but we keep this assertion as a sanity check
+        assert resolvedCatalogTable.getTableKind() == CatalogBaseTable.TableKind.TABLE;
+
+        boolean serializeOptions =

Review comment:
       I tried to parametrize the ser/de itself, since one might use it in another context, for another purpose, with a different rationale for whether to serialize the options.
    
    To unblock the discussion, I factored the serialization code out into a static method, which `ContextResolvedTableJsonSerializer` now calls directly. But I still keep this option, as it's useful for testing and might be useful for setting this serializer option from a higher level (similar to how you configure date/time formats in Jackson today)

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/ContextResolvedTableJsonSerializer.java
##########
@@ -0,0 +1,98 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.table.api.ValidationException;
+import org.apache.flink.table.api.config.TableConfigOptions;
+import org.apache.flink.table.api.config.TableConfigOptions.CatalogPlanCompilation;
+import org.apache.flink.table.catalog.ContextResolvedTable;
+import org.apache.flink.table.catalog.ObjectIdentifier;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonGenerator;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.SerializerProvider;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ser.std.StdSerializer;
+
+import java.io.IOException;
+
+/** JSON serializer for {@link ContextResolvedTable}. */
+class ContextResolvedTableJsonSerializer extends StdSerializer<ContextResolvedTable> {
+    private static final long serialVersionUID = 1L;
+
+    public static final String FIELD_NAME_IDENTIFIER = "identifier";
+    public static final String FIELD_NAME_CATALOG_TABLE = "catalogTable";
+
+    public ContextResolvedTableJsonSerializer() {
+        super(ContextResolvedTable.class);
+    }
+
+    @Override
+    public void serialize(
+            ContextResolvedTable contextResolvedTable,
+            JsonGenerator jsonGenerator,
+            SerializerProvider serializerProvider)
+            throws IOException {
+        final CatalogPlanCompilation planCompilationOption =
+                SerdeContext.get(serializerProvider)
+                        .getConfiguration()
+                        .get(TableConfigOptions.PLAN_COMPILE_CATALOG_OBJECTS);
+
+        if (contextResolvedTable.isAnonymous()
+                && planCompilationOption == CatalogPlanCompilation.IDENTIFIER) {
+            throw cannotSerializeAnonymousTable(contextResolvedTable.getIdentifier());
+        }
+
+        jsonGenerator.writeStartObject();
+
+        if (!contextResolvedTable.isAnonymous()) {
+            // Serialize object identifier
+            jsonGenerator.writeObjectField(
+                    FIELD_NAME_IDENTIFIER, contextResolvedTable.getIdentifier());
+        }
+
+        if ((contextResolvedTable.isPermanent() || contextResolvedTable.isAnonymous())
+                && planCompilationOption != CatalogPlanCompilation.IDENTIFIER) {
+            // Tell the ResolvedCatalogTableJsonSerializer whether to serialize the
+            // identifier
+            serializerProvider.setAttribute(

Review comment:
       It is, I removed it

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/ResolvedExpressionJsonSerializer.java
##########
@@ -0,0 +1,75 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.table.api.ValidationException;
+import org.apache.flink.table.expressions.ResolvedExpression;
+import org.apache.flink.table.planner.expressions.RexNodeExpression;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonGenerator;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.SerializerProvider;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ser.std.StdSerializer;
+
+import java.io.IOException;
+
+class ResolvedExpressionJsonSerializer extends StdSerializer<ResolvedExpression> {
+
+    public static final String TYPE = "type";
+    public static final String TYPE_REX_NODE_EXPRESSION = "rexNodeExpression";
+    public static final String REX_NODE = "rexNode";
+    public static final String OUTPUT_DATA_TYPE = "outputDataType";
+    public static final String SERIALIZABLE_STRING = "serializableString";
+
+    protected ResolvedExpressionJsonSerializer() {
+        super(ResolvedExpression.class);
+    }
+
+    @Override
+    public void serialize(
+            ResolvedExpression resolvedExpression,
+            JsonGenerator jsonGenerator,
+            SerializerProvider serializerProvider)
+            throws IOException {
+        jsonGenerator.writeStartObject();
+
+        if (resolvedExpression instanceof RexNodeExpression) {
+            serialize((RexNodeExpression) resolvedExpression, jsonGenerator, serializerProvider);
+        } else {
+            throw new ValidationException(
+                    String.format(
+                            "Expression '%s' cannot be serialized. "
+                                    + "Currently, only SQL expressions can be serialized in the persisted plan.",
+                            resolvedExpression.asSummaryString()));
+        }
+
+        jsonGenerator.writeEndObject();
+    }
+
+    private void serialize(
+            RexNodeExpression expression,
+            JsonGenerator jsonGenerator,
+            SerializerProvider serializerProvider)
+            throws IOException {
+        jsonGenerator.writeStringField(TYPE, TYPE_REX_NODE_EXPRESSION);
+        serializerProvider.defaultSerializeField(REX_NODE, expression.getRexNode(), jsonGenerator);
+        serializerProvider.defaultSerializeField(

Review comment:
       I managed to get rid of the type, but I can't trivially remove `serializableString`. In order to remove it, I tried to infer it from the `RexNode` using `SqlImplementor.SimpleContext` when deserializing. But, as you see from the signature, I still need access to the input fields' SqlNode tree, for example to serialize an input ref. At this specific location, there is no easy way to do that.
    
    I also don't think it's safe to just omit it, as without it the `Expression` tree becomes non-serializable. I suggest we keep it for this iteration: we serialize the fields `rexNode` and `serializableString`, and exclude `type` and `outputDataType`.
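Under that compromise, a serialized expression in the plan carries both fields side by side. A rough sketch of the shape (the `rexNode` body, produced by the planner's `RexNode` serializer, is elided here, and the expression text is invented for illustration):

```json
{
  "rexNode": {},
  "serializableString": "`a` > 10"
}
```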

##########
File path: flink-table/flink-table-planner/src/test/resources/jsonplan/testGetJsonPlan.out
##########
@@ -1,76 +1,110 @@
 {
-   "flinkVersion":"",
-   "nodes":[
-      {
-         "class":"org.apache.flink.table.planner.plan.nodes.exec.stream.StreamExecTableSourceScan",
-         "scanTableSource":{
-            "identifier":{
-               "catalogName":"default_catalog",
-               "databaseName":"default_database",
-               "tableName":"MyTable"
+  "flinkVersion": "",
+  "nodes": [
+    {
+      "class": "org.apache.flink.table.planner.plan.nodes.exec.stream.StreamExecTableSourceScan",
+      "scanTableSource": {
+        "catalogTable": {
+          "identifier": "`default_catalog`.`default_database`.`MyTable`",
+          "catalogTable": {
+            "resolvedSchema": {
+              "columns": [
+                {
+                  "name": "a",
+                  "type": "physical",
+                  "dataType": "BIGINT"
+                },
+                {
+                  "name": "b",
+                  "type": "physical",
+                  "dataType": "INT"
+                },
+                {
+                  "name": "c",
+                  "type": "physical",
+                  "dataType": {
+                    "logicalType": "VARCHAR(2147483647)",

Review comment:
       This is done here https://github.com/apache/flink/pull/18506
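The plan fragment above stores the identifier in its serializable string form (`` `catalog`.`database`.`table` ``). As a self-contained illustration of that quoting scheme — a minimal re-implementation for demonstration, not Flink's actual `ObjectIdentifier` code:

```java
public class SerializableIdentifier {

    // Escape a part by doubling any embedded backticks, then wrap it in backticks,
    // mimicking the quoted-identifier form used in the persisted plan.
    static String quote(String part) {
        return "`" + part.replace("`", "``") + "`";
    }

    static String asSerializableString(String catalog, String database, String table) {
        return quote(catalog) + "." + quote(database) + "." + quote(table);
    }

    public static void main(String[] args) {
        System.out.println(asSerializableString("default_catalog", "default_database", "MyTable"));
        // `default_catalog`.`default_database`.`MyTable`
    }
}
```

Quoting each part keeps the three-part identifier unambiguous even when catalog, database, or table names contain dots or backticks.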







[GitHub] [flink] flinkbot edited a comment on pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18427:
URL: https://github.com/apache/flink/pull/18427#issuecomment-1017644617


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812",
       "triggerID" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "triggerType" : "PUSH"
     }, {
       "hash" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29876",
       "triggerID" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29885",
       "triggerID" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "07cec81a669917e2eb1531b2c054e04110781dfe",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29892",
       "triggerID" : "07cec81a669917e2eb1531b2c054e04110781dfe",
       "triggerType" : "PUSH"
     }, {
       "hash" : "7005798448a0cf4e9571c6695569b61e7e4641af",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30162",
       "triggerID" : "7005798448a0cf4e9571c6695569b61e7e4641af",
       "triggerType" : "PUSH"
     }, {
       "hash" : "228aec25fa06571e6e00e886da22e2598ca632b5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30173",
       "triggerID" : "228aec25fa06571e6e00e886da22e2598ca632b5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4b1fa7a01306c01a597b30f45c49d220fdc59dc7",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30311",
       "triggerID" : "4b1fa7a01306c01a597b30f45c49d220fdc59dc7",
       "triggerType" : "PUSH"
     }, {
       "hash" : "dfd7ce9e194c47511c26ee1030e2db044f128a12",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30318",
       "triggerID" : "dfd7ce9e194c47511c26ee1030e2db044f128a12",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a8365736913376f03c3bb64e2a927e46d9dee10a",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30327",
       "triggerID" : "a8365736913376f03c3bb64e2a927e46d9dee10a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "82350d3f5d0dbb119914bd9fc96cba87bea89433",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "82350d3f5d0dbb119914bd9fc96cba87bea89433",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * a8365736913376f03c3bb64e2a927e46d9dee10a Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30327) 
   * 82350d3f5d0dbb119914bd9fc96cba87bea89433 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] twalthr closed pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
twalthr closed pull request #18427:
URL: https://github.com/apache/flink/pull/18427


   





[GitHub] [flink] flinkbot commented on pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
flinkbot commented on pull request #18427:
URL: https://github.com/apache/flink/pull/18427#issuecomment-1017641863


   Thanks a lot for your contribution to the Apache Flink project. I'm the @flinkbot. I help the community
   to review your pull request. We will use this comment to track the progress of the review.
   
   
   ## Automated Checks
   Last check on commit a69e8836becd5bbdedd183376c67dae35afc2960 (Thu Jan 20 15:45:38 UTC 2022)
   
   **Warnings:**
    * No documentation files were touched! Remember to keep the Flink docs up to date!
   
   
   <sub>Mention the bot in a comment to re-run the automated checks.</sub>
   ## Review Progress
   
   * ❓ 1. The [description] looks good.
    * ❓ 2. There is [consensus] that the contribution should go into Flink.
   * ❓ 3. Needs [attention] from.
   * ❓ 4. The change fits into the overall [architecture].
   * ❓ 5. Overall code [quality] is good.
   
    Please see the [Pull Request Review Guide](https://flink.apache.org/contributing/reviewing-prs.html) for a full explanation of the review process.
    <details>
     The Bot is tracking the review progress through labels. Labels are applied according to the order of the review items. For consensus, approval by a Flink committer or PMC member is required.
     <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot approve description` to approve one or more aspects (aspects: `description`, `consensus`, `architecture` and `quality`)
    - `@flinkbot approve all` to approve all aspects
    - `@flinkbot approve-until architecture` to approve everything until `architecture`
    - `@flinkbot attention @username1 [@username2 ..]` to require somebody's attention
    - `@flinkbot disapprove architecture` to remove an approval you gave earlier
   </details>



[GitHub] [flink] twalthr commented on a change in pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
twalthr commented on a change in pull request #18427:
URL: https://github.com/apache/flink/pull/18427#discussion_r793562935



##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/spec/DynamicTableSourceSpec.java
##########
@@ -110,39 +106,73 @@ private DynamicTableSource getTableSource(FlinkContext flinkContext) {
         return tableSource;
     }
 
-    @JsonIgnore
     public ScanTableSource getScanTableSource(FlinkContext flinkContext) {
         DynamicTableSource tableSource = getTableSource(flinkContext);
         if (tableSource instanceof ScanTableSource) {
             return (ScanTableSource) tableSource;
         } else {
             throw new TableException(
                     String.format(
-                            "%s is not a ScanTableSource.\nplease check it.",
+                            "%s is not a ScanTableSource.\n" + "please check it.",

Review comment:
       if we change this line, then let's remove the new line at all?
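
       A tiny, self-contained sketch of that suggestion (class and method names here are made up for illustration): dropping the line break entirely yields a single-sentence message.

       ```java
       // Illustrative only: shows the error message without the "\n" the
       // reviewer suggests removing. Class/method names are hypothetical.
       class MessageSketch {
           static String scanSourceError(String className) {
               return String.format(
                       "%s is not a ScanTableSource. Please check it.", className);
           }

           public static void main(String[] args) {
               System.out.println(scanSourceError("com.example.MySource"));
           }
       }
       ```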

##########
File path: flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/serde/DynamicTableSinkSpecSerdeTest.java
##########
@@ -139,39 +139,166 @@ public void testDynamicTableSinkSpecSerde() throws IOException {
                                                 put("p", "A");
                                             }
                                         })));
-        spec2.setReadableConfig(new Configuration());
 
-        Map<String, String> properties3 = new HashMap<>();
-        properties3.put("connector", "values");
-        properties3.put("schema.0.name", "a");
-        properties3.put("schema.0.data-type", "BIGINT");
-        properties3.put("schema.1.name", "b");
-        properties3.put("schema.1.data-type", "INT");
-        properties3.put("schema.2.name", "m");
-        properties3.put("schema.2.data-type", "STRING");
-        properties3.put("writable-metadata", "m:STRING");
-
-        final CatalogTable catalogTable3 = CatalogTable.fromProperties(properties3);
+        Map<String, String> options3 = new HashMap<>();
+        options3.put("connector", TestValuesTableFactory.IDENTIFIER);
+        options3.put("writable-metadata", "m:STRING");
 
         final ResolvedSchema resolvedSchema3 =
                 new ResolvedSchema(
                         Arrays.asList(
                                 Column.physical("a", DataTypes.BIGINT()),
                                 Column.physical("b", DataTypes.INT()),
-                                Column.physical("m", DataTypes.STRING())),
+                                Column.metadata("m", DataTypes.STRING(), null, false)),
                         Collections.emptyList(),
                         null);
+        final CatalogTable catalogTable3 =
+                CatalogTable.of(
+                        Schema.newBuilder().fromResolvedSchema(resolvedSchema3).build(),
+                        null,
+                        Collections.emptyList(),
+                        options3);
 
         DynamicTableSinkSpec spec3 =
                 new DynamicTableSinkSpec(
-                        ObjectIdentifier.of("default_catalog", "default_db", "MyTable"),
-                        new ResolvedCatalogTable(catalogTable3, resolvedSchema3),
+                        ContextResolvedTable.temporary(
+                                ObjectIdentifier.of(
+                                        DEFAULT_BUILTIN_CATALOG,
+                                        DEFAULT_BUILTIN_DATABASE,
+                                        "MyTable"),
+                                new ResolvedCatalogTable(catalogTable3, resolvedSchema3)),
                         Collections.singletonList(
                                 new WritingMetadataSpec(
                                         Collections.singletonList("m"),
                                         RowType.of(new BigIntType(), new IntType()))));
-        spec3.setReadableConfig(new Configuration());
 
-        return Arrays.asList(spec1, spec2, spec3);
+        return Stream.of(spec1, spec2, spec3);
+    }
+
+    @ParameterizedTest
+    @MethodSource("testDynamicTableSinkSpecSerde")
+    void testDynamicTableSinkSpecSerde(DynamicTableSinkSpec spec) throws IOException {
+        TableEnvironmentImpl tableEnv =

Review comment:
       use `PlannerMocks` instead? It is not perfect yet, but you can expose the `Configuration` (not `TableConfig`) and should be able to access everything you need.

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/ContextResolvedTableJsonDeserializer.java
##########
@@ -0,0 +1,238 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.table.api.ValidationException;
+import org.apache.flink.table.api.config.TableConfigOptions.CatalogPlanCompilation;
+import org.apache.flink.table.api.config.TableConfigOptions.CatalogPlanRestore;
+import org.apache.flink.table.catalog.CatalogManager;
+import org.apache.flink.table.catalog.Column;
+import org.apache.flink.table.catalog.ContextResolvedTable;
+import org.apache.flink.table.catalog.ObjectIdentifier;
+import org.apache.flink.table.catalog.ResolvedCatalogTable;
+import org.apache.flink.table.catalog.ResolvedSchema;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonParser;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.StdDeserializer;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ObjectNode;
+
+import java.io.IOException;
+import java.util.List;
+import java.util.Objects;
+import java.util.Optional;
+
+import static org.apache.flink.table.api.config.TableConfigOptions.CatalogPlanRestore.IDENTIFIER;
+import static org.apache.flink.table.api.config.TableConfigOptions.PLAN_COMPILE_CATALOG_OBJECTS;
+import static org.apache.flink.table.api.config.TableConfigOptions.PLAN_RESTORE_CATALOG_OBJECTS;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.ContextResolvedTableJsonSerializer.FIELD_NAME_CATALOG_TABLE;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.ContextResolvedTableJsonSerializer.FIELD_NAME_IDENTIFIER;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.ResolvedCatalogTableJsonSerializer.OPTIONS;
+
+class ContextResolvedTableJsonDeserializer extends StdDeserializer<ContextResolvedTable> {
+    private static final long serialVersionUID = 1L;
+
+    public ContextResolvedTableJsonDeserializer() {
+        super(ContextResolvedTable.class);
+    }
+
+    @Override
+    public ContextResolvedTable deserialize(JsonParser jsonParser, DeserializationContext ctx)
+            throws IOException {
+        final CatalogPlanRestore planRestoreOption =
+                SerdeContext.get(ctx).getConfiguration().get(PLAN_RESTORE_CATALOG_OBJECTS);
+        final CatalogManager catalogManager =
+                SerdeContext.get(ctx).getFlinkContext().getCatalogManager();
+        final ObjectNode objectNode = jsonParser.readValueAsTree();
+
+        // Deserialize the two fields, if available
+        final ObjectIdentifier identifier =
+                JsonSerdeUtil.deserializeOptionalField(
+                                objectNode,
+                                FIELD_NAME_IDENTIFIER,
+                                ObjectIdentifier.class,
+                                jsonParser.getCodec(),
+                                ctx)
+                        .orElse(null);
+        ResolvedCatalogTable resolvedCatalogTable =
+                JsonSerdeUtil.deserializeOptionalField(
+                                objectNode,
+                                FIELD_NAME_CATALOG_TABLE,
+                                ResolvedCatalogTable.class,
+                                jsonParser.getCodec(),
+                                ctx)
+                        .orElse(null);
+
+        if (identifier == null && resolvedCatalogTable == null) {
+            throw new ValidationException(
+                    String.format(
+                            "The input json is invalid because it doesn't contain '%s', nor the '%s'.",

Review comment:
       nit: to be consistent we should always write `JSON` instead of `json`, especially in exceptions.
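
       The guard being discussed can be sketched independently of Jackson; this hypothetical stand-in only models the "both fields missing" branch (the real deserializer reads the fields from an `ObjectNode`, and only the two field names mirror the quoted code):

       ```java
       import java.util.Optional;

       // Hypothetical model of the deserializer's validation branch: at least
       // one of the "identifier" / "catalogTable" JSON fields must be present.
       class ContextValidationSketch {
           static String validate(Optional<String> identifier, Optional<String> catalogTable) {
               if (!identifier.isPresent() && !catalogTable.isPresent()) {
                   // Mirrors the ValidationException message, with the
                   // suggested "JSON" capitalization applied.
                   return String.format(
                           "The input JSON is invalid because it contains neither '%s' nor '%s'.",
                           "identifier", "catalogTable");
               }
               return "ok";
           }
       }
       ```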

##########
File path: flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/serde/DynamicTableSourceSpecSerdeTest.java
##########
@@ -238,6 +231,135 @@ public void testDynamicTableSourceSpecSerde() throws IOException {
                                                         put("p", "B");
                                                     }
                                                 }))));
-        return Arrays.asList(spec1, spec2);
+        return Stream.of(spec1, spec2);
+    }
+
+    @ParameterizedTest
+    @MethodSource("testDynamicTableSinkSpecSerde")
+    public void testDynamicTableSourceSpecSerde(DynamicTableSourceSpec spec) throws IOException {
+        TableEnvironmentImpl tableEnv =
+                (TableEnvironmentImpl) TableEnvironment.create(inStreamingMode());
+
+        CatalogManager catalogManager = tableEnv.getCatalogManager();

Review comment:
       is it possible to unify with this sink test? I see a lot of duplicate code here.

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/ResolvedCatalogTableJsonSerializer.java
##########
@@ -0,0 +1,91 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.table.api.ValidationException;
+import org.apache.flink.table.catalog.CatalogBaseTable;
+import org.apache.flink.table.catalog.ExternalCatalogTable;
+import org.apache.flink.table.catalog.ResolvedCatalogTable;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonGenerator;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.SerializerProvider;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ser.std.StdSerializer;
+
+import java.io.IOException;
+
+class ResolvedCatalogTableJsonSerializer extends StdSerializer<ResolvedCatalogTable> {
+    private static final long serialVersionUID = 1L;
+
+    static final String SERIALIZE_OPTIONS = "serialize_options";
+
+    public static final String RESOLVED_SCHEMA = "schema";
+    public static final String PARTITION_KEYS = "partitionKeys";
+    public static final String OPTIONS = "options";
+    public static final String COMMENT = "comment";
+
+    public ResolvedCatalogTableJsonSerializer() {
+        super(ResolvedCatalogTable.class);
+    }
+
+    @Override
+    public void serialize(
+            ResolvedCatalogTable resolvedCatalogTable,
+            JsonGenerator jsonGenerator,
+            SerializerProvider serializerProvider)
+            throws IOException {
+        boolean serializeOptions =
+                serializerProvider.getAttribute(SERIALIZE_OPTIONS) == null
+                        || (boolean) serializerProvider.getAttribute(SERIALIZE_OPTIONS);
+
+        serialize(resolvedCatalogTable, serializeOptions, jsonGenerator, serializerProvider);
+    }
+
+    static void serialize(
+            ResolvedCatalogTable resolvedCatalogTable,
+            boolean serializeOptions,
+            JsonGenerator jsonGenerator,
+            SerializerProvider serializerProvider)
+            throws IOException {
+        // Thia should never happen anyway, but we keep this assertion for sanity check

Review comment:
       typo
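
       Aside from the typo, the attribute check in the quoted serializer is worth noting: options are serialized unless the `SERIALIZE_OPTIONS` attribute is explicitly `false`. A minimal model of that default-true logic (the method name here is illustrative):

       ```java
       // Models the SERIALIZE_OPTIONS check from the quoted serializer: a
       // null (unset) attribute defaults to "serialize the options".
       class SerializeOptionsSketch {
           static boolean shouldSerializeOptions(Object attribute) {
               return attribute == null || (boolean) attribute;
           }
       }
       ```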

##########
File path: flink-table/flink-table-planner/src/test/resources/org/apache/flink/table/planner/plan/nodes/exec/stream/TableSinkJsonPlanTest_jsonplan/testWritingMetadata.out
##########
@@ -22,20 +37,35 @@
   }, {
     "class" : "org.apache.flink.table.planner.plan.nodes.exec.stream.StreamExecSink",
     "dynamicTableSink" : {
-      "identifier" : "`default_catalog`.`default_database`.`MySink`",
-      "catalogTable" : {
-        "schema.2.data-type" : "VARCHAR(2147483647)",
-        "schema.2.virtual" : "false",
-        "connector" : "values",
-        "schema.0.data-type" : "BIGINT",
-        "writable-metadata" : "m:STRING",
-        "schema.2.metadata" : "m",
-        "schema.2.name" : "m",
-        "schema.1.name" : "b",
-        "schema.0.name" : "a",
-        "schema.1.data-type" : "INT"
+      "table" : {
+        "identifier" : "`default_catalog`.`default_database`.`MySink`",
+        "resolvedTable" : {
+          "schema" : {
+            "columns" : [ {
+              "name" : "a",
+              "dataType" : "BIGINT"
+            }, {
+              "name" : "b",
+              "dataType" : "INT"
+            }, {
+              "name" : "m",
+              "kind" : "metadata",

Review comment:
       put constants in upper case
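
       One way to guarantee the upper-case constants the reviewer asks for is to persist the enum name directly. A hedged sketch (the enum below is illustrative, not Flink's actual column-kind type):

       ```java
       // Illustrative enum: serializing Enum#name() yields upper-case
       // constants in the plan JSON, e.g. "kind" : "METADATA" rather
       // than "kind" : "metadata".
       class ColumnKindSketch {
           enum ColumnKind { PHYSICAL, METADATA, COMPUTED }

           static String toJsonValue(ColumnKind kind) {
               return kind.name();
           }
       }
       ```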

##########
File path: flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/serde/DynamicTableSinkSpecSerdeTest.java
##########
@@ -139,39 +139,166 @@ public void testDynamicTableSinkSpecSerde() throws IOException {
                                                 put("p", "A");
                                             }
                                         })));
-        spec2.setReadableConfig(new Configuration());
 
-        Map<String, String> properties3 = new HashMap<>();
-        properties3.put("connector", "values");
-        properties3.put("schema.0.name", "a");
-        properties3.put("schema.0.data-type", "BIGINT");
-        properties3.put("schema.1.name", "b");
-        properties3.put("schema.1.data-type", "INT");
-        properties3.put("schema.2.name", "m");
-        properties3.put("schema.2.data-type", "STRING");
-        properties3.put("writable-metadata", "m:STRING");
-
-        final CatalogTable catalogTable3 = CatalogTable.fromProperties(properties3);
+        Map<String, String> options3 = new HashMap<>();
+        options3.put("connector", TestValuesTableFactory.IDENTIFIER);
+        options3.put("writable-metadata", "m:STRING");
 
         final ResolvedSchema resolvedSchema3 =
                 new ResolvedSchema(
                         Arrays.asList(
                                 Column.physical("a", DataTypes.BIGINT()),
                                 Column.physical("b", DataTypes.INT()),
-                                Column.physical("m", DataTypes.STRING())),
+                                Column.metadata("m", DataTypes.STRING(), null, false)),
                         Collections.emptyList(),
                         null);
+        final CatalogTable catalogTable3 =
+                CatalogTable.of(
+                        Schema.newBuilder().fromResolvedSchema(resolvedSchema3).build(),
+                        null,
+                        Collections.emptyList(),
+                        options3);
 
         DynamicTableSinkSpec spec3 =
                 new DynamicTableSinkSpec(
-                        ObjectIdentifier.of("default_catalog", "default_db", "MyTable"),
-                        new ResolvedCatalogTable(catalogTable3, resolvedSchema3),
+                        ContextResolvedTable.temporary(
+                                ObjectIdentifier.of(
+                                        DEFAULT_BUILTIN_CATALOG,
+                                        DEFAULT_BUILTIN_DATABASE,
+                                        "MyTable"),
+                                new ResolvedCatalogTable(catalogTable3, resolvedSchema3)),
                         Collections.singletonList(
                                 new WritingMetadataSpec(
                                         Collections.singletonList("m"),
                                         RowType.of(new BigIntType(), new IntType()))));
-        spec3.setReadableConfig(new Configuration());
 
-        return Arrays.asList(spec1, spec2, spec3);
+        return Stream.of(spec1, spec2, spec3);
+    }
+
+    @ParameterizedTest
+    @MethodSource("testDynamicTableSinkSpecSerde")
+    void testDynamicTableSinkSpecSerde(DynamicTableSinkSpec spec) throws IOException {
+        TableEnvironmentImpl tableEnv =
+                (TableEnvironmentImpl) TableEnvironment.create(inStreamingMode());
+
+        CatalogManager catalogManager = tableEnv.getCatalogManager();
+        catalogManager.initSchemaResolver(true, ExpressionResolverMocks.dummyResolver());
+        catalogManager.createTable(
+                spec.getContextResolvedTable().getResolvedTable(),
+                spec.getContextResolvedTable().getIdentifier(),
+                false);
+
+        SerdeContext serdeCtx = configuredSerdeContext(catalogManager, tableEnv.getConfig());
+
+        // Re-init the spec to be permanent with correct catalog
+        spec =
+                new DynamicTableSinkSpec(
+                        ContextResolvedTable.permanent(
+                                spec.getContextResolvedTable().getIdentifier(),
+                                catalogManager.getCatalog(catalogManager.getCurrentCatalog()).get(),
+                                spec.getContextResolvedTable().getResolvedTable()),
+                        spec.getSinkAbilities());
+
+        String actualJson = toJson(serdeCtx, spec);
+        DynamicTableSinkSpec actual = toObject(serdeCtx, actualJson, DynamicTableSinkSpec.class);
+
+        assertThat(actual.getContextResolvedTable()).isEqualTo(spec.getContextResolvedTable());
+        assertThat(actual.getSinkAbilities()).isEqualTo(spec.getSinkAbilities());
+
+        assertThat(actual.getTableSink(((PlannerBase) tableEnv.getPlanner()).getFlinkContext()))
+                .isNotNull();
+    }
+
+    @Test
+    void testDynamicTableSinkSpecSerdeWithEnrichmentOptions() throws Exception {
+        // Test model
+        ObjectIdentifier identifier =
+                ObjectIdentifier.of(
+                        CatalogManagerMocks.DEFAULT_CATALOG,
+                        CatalogManagerMocks.DEFAULT_DATABASE,
+                        "my_table");
+        ResolvedSchema resolvedSchema =
+                new ResolvedSchema(
+                        Arrays.asList(
+                                Column.physical("a", DataTypes.STRING()),
+                                Column.physical("b", DataTypes.INT()),
+                                Column.physical("c", DataTypes.BOOLEAN())),
+                        Collections.emptyList(),
+                        null);
+        Schema schema = Schema.newBuilder().fromResolvedSchema(resolvedSchema).build();
+
+        String formatPrefix = FactoryUtil.getFormatPrefix(FORMAT, TestFormatFactory.IDENTIFIER);
+
+        Map<String, String> planOptions = new HashMap<>();
+        planOptions.put(CONNECTOR.key(), TestDynamicTableFactory.IDENTIFIER);
+        planOptions.put(TARGET.key(), "abc");
+        planOptions.put(BUFFER_SIZE.key(), "1000");
+        planOptions.put(FORMAT.key(), TestFormatFactory.IDENTIFIER);
+        planOptions.put(formatPrefix + DELIMITER.key(), "|");
+
+        Map<String, String> catalogOptions = new HashMap<>();
+        catalogOptions.put(CONNECTOR.key(), TestDynamicTableFactory.IDENTIFIER);
+        catalogOptions.put(TARGET.key(), "xyz");
+        catalogOptions.put(BUFFER_SIZE.key(), "2000");
+        catalogOptions.put(FORMAT.key(), TestFormatFactory.IDENTIFIER);
+        catalogOptions.put(formatPrefix + DELIMITER.key(), ",");
+
+        ResolvedCatalogTable planResolvedCatalogTable =
+                new ResolvedCatalogTable(
+                        CatalogTable.of(schema, null, Collections.emptyList(), planOptions),
+                        resolvedSchema);
+        ResolvedCatalogTable catalogResolvedCatalogTable =
+                new ResolvedCatalogTable(
+                        CatalogTable.of(schema, null, Collections.emptyList(), catalogOptions),
+                        resolvedSchema);
+
+        // Create table env
+        TableEnvironmentImpl tableEnv =
+                (TableEnvironmentImpl) TableEnvironment.create(inStreamingMode());
+
+        // Create mock catalog
+        CatalogManager catalogManager = tableEnv.getCatalogManager();
+        catalogManager.initSchemaResolver(true, ExpressionResolverMocks.dummyResolver());

Review comment:
       this should not be necessary?

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/spec/DynamicTableSourceSpec.java
##########
@@ -110,39 +106,73 @@ private DynamicTableSource getTableSource(FlinkContext flinkContext) {
         return tableSource;
     }
 
-    @JsonIgnore
     public ScanTableSource getScanTableSource(FlinkContext flinkContext) {
         DynamicTableSource tableSource = getTableSource(flinkContext);
         if (tableSource instanceof ScanTableSource) {
             return (ScanTableSource) tableSource;
         } else {
             throw new TableException(
                     String.format(
-                            "%s is not a ScanTableSource.\nplease check it.",
+                            "%s is not a ScanTableSource.\n" + "please check it.",
                             tableSource.getClass().getName()));
         }
     }
 
-    @JsonIgnore
     public LookupTableSource getLookupTableSource(FlinkContext flinkContext) {
         DynamicTableSource tableSource = getTableSource(flinkContext);
         if (tableSource instanceof LookupTableSource) {
             return (LookupTableSource) tableSource;
         } else {
             throw new TableException(
                     String.format(
-                            "%s is not a LookupTableSource.\nplease check it.",
+                            "%s is not a LookupTableSource.\n" + "please check it.",

Review comment:
       same here





[GitHub] [flink] flinkbot edited a comment on pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18427:
URL: https://github.com/apache/flink/pull/18427#issuecomment-1017644617


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812",
       "triggerID" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "triggerType" : "PUSH"
     }, {
       "hash" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29876",
       "triggerID" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29885",
       "triggerID" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "07cec81a669917e2eb1531b2c054e04110781dfe",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29892",
       "triggerID" : "07cec81a669917e2eb1531b2c054e04110781dfe",
       "triggerType" : "PUSH"
     }, {
       "hash" : "7005798448a0cf4e9571c6695569b61e7e4641af",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30162",
       "triggerID" : "7005798448a0cf4e9571c6695569b61e7e4641af",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 07cec81a669917e2eb1531b2c054e04110781dfe Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29892) 
   * 7005798448a0cf4e9571c6695569b61e7e4641af Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30162) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>



[GitHub] [flink] slinkydeveloper commented on a change in pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
slinkydeveloper commented on a change in pull request #18427:
URL: https://github.com/apache/flink/pull/18427#discussion_r791792924



##########
File path: flink-table/flink-table-planner/src/test/resources/jsonplan/testGetJsonPlan.out
##########
@@ -1,76 +1,110 @@
 {
-   "flinkVersion":"",
-   "nodes":[
-      {
-         "class":"org.apache.flink.table.planner.plan.nodes.exec.stream.StreamExecTableSourceScan",
-         "scanTableSource":{
-            "identifier":{
-               "catalogName":"default_catalog",
-               "databaseName":"default_database",
-               "tableName":"MyTable"
+  "flinkVersion": "",
+  "nodes": [
+    {
+      "class": "org.apache.flink.table.planner.plan.nodes.exec.stream.StreamExecTableSourceScan",
+      "scanTableSource": {
+        "catalogTable": {
+          "identifier": "`default_catalog`.`default_database`.`MyTable`",
+          "catalogTable": {
+            "resolvedSchema": {
+              "columns": [
+                {
+                  "name": "a",
+                  "type": "physical",
+                  "dataType": "BIGINT"
+                },
+                {
+                  "name": "b",
+                  "type": "physical",
+                  "dataType": "INT"
+                },
+                {
+                  "name": "c",
+                  "type": "physical",
+                  "dataType": {
+                    "logicalType": "VARCHAR(2147483647)",

Review comment:
       This is done here https://github.com/apache/flink/pull/18506





[GitHub] [flink] flinkbot edited a comment on pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18427:
URL: https://github.com/apache/flink/pull/18427#issuecomment-1017644617


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812",
       "triggerID" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "triggerType" : "PUSH"
     }, {
       "hash" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29876",
       "triggerID" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * a69e8836becd5bbdedd183376c67dae35afc2960 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812) 
   * b83fd153ef3df414e6a7766e26ccd37f788d728e Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29876) 
   * 367e6ab469bb0c69923e73dc458b74650bef48b1 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>







[GitHub] [flink] flinkbot edited a comment on pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18427:
URL: https://github.com/apache/flink/pull/18427#issuecomment-1017644617


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812",
       "triggerID" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "triggerType" : "PUSH"
     }, {
       "hash" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29876",
       "triggerID" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29885",
       "triggerID" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "07cec81a669917e2eb1531b2c054e04110781dfe",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29892",
       "triggerID" : "07cec81a669917e2eb1531b2c054e04110781dfe",
       "triggerType" : "PUSH"
     }, {
       "hash" : "7005798448a0cf4e9571c6695569b61e7e4641af",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30162",
       "triggerID" : "7005798448a0cf4e9571c6695569b61e7e4641af",
       "triggerType" : "PUSH"
     }, {
       "hash" : "228aec25fa06571e6e00e886da22e2598ca632b5",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30173",
       "triggerID" : "228aec25fa06571e6e00e886da22e2598ca632b5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4b1fa7a01306c01a597b30f45c49d220fdc59dc7",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30311",
       "triggerID" : "4b1fa7a01306c01a597b30f45c49d220fdc59dc7",
       "triggerType" : "PUSH"
     }, {
       "hash" : "dfd7ce9e194c47511c26ee1030e2db044f128a12",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30318",
       "triggerID" : "dfd7ce9e194c47511c26ee1030e2db044f128a12",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a8365736913376f03c3bb64e2a927e46d9dee10a",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "a8365736913376f03c3bb64e2a927e46d9dee10a",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 228aec25fa06571e6e00e886da22e2598ca632b5 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30173) 
   * 4b1fa7a01306c01a597b30f45c49d220fdc59dc7 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30311) 
   * dfd7ce9e194c47511c26ee1030e2db044f128a12 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30318) 
   * a8365736913376f03c3bb64e2a927e46d9dee10a UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18427:
URL: https://github.com/apache/flink/pull/18427#issuecomment-1017644617


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812",
       "triggerID" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "triggerType" : "PUSH"
     }, {
       "hash" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29876",
       "triggerID" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29885",
       "triggerID" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "07cec81a669917e2eb1531b2c054e04110781dfe",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29892",
       "triggerID" : "07cec81a669917e2eb1531b2c054e04110781dfe",
       "triggerType" : "PUSH"
     }, {
       "hash" : "7005798448a0cf4e9571c6695569b61e7e4641af",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30162",
       "triggerID" : "7005798448a0cf4e9571c6695569b61e7e4641af",
       "triggerType" : "PUSH"
     }, {
       "hash" : "228aec25fa06571e6e00e886da22e2598ca632b5",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30173",
       "triggerID" : "228aec25fa06571e6e00e886da22e2598ca632b5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4b1fa7a01306c01a597b30f45c49d220fdc59dc7",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30311",
       "triggerID" : "4b1fa7a01306c01a597b30f45c49d220fdc59dc7",
       "triggerType" : "PUSH"
     }, {
       "hash" : "dfd7ce9e194c47511c26ee1030e2db044f128a12",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30318",
       "triggerID" : "dfd7ce9e194c47511c26ee1030e2db044f128a12",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a8365736913376f03c3bb64e2a927e46d9dee10a",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "a8365736913376f03c3bb64e2a927e46d9dee10a",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 228aec25fa06571e6e00e886da22e2598ca632b5 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30173) 
   * 4b1fa7a01306c01a597b30f45c49d220fdc59dc7 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30311) 
   * dfd7ce9e194c47511c26ee1030e2db044f128a12 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30318) 
   * a8365736913376f03c3bb64e2a927e46d9dee10a UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] slinkydeveloper commented on a change in pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
slinkydeveloper commented on a change in pull request #18427:
URL: https://github.com/apache/flink/pull/18427#discussion_r794289558



##########
File path: flink-table/flink-table-planner/src/test/scala/org/apache/flink/table/planner/utils/TableTestBase.scala
##########
@@ -795,19 +797,20 @@ abstract class TableTestUtilBase(test: TableTestBase, isStreamingMode: Boolean)
       val plannerDirPath = clazz.getResource("/").getFile.replace("/target/test-classes/", "")
       new File(s"$plannerDirPath/src/test/resources$resourceTestFilePath")
     }
-    if (file.exists()) {
+    if (!file.exists() || "true".equalsIgnoreCase(System.getenv(PLAN_TEST_FORCE_OVERWRITE))) {
+      Files.deleteIfExists(file.toPath)
+      file.getParentFile.mkdirs()
+      assertTrue(file.createNewFile())
+      val prettyJson = TableTestUtil.getPrettyJson(jsonPlanWithoutFlinkVersion)
+      Files.write(Paths.get(file.toURI), prettyJson.getBytes)

Review comment:
       +1







[GitHub] [flink] flinkbot edited a comment on pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18427:
URL: https://github.com/apache/flink/pull/18427#issuecomment-1017644617


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812",
       "triggerID" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "triggerType" : "PUSH"
     }, {
       "hash" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29876",
       "triggerID" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * a69e8836becd5bbdedd183376c67dae35afc2960 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812) 
   * b83fd153ef3df414e6a7766e26ccd37f788d728e Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29876) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18427:
URL: https://github.com/apache/flink/pull/18427#issuecomment-1017644617


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812",
       "triggerID" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "triggerType" : "PUSH"
     }, {
       "hash" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29876",
       "triggerID" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29885",
       "triggerID" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8c843d680e45ef8b14d3abf5d75f785c889c4a33",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "8c843d680e45ef8b14d3abf5d75f785c889c4a33",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * a69e8836becd5bbdedd183376c67dae35afc2960 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812) 
   * b83fd153ef3df414e6a7766e26ccd37f788d728e Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29876) 
   * 367e6ab469bb0c69923e73dc458b74650bef48b1 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29885) 
   * 8c843d680e45ef8b14d3abf5d75f785c889c4a33 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18427:
URL: https://github.com/apache/flink/pull/18427#issuecomment-1017644617


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812",
       "triggerID" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "triggerType" : "PUSH"
     }, {
       "hash" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29876",
       "triggerID" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29885",
       "triggerID" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "07cec81a669917e2eb1531b2c054e04110781dfe",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29892",
       "triggerID" : "07cec81a669917e2eb1531b2c054e04110781dfe",
       "triggerType" : "PUSH"
     }, {
       "hash" : "7005798448a0cf4e9571c6695569b61e7e4641af",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30162",
       "triggerID" : "7005798448a0cf4e9571c6695569b61e7e4641af",
       "triggerType" : "PUSH"
     }, {
       "hash" : "228aec25fa06571e6e00e886da22e2598ca632b5",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30173",
       "triggerID" : "228aec25fa06571e6e00e886da22e2598ca632b5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4b1fa7a01306c01a597b30f45c49d220fdc59dc7",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30311",
       "triggerID" : "4b1fa7a01306c01a597b30f45c49d220fdc59dc7",
       "triggerType" : "PUSH"
     }, {
       "hash" : "dfd7ce9e194c47511c26ee1030e2db044f128a12",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30318",
       "triggerID" : "dfd7ce9e194c47511c26ee1030e2db044f128a12",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 228aec25fa06571e6e00e886da22e2598ca632b5 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30173) 
   * 4b1fa7a01306c01a597b30f45c49d220fdc59dc7 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30311) 
   * dfd7ce9e194c47511c26ee1030e2db044f128a12 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30318) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18427:
URL: https://github.com/apache/flink/pull/18427#issuecomment-1017644617


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812",
       "triggerID" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "triggerType" : "PUSH"
     }, {
       "hash" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29876",
       "triggerID" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29885",
       "triggerID" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "07cec81a669917e2eb1531b2c054e04110781dfe",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29892",
       "triggerID" : "07cec81a669917e2eb1531b2c054e04110781dfe",
       "triggerType" : "PUSH"
     }, {
       "hash" : "7005798448a0cf4e9571c6695569b61e7e4641af",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30162",
       "triggerID" : "7005798448a0cf4e9571c6695569b61e7e4641af",
       "triggerType" : "PUSH"
     }, {
       "hash" : "228aec25fa06571e6e00e886da22e2598ca632b5",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30173",
       "triggerID" : "228aec25fa06571e6e00e886da22e2598ca632b5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4b1fa7a01306c01a597b30f45c49d220fdc59dc7",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30311",
       "triggerID" : "4b1fa7a01306c01a597b30f45c49d220fdc59dc7",
       "triggerType" : "PUSH"
     }, {
       "hash" : "dfd7ce9e194c47511c26ee1030e2db044f128a12",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30318",
       "triggerID" : "dfd7ce9e194c47511c26ee1030e2db044f128a12",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a8365736913376f03c3bb64e2a927e46d9dee10a",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30327",
       "triggerID" : "a8365736913376f03c3bb64e2a927e46d9dee10a",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 228aec25fa06571e6e00e886da22e2598ca632b5 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30173) 
   * 4b1fa7a01306c01a597b30f45c49d220fdc59dc7 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30311) 
   * dfd7ce9e194c47511c26ee1030e2db044f128a12 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30318) 
   * a8365736913376f03c3bb64e2a927e46d9dee10a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30327) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] slinkydeveloper commented on a change in pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
slinkydeveloper commented on a change in pull request #18427:
URL: https://github.com/apache/flink/pull/18427#discussion_r791013866



##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/ResolvedCatalogTableJsonSerializer.java
##########
@@ -0,0 +1,86 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.table.api.ValidationException;
+import org.apache.flink.table.catalog.CatalogBaseTable;
+import org.apache.flink.table.catalog.ExternalCatalogTable;
+import org.apache.flink.table.catalog.ResolvedCatalogTable;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonGenerator;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.SerializerProvider;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ser.std.StdSerializer;
+
+import java.io.IOException;
+
+/**
+ * This serializer can be configured to include or omit the options and comment by setting the
+ * attribute {@link #SERIALIZE_OPTIONS} to {@code true} or {@code false}.
+ */
+class ResolvedCatalogTableJsonSerializer extends StdSerializer<ResolvedCatalogTable> {
+    private static final long serialVersionUID = 1L;
+
+    static final String SERIALIZE_OPTIONS = "serialize_options";
+
+    public static final String RESOLVED_SCHEMA = "resolvedSchema";
+    public static final String PARTITION_KEYS = "partitionKeys";
+    public static final String OPTIONS = "options";
+    public static final String COMMENT = "comment";
+
+    public ResolvedCatalogTableJsonSerializer() {
+        super(ResolvedCatalogTable.class);
+    }
+
+    @Override
+    public void serialize(
+            ResolvedCatalogTable resolvedCatalogTable,
+            JsonGenerator jsonGenerator,
+            SerializerProvider serializerProvider)
+            throws IOException {
+        // This should never happen anyway, but we keep this assertion as a sanity check
+        assert resolvedCatalogTable.getTableKind() == CatalogBaseTable.TableKind.TABLE;
+
+        boolean serializeOptions =

Review comment:
       I tried to parametrize the ser/de itself, as one might use it in another context for another purpose, and might have a different rationale for serializing the options or not.
   
   To unblock the discussion, I factored the serialization code out into a static method, which `ContextResolvedTableJsonSerializer` calls directly. But I still keep this option, as it's useful for testing and might be useful for setting this serializer option from a higher level (similar to how you define date/time formats in Jackson today)
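   The attribute-driven toggle discussed in this comment can be sketched without Jackson. The example below is an illustrative, self-contained stand-in (the class and `serialize` method are invented, not Flink's or Jackson's actual API): the serializer reads a boolean attribute from a context map, much like `SerializerProvider.getAttribute(SERIALIZE_OPTIONS)` would, and uses it to decide whether the `options` section is emitted.

```java
import java.util.LinkedHashMap;
import java.util.Map;

/**
 * Jackson-free sketch of an attribute-configured serializer. A boolean
 * attribute (SERIALIZE_OPTIONS) controls whether the table's options map
 * is written out. All names here are hypothetical, for illustration only.
 */
public class AttributeConfiguredSerializer {

    static final String SERIALIZE_OPTIONS = "serialize_options";

    /**
     * Serializes a table-like structure; the options map is emitted unless
     * the SERIALIZE_OPTIONS attribute is explicitly set to false.
     */
    static String serialize(Map<String, String> options, Map<String, Object> attributes) {
        boolean serializeOptions = !Boolean.FALSE.equals(attributes.get(SERIALIZE_OPTIONS));
        StringBuilder json = new StringBuilder("{\"partitionKeys\":[]");
        if (serializeOptions) {
            json.append(",\"options\":{");
            boolean first = true;
            for (Map.Entry<String, String> e : options.entrySet()) {
                if (!first) {
                    json.append(',');
                }
                json.append('"').append(e.getKey()).append("\":\"").append(e.getValue()).append('"');
                first = false;
            }
            json.append('}');
        }
        return json.append('}').toString();
    }

    public static void main(String[] args) {
        Map<String, String> opts = new LinkedHashMap<>();
        opts.put("connector", "values");
        // → {"partitionKeys":[],"options":{"connector":"values"}}
        System.out.println(serialize(opts, Map.of(SERIALIZE_OPTIONS, true)));
        // → {"partitionKeys":[]}
        System.out.println(serialize(opts, Map.of(SERIALIZE_OPTIONS, false)));
    }
}
```

   The same shape carries over to the real serializer: callers opt in or out per serialization run (e.g. via `ObjectWriter.withAttribute`) instead of baking the choice into the serializer class itself.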







[GitHub] [flink] slinkydeveloper commented on a change in pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
slinkydeveloper commented on a change in pull request #18427:
URL: https://github.com/apache/flink/pull/18427#discussion_r790944919



##########
File path: flink-table/flink-table-planner/src/test/resources/jsonplan/testGetJsonPlan.out
##########
@@ -1,76 +1,110 @@
 {
-   "flinkVersion":"",
-   "nodes":[
-      {
-         "class":"org.apache.flink.table.planner.plan.nodes.exec.stream.StreamExecTableSourceScan",
-         "scanTableSource":{
-            "identifier":{
-               "catalogName":"default_catalog",
-               "databaseName":"default_database",
-               "tableName":"MyTable"
+  "flinkVersion": "",
+  "nodes": [
+    {
+      "class": "org.apache.flink.table.planner.plan.nodes.exec.stream.StreamExecTableSourceScan",
+      "scanTableSource": {
+        "catalogTable": {
+          "identifier": "`default_catalog`.`default_database`.`MyTable`",
+          "catalogTable": {
+            "resolvedSchema": {
+              "columns": [
+                {
+                  "name": "a",
+                  "type": "physical",
+                  "dataType": "BIGINT"
+                },
+                {
+                  "name": "b",
+                  "type": "physical",

Review comment:
       > type computed is implicit if there is an expression, metadata with a metadata key
   
   I'd rather avoid doing this. I agree that the type can default to physical when missing, but this inference doesn't sound right to me, as we don't know whether at some point in the future we might want a supercolumn that has both an expression and a metadata field. I'd rather be conservative and serialize the type, just to be on the safe side. We can always remove it in the future.







[GitHub] [flink] flinkbot edited a comment on pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18427:
URL: https://github.com/apache/flink/pull/18427#issuecomment-1017644617


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812",
       "triggerID" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "triggerType" : "PUSH"
     }, {
       "hash" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29876",
       "triggerID" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29885",
       "triggerID" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "07cec81a669917e2eb1531b2c054e04110781dfe",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29892",
       "triggerID" : "07cec81a669917e2eb1531b2c054e04110781dfe",
       "triggerType" : "PUSH"
     }, {
       "hash" : "7005798448a0cf4e9571c6695569b61e7e4641af",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30162",
       "triggerID" : "7005798448a0cf4e9571c6695569b61e7e4641af",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 07cec81a669917e2eb1531b2c054e04110781dfe Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29892) 
   * 7005798448a0cf4e9571c6695569b61e7e4641af Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30162) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18427:
URL: https://github.com/apache/flink/pull/18427#issuecomment-1017644617


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812",
       "triggerID" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "triggerType" : "PUSH"
     }, {
       "hash" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29876",
       "triggerID" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29885",
       "triggerID" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "07cec81a669917e2eb1531b2c054e04110781dfe",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29892",
       "triggerID" : "07cec81a669917e2eb1531b2c054e04110781dfe",
       "triggerType" : "PUSH"
     }, {
       "hash" : "7005798448a0cf4e9571c6695569b61e7e4641af",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30162",
       "triggerID" : "7005798448a0cf4e9571c6695569b61e7e4641af",
       "triggerType" : "PUSH"
     }, {
       "hash" : "228aec25fa06571e6e00e886da22e2598ca632b5",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "228aec25fa06571e6e00e886da22e2598ca632b5",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 07cec81a669917e2eb1531b2c054e04110781dfe Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29892) 
   * 7005798448a0cf4e9571c6695569b61e7e4641af Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30162) 
   * 228aec25fa06571e6e00e886da22e2598ca632b5 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] slinkydeveloper commented on a change in pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
slinkydeveloper commented on a change in pull request #18427:
URL: https://github.com/apache/flink/pull/18427#discussion_r790887451



##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/ConverterDelegatingDeserializer.java
##########
@@ -0,0 +1,98 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonParser;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.BeanDescription;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationConfig;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.JsonDeserializer;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerModifier;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.DelegatingDeserializer;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.module.SimpleModule;
+
+import java.io.IOException;
+import java.util.Arrays;
+import java.util.Map;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * Deserializer which delegates to the default {@link BeanDeserializer} and then executes custom
+ * code to perform a conversion to another final value.
+ *
+ * <p>Use the {@link Converter} when you want to use Jackson annotations for defining serializers
+ * and deserializers, but after the deserialization you need to perform an additional transformation
+ * step that doesn't depend on the original JSON, e.g. enrich the output value with info from {@link
+ * SerdeContext}.
+ */
+class ConverterDelegatingDeserializer<T, R> extends DelegatingDeserializer {

Review comment:
       This was an experiment; I will remove it once I get rid of `ClassLoader` and `ReadableConfig` in the specs.







[GitHub] [flink] slinkydeveloper commented on a change in pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
slinkydeveloper commented on a change in pull request #18427:
URL: https://github.com/apache/flink/pull/18427#discussion_r791765118



##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/ResolvedExpressionJsonSerializer.java
##########
@@ -0,0 +1,75 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.table.api.ValidationException;
+import org.apache.flink.table.expressions.ResolvedExpression;
+import org.apache.flink.table.planner.expressions.RexNodeExpression;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonGenerator;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.SerializerProvider;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ser.std.StdSerializer;
+
+import java.io.IOException;
+
+class ResolvedExpressionJsonSerializer extends StdSerializer<ResolvedExpression> {
+
+    public static final String TYPE = "type";
+    public static final String TYPE_REX_NODE_EXPRESSION = "rexNodeExpression";
+    public static final String REX_NODE = "rexNode";
+    public static final String OUTPUT_DATA_TYPE = "outputDataType";
+    public static final String SERIALIZABLE_STRING = "serializableString";
+
+    protected ResolvedExpressionJsonSerializer() {
+        super(ResolvedExpression.class);
+    }
+
+    @Override
+    public void serialize(
+            ResolvedExpression resolvedExpression,
+            JsonGenerator jsonGenerator,
+            SerializerProvider serializerProvider)
+            throws IOException {
+        jsonGenerator.writeStartObject();
+
+        if (resolvedExpression instanceof RexNodeExpression) {
+            serialize((RexNodeExpression) resolvedExpression, jsonGenerator, serializerProvider);
+        } else {
+            throw new ValidationException(
+                    String.format(
+                            "Expression '%s' cannot be serialized. "
+                                    + "Currently, only SQL expressions can be serialized in the persisted plan.",
+                            resolvedExpression.asSummaryString()));
+        }
+
+        jsonGenerator.writeEndObject();
+    }
+
+    private void serialize(
+            RexNodeExpression expression,
+            JsonGenerator jsonGenerator,
+            SerializerProvider serializerProvider)
+            throws IOException {
+        jsonGenerator.writeStringField(TYPE, TYPE_REX_NODE_EXPRESSION);
+        serializerProvider.defaultSerializeField(REX_NODE, expression.getRexNode(), jsonGenerator);
+        serializerProvider.defaultSerializeField(

Review comment:
       I managed to get rid of the type, but I can't trivially remove `serializableString`. In order to remove it, I tried to infer it from the `RexNode` using `SqlImplementor.SimpleContext` when deserializing. But, as you can see from the signature, I still need access to the input fields' SqlNode tree, for example to serialize an input ref. At this specific location, there is no easy way to do that.
   
   I also don't think it's safe to simply omit it: without it, the `Expression` tree becomes non-serializable. I suggest we keep it in this iteration, hence we serialize the fields `rexNode` and `serializableString`, and exclude `type` and `outputDataType`.
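
   Under that proposal, a serialized `RexNodeExpression` would roughly carry both representations side by side. The sketch below is illustrative only; the `rexNode` payload shown is a simplified placeholder, not the actual RexNode JSON format:

   ```json
   {
     "rexNode" : {
       "kind" : "CALL",
       "operator" : "+",
       "operands" : [
         { "kind" : "INPUT_REF", "inputIndex" : 0 },
         { "kind" : "LITERAL", "value" : 1 }
       ]
     },
     "serializableString" : "`a` + 1"
   }
   ```

   Keeping `serializableString` alongside `rexNode` trades a little redundancy in the plan for the guarantee that the expression tree can always be restored without reconstructing the SQL string from the input fields' SqlNode tree.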







[GitHub] [flink] flinkbot edited a comment on pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18427:
URL: https://github.com/apache/flink/pull/18427#issuecomment-1017644617


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812",
       "triggerID" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "triggerType" : "PUSH"
     }, {
       "hash" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29876",
       "triggerID" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29885",
       "triggerID" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "07cec81a669917e2eb1531b2c054e04110781dfe",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29892",
       "triggerID" : "07cec81a669917e2eb1531b2c054e04110781dfe",
       "triggerType" : "PUSH"
     }, {
       "hash" : "7005798448a0cf4e9571c6695569b61e7e4641af",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30162",
       "triggerID" : "7005798448a0cf4e9571c6695569b61e7e4641af",
       "triggerType" : "PUSH"
     }, {
       "hash" : "228aec25fa06571e6e00e886da22e2598ca632b5",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30173",
       "triggerID" : "228aec25fa06571e6e00e886da22e2598ca632b5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4b1fa7a01306c01a597b30f45c49d220fdc59dc7",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30311",
       "triggerID" : "4b1fa7a01306c01a597b30f45c49d220fdc59dc7",
       "triggerType" : "PUSH"
     }, {
       "hash" : "dfd7ce9e194c47511c26ee1030e2db044f128a12",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30318",
       "triggerID" : "dfd7ce9e194c47511c26ee1030e2db044f128a12",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 228aec25fa06571e6e00e886da22e2598ca632b5 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30173) 
   * 4b1fa7a01306c01a597b30f45c49d220fdc59dc7 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30311) 
   * dfd7ce9e194c47511c26ee1030e2db044f128a12 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30318) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot commented on pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
flinkbot commented on pull request #18427:
URL: https://github.com/apache/flink/pull/18427#issuecomment-1017644617


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * a69e8836becd5bbdedd183376c67dae35afc2960 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18427:
URL: https://github.com/apache/flink/pull/18427#issuecomment-1017644617


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812",
       "triggerID" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "triggerType" : "PUSH"
     }, {
       "hash" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29876",
       "triggerID" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29885",
       "triggerID" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "07cec81a669917e2eb1531b2c054e04110781dfe",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29892",
       "triggerID" : "07cec81a669917e2eb1531b2c054e04110781dfe",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * b83fd153ef3df414e6a7766e26ccd37f788d728e Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29876) 
   * 367e6ab469bb0c69923e73dc458b74650bef48b1 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29885) 
   * 07cec81a669917e2eb1531b2c054e04110781dfe Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29892) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18427:
URL: https://github.com/apache/flink/pull/18427#issuecomment-1017644617


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812",
       "triggerID" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "triggerType" : "PUSH"
     }, {
       "hash" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29876",
       "triggerID" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29885",
       "triggerID" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "07cec81a669917e2eb1531b2c054e04110781dfe",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29892",
       "triggerID" : "07cec81a669917e2eb1531b2c054e04110781dfe",
       "triggerType" : "PUSH"
     }, {
       "hash" : "7005798448a0cf4e9571c6695569b61e7e4641af",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30162",
       "triggerID" : "7005798448a0cf4e9571c6695569b61e7e4641af",
       "triggerType" : "PUSH"
     }, {
       "hash" : "228aec25fa06571e6e00e886da22e2598ca632b5",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "228aec25fa06571e6e00e886da22e2598ca632b5",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 07cec81a669917e2eb1531b2c054e04110781dfe Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29892) 
   * 7005798448a0cf4e9571c6695569b61e7e4641af Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30162) 
   * 228aec25fa06571e6e00e886da22e2598ca632b5 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18427:
URL: https://github.com/apache/flink/pull/18427#issuecomment-1017644617


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812",
       "triggerID" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "triggerType" : "PUSH"
     }, {
       "hash" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29876",
       "triggerID" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29885",
       "triggerID" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "07cec81a669917e2eb1531b2c054e04110781dfe",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29892",
       "triggerID" : "07cec81a669917e2eb1531b2c054e04110781dfe",
       "triggerType" : "PUSH"
     }, {
       "hash" : "7005798448a0cf4e9571c6695569b61e7e4641af",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30162",
       "triggerID" : "7005798448a0cf4e9571c6695569b61e7e4641af",
       "triggerType" : "PUSH"
     }, {
       "hash" : "228aec25fa06571e6e00e886da22e2598ca632b5",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30173",
       "triggerID" : "228aec25fa06571e6e00e886da22e2598ca632b5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4b1fa7a01306c01a597b30f45c49d220fdc59dc7",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30311",
       "triggerID" : "4b1fa7a01306c01a597b30f45c49d220fdc59dc7",
       "triggerType" : "PUSH"
     }, {
       "hash" : "dfd7ce9e194c47511c26ee1030e2db044f128a12",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30318",
       "triggerID" : "dfd7ce9e194c47511c26ee1030e2db044f128a12",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a8365736913376f03c3bb64e2a927e46d9dee10a",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "a8365736913376f03c3bb64e2a927e46d9dee10a",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 228aec25fa06571e6e00e886da22e2598ca632b5 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30173) 
   * 4b1fa7a01306c01a597b30f45c49d220fdc59dc7 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30311) 
   * dfd7ce9e194c47511c26ee1030e2db044f128a12 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30318) 
   * a8365736913376f03c3bb64e2a927e46d9dee10a UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18427:
URL: https://github.com/apache/flink/pull/18427#issuecomment-1017644617


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812",
       "triggerID" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "triggerType" : "PUSH"
     }, {
       "hash" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * a69e8836becd5bbdedd183376c67dae35afc2960 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812) 
   * b83fd153ef3df414e6a7766e26ccd37f788d728e UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18427:
URL: https://github.com/apache/flink/pull/18427#issuecomment-1017644617


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812",
       "triggerID" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "triggerType" : "PUSH"
     }, {
       "hash" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29876",
       "triggerID" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29885",
       "triggerID" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "07cec81a669917e2eb1531b2c054e04110781dfe",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29892",
       "triggerID" : "07cec81a669917e2eb1531b2c054e04110781dfe",
       "triggerType" : "PUSH"
     }, {
       "hash" : "7005798448a0cf4e9571c6695569b61e7e4641af",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30162",
       "triggerID" : "7005798448a0cf4e9571c6695569b61e7e4641af",
       "triggerType" : "PUSH"
     }, {
       "hash" : "228aec25fa06571e6e00e886da22e2598ca632b5",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30173",
       "triggerID" : "228aec25fa06571e6e00e886da22e2598ca632b5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4b1fa7a01306c01a597b30f45c49d220fdc59dc7",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30311",
       "triggerID" : "4b1fa7a01306c01a597b30f45c49d220fdc59dc7",
       "triggerType" : "PUSH"
     }, {
       "hash" : "dfd7ce9e194c47511c26ee1030e2db044f128a12",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30318",
       "triggerID" : "dfd7ce9e194c47511c26ee1030e2db044f128a12",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 228aec25fa06571e6e00e886da22e2598ca632b5 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30173) 
   * 4b1fa7a01306c01a597b30f45c49d220fdc59dc7 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30311) 
   * dfd7ce9e194c47511c26ee1030e2db044f128a12 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30318) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] slinkydeveloper commented on a change in pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
slinkydeveloper commented on a change in pull request #18427:
URL: https://github.com/apache/flink/pull/18427#discussion_r793634030



##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/spec/DynamicTableSourceSpec.java
##########
@@ -110,39 +106,73 @@ private DynamicTableSource getTableSource(FlinkContext flinkContext) {
         return tableSource;
     }
 
-    @JsonIgnore
     public ScanTableSource getScanTableSource(FlinkContext flinkContext) {
         DynamicTableSource tableSource = getTableSource(flinkContext);
         if (tableSource instanceof ScanTableSource) {
             return (ScanTableSource) tableSource;
         } else {
             throw new TableException(
                     String.format(
-                            "%s is not a ScanTableSource.\nplease check it.",
+                            "%s is not a ScanTableSource.\n" + "please check it.",

Review comment:
       Sometimes I forget we have an automated formatter :laughing:




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18427:
URL: https://github.com/apache/flink/pull/18427#issuecomment-1017644617


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812",
       "triggerID" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "triggerType" : "PUSH"
     }, {
       "hash" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29876",
       "triggerID" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29885",
       "triggerID" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9857ee8f41ffb6220d5c4c0ad7cd4e84806f740d",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "9857ee8f41ffb6220d5c4c0ad7cd4e84806f740d",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * a69e8836becd5bbdedd183376c67dae35afc2960 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812) 
   * b83fd153ef3df414e6a7766e26ccd37f788d728e Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29876) 
   * 367e6ab469bb0c69923e73dc458b74650bef48b1 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29885) 
   * 9857ee8f41ffb6220d5c4c0ad7cd4e84806f740d UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18427:
URL: https://github.com/apache/flink/pull/18427#issuecomment-1017644617


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812",
       "triggerID" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "triggerType" : "PUSH"
     }, {
       "hash" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29876",
       "triggerID" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29885",
       "triggerID" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * a69e8836becd5bbdedd183376c67dae35afc2960 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812) 
   * b83fd153ef3df414e6a7766e26ccd37f788d728e Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29876) 
   * 367e6ab469bb0c69923e73dc458b74650bef48b1 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29885) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18427:
URL: https://github.com/apache/flink/pull/18427#issuecomment-1017644617


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812",
       "triggerID" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "triggerType" : "PUSH"
     }, {
       "hash" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29876",
       "triggerID" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29885",
       "triggerID" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "07cec81a669917e2eb1531b2c054e04110781dfe",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29892",
       "triggerID" : "07cec81a669917e2eb1531b2c054e04110781dfe",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 367e6ab469bb0c69923e73dc458b74650bef48b1 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29885) 
   * 07cec81a669917e2eb1531b2c054e04110781dfe Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29892) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18427:
URL: https://github.com/apache/flink/pull/18427#issuecomment-1017644617


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812",
       "triggerID" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "triggerType" : "PUSH"
     }, {
       "hash" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29876",
       "triggerID" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29885",
       "triggerID" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "07cec81a669917e2eb1531b2c054e04110781dfe",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "07cec81a669917e2eb1531b2c054e04110781dfe",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * a69e8836becd5bbdedd183376c67dae35afc2960 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812) 
   * b83fd153ef3df414e6a7766e26ccd37f788d728e Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29876) 
   * 367e6ab469bb0c69923e73dc458b74650bef48b1 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29885) 
   * 07cec81a669917e2eb1531b2c054e04110781dfe UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18427:
URL: https://github.com/apache/flink/pull/18427#issuecomment-1017644617


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812",
       "triggerID" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "triggerType" : "PUSH"
     }, {
       "hash" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29876",
       "triggerID" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29885",
       "triggerID" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "07cec81a669917e2eb1531b2c054e04110781dfe",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29892",
       "triggerID" : "07cec81a669917e2eb1531b2c054e04110781dfe",
       "triggerType" : "PUSH"
     }, {
       "hash" : "7005798448a0cf4e9571c6695569b61e7e4641af",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30162",
       "triggerID" : "7005798448a0cf4e9571c6695569b61e7e4641af",
       "triggerType" : "PUSH"
     }, {
       "hash" : "228aec25fa06571e6e00e886da22e2598ca632b5",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30173",
       "triggerID" : "228aec25fa06571e6e00e886da22e2598ca632b5",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 7005798448a0cf4e9571c6695569b61e7e4641af Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30162) 
   * 228aec25fa06571e6e00e886da22e2598ca632b5 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30173) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] twalthr commented on a change in pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
twalthr commented on a change in pull request #18427:
URL: https://github.com/apache/flink/pull/18427#discussion_r790700555



##########
File path: flink-table/flink-table-common/src/main/java/org/apache/flink/table/factories/DynamicTableFactory.java
##########
@@ -80,7 +80,25 @@
     @PublicEvolving
     interface Context {
 
-        /** Returns the identifier of the table in the {@link Catalog}. */
+        /**
+         * Returns the identifier of the table in the {@link Catalog}.
+         *
+         * <p>This identifier defines the relationship between the table instance and the associated
+         * {@link Catalog} (if any), but it doesn't uniquely identify this specific table
+         * setup/instance across a table program, as the same table might be stored in different
+         * catalogs or, in case of anonymous tables, this identifier is auto-generated
+         * non-deterministic. Because of that behaviour, We strongly suggest using this identifier

Review comment:
       `behaviour, We`
   
   Similar comment as before: Try to split long sentences. It is very uncommon in English to have long nested sentences:
   
   ```
   This identifier defines the relationship between the table instance and the associated {@link Catalog} (if any). However, it doesn't uniquely identify this specific table setup/instance across a table program. The same table might be stored in different catalogs or, in case of anonymous tables, this identifier is auto-generated and non-deterministic.
   ```

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/expressions/RexNodeExpression.java
##########
@@ -108,4 +109,23 @@ public DataType getOutputDataType() {
     public String toString() {
         return asSummaryString();
     }
+
+    @Override
+    public boolean equals(Object o) {
+        if (this == o) {
+            return true;
+        }
+        if (o == null || getClass() != o.getClass()) {
+            return false;
+        }
+        RexNodeExpression that = (RexNodeExpression) o;
+        return Objects.equals(rexNode, that.rexNode)

Review comment:
       nit: `rexNode` and `outputDataType` cannot be null and could use `equals` directly 
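
   The point of this nit can be sketched with a small stand-in class (the field names mirror `RexNodeExpression`, but `Expr` and its `String` fields are hypothetical, not the real Flink types): when a constructor invariant guarantees non-null fields, `equals()` can call `field.equals()` directly instead of the null-tolerant `Objects.equals()`.

   ```java
   import java.util.Objects;

   // Hypothetical stand-in for RexNodeExpression: both fields are required,
   // so the constructor rejects null and equals() may call field.equals()
   // directly rather than the null-tolerant Objects.equals().
   final class Expr {
       private final String rexNode;        // placeholder for the RexNode field
       private final String outputDataType; // placeholder for the DataType field

       Expr(String rexNode, String outputDataType) {
           this.rexNode = Objects.requireNonNull(rexNode);
           this.outputDataType = Objects.requireNonNull(outputDataType);
       }

       @Override
       public boolean equals(Object o) {
           if (this == o) {
               return true;
           }
           if (o == null || getClass() != o.getClass()) {
               return false;
           }
           Expr that = (Expr) o;
           // Direct equals() is safe: the constructor invariant rules out null.
           return rexNode.equals(that.rexNode)
                   && outputDataType.equals(that.outputDataType);
       }

       @Override
       public int hashCode() {
           return Objects.hash(rexNode, outputDataType);
       }
   }
   ```

   Marking the class `final` (as suggested for the other classes in this commit) also makes the `getClass() != o.getClass()` check the only subclassing concern.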

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/abilities/source/AggregatePushDownSpec.java
##########
@@ -204,4 +205,26 @@ public static boolean apply(
         }
         return aggExpressions;
     }
+
+    @Override
+    public boolean equals(Object o) {
+        if (this == o) {
+            return true;
+        }
+        if (o == null || getClass() != o.getClass()) {
+            return false;
+        }
+        if (!super.equals(o)) {

Review comment:
       mark the class as `final` and other classes of this commit as `final`?

##########
File path: flink-table/flink-table-common/src/main/java/org/apache/flink/table/factories/DynamicTableFactory.java
##########
@@ -80,7 +80,25 @@
     @PublicEvolving
     interface Context {
 
-        /** Returns the identifier of the table in the {@link Catalog}. */
+        /**
+         * Returns the identifier of the table in the {@link Catalog}.
+         *
+         * <p>This identifier defines the relationship between the table instance and the associated
+         * {@link Catalog} (if any), but it doesn't uniquely identify this specific table
+         * setup/instance across a table program, as the same table might be stored in different
+         * catalogs or, in case of anonymous tables, this identifier is auto-generated
+         * non-deterministic. Because of that behaviour, We strongly suggest using this identifier
+         * only for debugging purpose, and rely on user input for uniquely identifying a "table
+         * instance".
+         *
+         * <p>For example, when implementing a Kafka source using consumer groups, the user should
+         * provide the consumer group id manually rather than using this identifier as the consumer
+         * group id, so the offset tracking remains stable even if this table is anonymous, or it's
+         * moved to another {@link Catalog}.
+         *
+         * <p>Note that for anonymous tables {@link ObjectIdentifier#asSerializableString()} will
+         * fail, so we suggest to use {@link ObjectIdentifier#asSummaryString()} for debugging.

Review comment:
       `debugging` -> `printing and logging`

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/ColumnJsonSerializer.java
##########
@@ -0,0 +1,98 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.table.catalog.Column;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonGenerator;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.SerializerProvider;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ser.std.StdSerializer;
+
+import java.io.IOException;
+
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.JsonSerdeUtil.serializeOptionalField;
+
+class ColumnJsonSerializer extends StdSerializer<Column> {
+
+    public static final String COLUMN_TYPE = "type";
+    public static final String COLUMN_TYPE_PHYSICAL = "physical";
+    public static final String COLUMN_TYPE_COMPUTED = "computed";
+    public static final String COLUMN_TYPE_METADATA = "metadata";
+    public static final String NAME = "name";
+    public static final String DATA_TYPE = "dataType";
+    public static final String COMMENT = "comment";
+    public static final String EXPRESSION = "expression";
+    public static final String METADATA_KEY = "metadataKey";
+    public static final String IS_VIRTUAL = "isVirtual";
+
+    public ColumnJsonSerializer() {
+        super(Column.class);
+    }
+
+    @Override
+    public void serialize(
+            Column column, JsonGenerator jsonGenerator, SerializerProvider serializerProvider)
+            throws IOException {
+        jsonGenerator.writeStartObject();
+
+        // Common fields
+        jsonGenerator.writeStringField(NAME, column.getName());
+        serializeOptionalField(jsonGenerator, COMMENT, column.getComment(), serializerProvider);
+
+        if (column instanceof Column.PhysicalColumn) {
+            serialize((Column.PhysicalColumn) column, jsonGenerator, serializerProvider);
+        } else if (column instanceof Column.MetadataColumn) {
+            serialize((Column.MetadataColumn) column, jsonGenerator, serializerProvider);
+        } else {

Review comment:
       use `else if` here
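
   The suggestion amounts to making every known subclass an explicit `else if` branch and reserving the trailing `else` for a fail-fast error, so an unexpected subclass cannot be silently serialized under the wrong kind. A minimal sketch (the `Column` hierarchy below is a hypothetical stand-in for the Flink classes, not the real API):

   ```java
   // Minimal stand-ins for Column and its subclasses, only to illustrate the dispatch.
   abstract class Column {}
   final class PhysicalColumn extends Column {}
   final class ComputedColumn extends Column {}
   final class MetadataColumn extends Column {}

   class ColumnDispatch {
       // Each known subclass gets its own `else if`; the final `else` throws,
       // so adding a new Column subclass fails loudly instead of falling
       // through to the "computed" branch by accident.
       static String kindOf(Column column) {
           if (column instanceof PhysicalColumn) {
               return "physical";
           } else if (column instanceof MetadataColumn) {
               return "metadata";
           } else if (column instanceof ComputedColumn) {
               return "computed";
           } else {
               throw new IllegalStateException(
                       "Unknown column class: " + column.getClass().getName());
           }
       }
   }
   ```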

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/ColumnJsonDeserializer.java
##########
@@ -0,0 +1,117 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.table.api.ValidationException;
+import org.apache.flink.table.catalog.Column;
+import org.apache.flink.table.expressions.ResolvedExpression;
+import org.apache.flink.table.types.DataType;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonParser;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.ObjectCodec;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.StdDeserializer;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ObjectNode;
+
+import java.io.IOException;
+import java.util.Arrays;
+
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.ColumnJsonSerializer.COLUMN_TYPE;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.ColumnJsonSerializer.COLUMN_TYPE_COMPUTED;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.ColumnJsonSerializer.COLUMN_TYPE_METADATA;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.ColumnJsonSerializer.COLUMN_TYPE_PHYSICAL;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.ColumnJsonSerializer.COMMENT;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.ColumnJsonSerializer.DATA_TYPE;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.ColumnJsonSerializer.EXPRESSION;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.ColumnJsonSerializer.IS_VIRTUAL;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.ColumnJsonSerializer.METADATA_KEY;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.ColumnJsonSerializer.NAME;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.JsonSerdeUtil.deserializeOptionalField;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.JsonSerdeUtil.traverse;
+
+class ColumnJsonDeserializer extends StdDeserializer<Column> {
+
+    private static final String[] SUPPORTED_COLUMN_TYPES =

Review comment:
       nit: I would recommend `KIND` instead of `TYPE`; it makes discussions easier.

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/spec/DynamicTableSinkSpec.java
##########
@@ -36,32 +37,56 @@
 import javax.annotation.Nullable;
 
 import java.util.List;
+import java.util.Objects;
 
 /**
  * {@link DynamicTableSourceSpec} describes how to serialize/deserialize dynamic table sink table
  * and create {@link DynamicTableSink} from the deserialization result.
  */
 @JsonIgnoreProperties(ignoreUnknown = true)
 @JsonInclude(JsonInclude.Include.NON_EMPTY)
-public class DynamicTableSinkSpec extends CatalogTableSpecBase {
+public class DynamicTableSinkSpec {
 
+    public static final String FIELD_NAME_CATALOG_TABLE_SPEC = "catalogTable";

Review comment:
       `FIELD_NAME_CATALOG_TABLE` because it is not a spec

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/ResolvedCatalogTableJsonSerializer.java
##########
@@ -0,0 +1,86 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.table.api.ValidationException;
+import org.apache.flink.table.catalog.CatalogBaseTable;
+import org.apache.flink.table.catalog.ExternalCatalogTable;
+import org.apache.flink.table.catalog.ResolvedCatalogTable;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonGenerator;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.SerializerProvider;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ser.std.StdSerializer;
+
+import java.io.IOException;
+
+/**
+ * This serializer can be configured via an attribute to serialize or not the options and comments,
+ * setting the attribute {@link #SERIALIZE_OPTIONS} to {@code true} or {@code false}.
+ */
+class ResolvedCatalogTableJsonSerializer extends StdSerializer<ResolvedCatalogTable> {
+    private static final long serialVersionUID = 1L;
+
+    static final String SERIALIZE_OPTIONS = "serialize_options";
+
+    public static final String RESOLVED_SCHEMA = "resolvedSchema";
+    public static final String PARTITION_KEYS = "partitionKeys";
+    public static final String OPTIONS = "options";
+    public static final String COMMENT = "comment";
+
+    public ResolvedCatalogTableJsonSerializer() {
+        super(ResolvedCatalogTable.class);
+    }
+
+    @Override
+    public void serialize(
+            ResolvedCatalogTable resolvedCatalogTable,
+            JsonGenerator jsonGenerator,
+            SerializerProvider serializerProvider)
+            throws IOException {
+        // Thia should never happen anyway, but we keep this assertion for sanity check
+        assert resolvedCatalogTable.getTableKind() == CatalogBaseTable.TableKind.TABLE;
+
+        boolean serializeOptions =

Review comment:
       I find it confusing to have a second way of reading configuration. Why not use the `SerdeContext` instead? When people search for usages of the table option, they will not find usages that go through a second config stack.
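
   The alternative being suggested can be sketched as a single lookup path through the shared context, so grepping for the option key finds every consumer. Everything below is a hypothetical, simplified sketch: `SerdeContextSketch` stands in for `SerdeContext`/`ReadableConfig`, and the option key string is illustrative rather than the exact Flink key.

   ```java
   import java.util.Map;

   // Hypothetical stand-in for SerdeContext: one configuration object that
   // all ser/de code shares, instead of a parallel Jackson-attribute stack.
   final class SerdeContextSketch {
       private final Map<String, Object> configuration;

       SerdeContextSketch(Map<String, Object> configuration) {
           this.configuration = configuration;
       }

       // Mirrors ReadableConfig#get(option) with a default value.
       @SuppressWarnings("unchecked")
       <T> T get(String key, T defaultValue) {
           return (T) configuration.getOrDefault(key, defaultValue);
       }
   }

   class CatalogTableSerializerSketch {
       // Illustrative option key modeled on TableConfigOptions; not the real name.
       static final String PLAN_COMPILE_CATALOG_OBJECTS =
               "table.plan.compile.catalog-objects";

       static boolean shouldSerializeOptions(SerdeContextSketch ctx) {
           // Single lookup path: searching for the option key finds this usage.
           return ctx.get(PLAN_COMPILE_CATALOG_OBJECTS, "ALL").equals("ALL");
       }
   }
   ```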

##########
File path: flink-table/flink-table-planner/src/test/resources/jsonplan/testGetJsonPlan.out
##########
@@ -1,76 +1,110 @@
 {
-   "flinkVersion":"",
-   "nodes":[
-      {
-         "class":"org.apache.flink.table.planner.plan.nodes.exec.stream.StreamExecTableSourceScan",
-         "scanTableSource":{
-            "identifier":{
-               "catalogName":"default_catalog",
-               "databaseName":"default_database",
-               "tableName":"MyTable"
+  "flinkVersion": "",
+  "nodes": [
+    {
+      "class": "org.apache.flink.table.planner.plan.nodes.exec.stream.StreamExecTableSourceScan",
+      "scanTableSource": {
+        "catalogTable": {
+          "identifier": "`default_catalog`.`default_database`.`MyTable`",
+          "catalogTable": {
+            "resolvedSchema": {
+              "columns": [
+                {
+                  "name": "a",
+                  "type": "physical",
+                  "dataType": "BIGINT"
+                },
+                {
+                  "name": "b",
+                  "type": "physical",

Review comment:
       let's still try to comply with `CatalogPropertiesUtil#serializeCatalogTable`. E.g. we can omit `"type": "physical"`

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/ContextResolvedTableJsonDeserializer.java
##########
@@ -0,0 +1,220 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.table.api.ValidationException;
+import org.apache.flink.table.api.config.TableConfigOptions;
+import org.apache.flink.table.api.config.TableConfigOptions.CatalogPlanRestore;
+import org.apache.flink.table.catalog.CatalogManager;
+import org.apache.flink.table.catalog.ContextResolvedTable;
+import org.apache.flink.table.catalog.ObjectIdentifier;
+import org.apache.flink.table.catalog.ResolvedCatalogTable;
+import org.apache.flink.table.catalog.ResolvedSchema;
+import org.apache.flink.table.types.DataType;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonParser;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.StdDeserializer;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ObjectNode;
+
+import java.io.IOException;
+import java.util.List;
+import java.util.Objects;
+import java.util.Optional;
+import java.util.stream.Collectors;
+
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.ContextResolvedTableJsonSerializer.FIELD_NAME_CATALOG_TABLE;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.ContextResolvedTableJsonSerializer.FIELD_NAME_IDENTIFIER;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.ResolvedCatalogTableJsonSerializer.OPTIONS;
+
+class ContextResolvedTableJsonDeserializer extends StdDeserializer<ContextResolvedTable> {
+    private static final long serialVersionUID = 1L;
+
+    public ContextResolvedTableJsonDeserializer() {
+        super(ContextResolvedTable.class);
+    }
+
+    @Override
+    public ContextResolvedTable deserialize(JsonParser jsonParser, DeserializationContext ctx)
+            throws IOException {
+        final CatalogPlanRestore planRestoreOption =
+                SerdeContext.get(ctx)
+                        .getConfiguration()
+                        .get(TableConfigOptions.PLAN_RESTORE_CATALOG_OBJECTS);
+        final CatalogManager catalogManager =
+                SerdeContext.get(ctx).getFlinkContext().getCatalogManager();
+        final ObjectNode objectNode = jsonParser.readValueAsTree();
+
+        // Deserialize the two fields, if available
+        final ObjectIdentifier identifier =
+                JsonSerdeUtil.deserializeOptionalField(
+                                objectNode,
+                                FIELD_NAME_IDENTIFIER,
+                                ObjectIdentifier.class,
+                                jsonParser.getCodec(),
+                                ctx)
+                        .orElse(null);
+        ResolvedCatalogTable resolvedCatalogTable =
+                JsonSerdeUtil.deserializeOptionalField(
+                                objectNode,
+                                FIELD_NAME_CATALOG_TABLE,
+                                ResolvedCatalogTable.class,
+                                jsonParser.getCodec(),
+                                ctx)
+                        .orElse(null);
+
+        if (identifier == null && resolvedCatalogTable == null) {
+            throw new ValidationException(
+                    String.format(
+                            "The input JSON is invalid because it contains neither '%s' nor '%s'.",
+                            FIELD_NAME_IDENTIFIER, FIELD_NAME_CATALOG_TABLE));
+        }
+
+        if (identifier == null) {
+            if (isLookupForced(planRestoreOption)) {
+                throw missingIdentifier();
+            }
+            return ContextResolvedTable.anonymous(resolvedCatalogTable);
+        }
+
+        Optional<ContextResolvedTable> contextResolvedTableFromCatalog =
+                isLookupEnabled(planRestoreOption)
+                        ? catalogManager.getTable(identifier)
+                        : Optional.empty();
+
+        // If we have a schema from the plan and from the catalog, we need to check they match.
+        if (contextResolvedTableFromCatalog.isPresent() && resolvedCatalogTable != null) {
+            ResolvedSchema schemaFromPlan = resolvedCatalogTable.getResolvedSchema();
+            ResolvedSchema schemaFromCatalog =
+                    contextResolvedTableFromCatalog.get().getResolvedSchema();
+            if (!areResolvedSchemasEqual(schemaFromPlan, schemaFromCatalog)) {
+                throw schemaNotMatching(identifier, schemaFromPlan, schemaFromCatalog);
+            }
+        }
+
+        if (resolvedCatalogTable == null || isLookupForced(planRestoreOption)) {
+            if (!isLookupEnabled(planRestoreOption)) {
+                throw lookupDisabled(identifier);
+            }
+            // We use what is stored inside the catalog
+            return contextResolvedTableFromCatalog.orElseThrow(
+                    () -> missingTableFromCatalog(identifier));
+        }
+
+        if (contextResolvedTableFromCatalog.isPresent()) {
+            // If no config map is present, then the ContextResolvedTable was serialized with
+            // SCHEMA, so we just need to return the catalog query result
+            if (objectNode.at("/" + FIELD_NAME_CATALOG_TABLE + "/" + OPTIONS).isMissingNode()) {
+                return contextResolvedTableFromCatalog.get();
+            }
+
+            return contextResolvedTableFromCatalog
+                    .flatMap(ContextResolvedTable::getCatalog)
+                    .map(c -> ContextResolvedTable.permanent(identifier, c, resolvedCatalogTable))
+                    .orElseGet(
+                            () -> ContextResolvedTable.temporary(identifier, resolvedCatalogTable));
+        }
+
+        return ContextResolvedTable.temporary(identifier, resolvedCatalogTable);
+    }
+
+    private boolean areResolvedSchemasEqual(
+            ResolvedSchema schemaFromPlan, ResolvedSchema schemaFromCatalog) {
+        // For schema equality we check:
+        //  * Columns size and order
+        //  * For each column: name, kind (class) and type
+        //  * Primary key equality
+        @SuppressWarnings("rawtypes")
+        List<Tuple3<String, Class, DataType>> columnsFromPlan =
+                schemaFromPlan.getColumns().stream()
+                        .map(c -> Tuple3.of(c.getName(), (Class) c.getClass(), c.getDataType()))
+                        .collect(Collectors.toList());
+
+        @SuppressWarnings("rawtypes")
+        List<Tuple3<String, Class, DataType>> columnsFromCatalog =
+                schemaFromCatalog.getColumns().stream()
+                        .map(c -> Tuple3.of(c.getName(), (Class) c.getClass(), c.getDataType()))
+                        .collect(Collectors.toList());
+
+        return Objects.equals(columnsFromPlan, columnsFromCatalog)
+                && Objects.equals(
+                        schemaFromPlan.getPrimaryKey(), schemaFromCatalog.getPrimaryKey());
+    }
+
+    private boolean isLookupForced(CatalogPlanRestore planRestoreOption) {
+        return planRestoreOption == CatalogPlanRestore.IDENTIFIER;
+    }
+
+    private boolean isLookupEnabled(CatalogPlanRestore planRestoreOption) {
+        return planRestoreOption != CatalogPlanRestore.ALL_ENFORCED;
+    }
+
+    static ValidationException missingIdentifier() {
+        return new ValidationException(
+                String.format(
+                        "The ContextResolvedTable cannot be deserialized, as no identifier is present within the json, "
+                                + "but lookup is forced by '%s' == '%s'. "
+                                + "Either allow restoring the table from the catalog with '%s' == '%s' | '%s' or make sure you don't use anonymous tables when generating the plan.",
+                        TableConfigOptions.PLAN_RESTORE_CATALOG_OBJECTS.key(),
+                        CatalogPlanRestore.IDENTIFIER.name(),
+                        TableConfigOptions.PLAN_RESTORE_CATALOG_OBJECTS.key(),
+                        CatalogPlanRestore.ALL.name(),
+                        CatalogPlanRestore.ALL_ENFORCED.name()));
+    }
+
+    static ValidationException lookupDisabled(ObjectIdentifier objectIdentifier) {
+        return new ValidationException(
+                String.format(
+                        "The ContextResolvedTable with identifier %s does not contain any %s field, "
+                                + "but lookup is disabled because option '%s' == '%s'. "
+                                + "Either enable the catalog lookup with '%s' == '%s' | '%s' or regenerate the plan with '%s' != '%s'.",
+                        objectIdentifier,
+                        FIELD_NAME_CATALOG_TABLE,
+                        TableConfigOptions.PLAN_RESTORE_CATALOG_OBJECTS.key(),
+                        CatalogPlanRestore.ALL_ENFORCED.name(),
+                        TableConfigOptions.PLAN_RESTORE_CATALOG_OBJECTS.key(),
+                        CatalogPlanRestore.IDENTIFIER.name(),
+                        CatalogPlanRestore.ALL.name(),
+                        TableConfigOptions.PLAN_COMPILE_CATALOG_OBJECTS.key(),
+                        TableConfigOptions.CatalogPlanCompilation.IDENTIFIER.name()));
+    }
+
+    static ValidationException schemaNotMatching(
+            ObjectIdentifier objectIdentifier,
+            ResolvedSchema schemaFromPlan,
+            ResolvedSchema schemaFromCatalog) {
+        return new ValidationException(
+                String.format(
+                        "The schema of the table '%s' from the persisted plan does not match the schema loaded from the catalog: '%s' != '%s'. "
+                                + "Have you modified the table schema in the catalog before restoring the plan?",
+                        objectIdentifier, schemaFromPlan, schemaFromCatalog));
+    }
+
+    static ValidationException missingTableFromCatalog(ObjectIdentifier objectIdentifier) {
+        return new ValidationException(
+                String.format(
+                        "CatalogManager cannot resolve the table with identifier %s and ContextResolvedTable does not contain any %s field. "

Review comment:
       Don't expose internal classes such as `CatalogManager` or `ContextResolvedTable` in exceptions. Also above.
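       A rough sketch of what I mean (the class and method names below are made up, not from this PR): phrase the error in terms of the user-facing table identifier instead of planner internals.

       ```java
       // Hypothetical helper: the message talks about "the table" and its
       // identifier instead of leaking internal class names such as
       // CatalogManager or ContextResolvedTable.
       class UserFacingErrors {

           static String missingTableFromCatalog(String identifier) {
               return String.format(
                       "Cannot resolve the table '%s' from the catalog, and the persisted plan "
                               + "does not contain enough information to recreate it.",
                       identifier);
           }
       }
       ```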

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/ContextResolvedTableJsonSerializer.java
##########
@@ -0,0 +1,98 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.table.api.ValidationException;
+import org.apache.flink.table.api.config.TableConfigOptions;
+import org.apache.flink.table.api.config.TableConfigOptions.CatalogPlanCompilation;
+import org.apache.flink.table.catalog.ContextResolvedTable;
+import org.apache.flink.table.catalog.ObjectIdentifier;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonGenerator;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.SerializerProvider;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ser.std.StdSerializer;
+
+import java.io.IOException;
+
+/** JSON serializer for {@link ContextResolvedTable}. */
+class ContextResolvedTableJsonSerializer extends StdSerializer<ContextResolvedTable> {
+    private static final long serialVersionUID = 1L;
+
+    public static final String FIELD_NAME_IDENTIFIER = "identifier";
+    public static final String FIELD_NAME_CATALOG_TABLE = "catalogTable";
+
+    public ContextResolvedTableJsonSerializer() {
+        super(ContextResolvedTable.class);
+    }
+
+    @Override
+    public void serialize(
+            ContextResolvedTable contextResolvedTable,
+            JsonGenerator jsonGenerator,
+            SerializerProvider serializerProvider)
+            throws IOException {
+        final CatalogPlanCompilation planCompilationOption =
+                SerdeContext.get(serializerProvider)
+                        .getConfiguration()
+                        .get(TableConfigOptions.PLAN_COMPILE_CATALOG_OBJECTS);
+
+        if (contextResolvedTable.isAnonymous()
+                && planCompilationOption == CatalogPlanCompilation.IDENTIFIER) {
+            throw cannotSerializeAnonymousTable(contextResolvedTable.getIdentifier());
+        }
+
+        jsonGenerator.writeStartObject();
+
+        if (!contextResolvedTable.isAnonymous()) {
+            // Serialize object identifier
+            jsonGenerator.writeObjectField(
+                    FIELD_NAME_IDENTIFIER, contextResolvedTable.getIdentifier());
+        }
+
+        if ((contextResolvedTable.isPermanent() || contextResolvedTable.isAnonymous())
+                && planCompilationOption != CatalogPlanCompilation.IDENTIFIER) {
+            // Tell the ResolvedCatalogTableJsonSerializer whether to
+            // serialize the table options
+            serializerProvider.setAttribute(

Review comment:
       I really find it weird that a serializer sets attributes.
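       As an illustration of the alternative (a simplified sketch, not Jackson and not this PR's types): thread an explicit, immutable per-call options object through the serialization entry point instead of mutating shared serializer state.

       ```java
       import java.util.Map;

       // Hypothetical per-call options object: immutable, passed explicitly.
       final class SerdeOptions {
           final boolean serializeOptions;

           SerdeOptions(boolean serializeOptions) {
               this.serializeOptions = serializeOptions;
           }
       }

       interface TableSerializer {
           String serialize(Map<String, String> options, SerdeOptions serdeOptions);
       }

       final class SimpleTableSerializer implements TableSerializer {
           @Override
           public String serialize(Map<String, String> options, SerdeOptions serdeOptions) {
               // The decision to include the options map is local to this call;
               // no other serializer can observe or overwrite it.
               return serdeOptions.serializeOptions ? options.toString() : "{}";
           }
       }
       ```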

##########
File path: flink-table/flink-table-planner/src/test/resources/org/apache/flink/table/planner/plan/nodes/exec/stream/GroupWindowAggregateJsonPlanTest_jsonplan/testEventTimeHopWindow.out
##########
@@ -3,28 +3,117 @@
   "nodes" : [ {
     "class" : "org.apache.flink.table.planner.plan.nodes.exec.stream.StreamExecTableSourceScan",
     "scanTableSource" : {
-      "identifier" : {
-        "catalogName" : "default_catalog",
-        "databaseName" : "default_database",
-        "tableName" : "MyTable"
-      },
       "catalogTable" : {
-        "schema.watermark.0.strategy.expr" : "`rowtime` - INTERVAL '1' SECOND",
-        "schema.4.expr" : "PROCTIME()",
-        "schema.0.data-type" : "INT",
-        "schema.2.name" : "c",
-        "schema.1.name" : "b",
-        "schema.4.name" : "proctime",
-        "schema.1.data-type" : "BIGINT",
-        "schema.3.data-type" : "TIMESTAMP(3)",
-        "schema.2.data-type" : "VARCHAR(2147483647)",
-        "schema.3.name" : "rowtime",
-        "connector" : "values",
-        "schema.watermark.0.rowtime" : "rowtime",
-        "schema.watermark.0.strategy.data-type" : "TIMESTAMP(3)",
-        "schema.3.expr" : "TO_TIMESTAMP(`c`)",
-        "schema.4.data-type" : "TIMESTAMP(3) WITH LOCAL TIME ZONE NOT NULL",
-        "schema.0.name" : "a"
+        "identifier" : "`default_catalog`.`default_database`.`MyTable`",
+        "catalogTable" : {
+          "resolvedSchema" : {
+            "columns" : [ {
+              "name" : "a",
+              "type" : "physical",
+              "dataType" : "INT"
+            }, {
+              "name" : "b",
+              "type" : "physical",
+              "dataType" : "BIGINT"
+            }, {
+              "name" : "c",
+              "type" : "physical",
+              "dataType" : {
+                "logicalType" : "VARCHAR(2147483647)",
+                "conversionClass" : "java.lang.String"
+              }
+            }, {
+              "name" : "rowtime",
+              "type" : "computed",
+              "expression" : {
+                "type" : "rexNodeExpression",

Review comment:
       let's remove `"type" : "rexNodeExpression"` until we have something other than `RexNodeExpression`

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/ResolvedExpressionJsonSerializer.java
##########
@@ -0,0 +1,75 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.table.api.ValidationException;
+import org.apache.flink.table.expressions.ResolvedExpression;
+import org.apache.flink.table.planner.expressions.RexNodeExpression;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonGenerator;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.SerializerProvider;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ser.std.StdSerializer;
+
+import java.io.IOException;
+
+class ResolvedExpressionJsonSerializer extends StdSerializer<ResolvedExpression> {
+
+    public static final String TYPE = "type";
+    public static final String TYPE_REX_NODE_EXPRESSION = "rexNodeExpression";
+    public static final String REX_NODE = "rexNode";
+    public static final String OUTPUT_DATA_TYPE = "outputDataType";
+    public static final String SERIALIZABLE_STRING = "serializableString";
+
+    protected ResolvedExpressionJsonSerializer() {
+        super(ResolvedExpression.class);
+    }
+
+    @Override
+    public void serialize(
+            ResolvedExpression resolvedExpression,
+            JsonGenerator jsonGenerator,
+            SerializerProvider serializerProvider)
+            throws IOException {
+        jsonGenerator.writeStartObject();
+
+        if (resolvedExpression instanceof RexNodeExpression) {
+            serialize((RexNodeExpression) resolvedExpression, jsonGenerator, serializerProvider);
+        } else {
+            throw new ValidationException(
+                    String.format(
+                            "Expression '%s' cannot be serialized. "
+                                    + "Currently, only SQL expressions can be serialized in the persisted plan.",
+                            resolvedExpression.asSummaryString()));
+        }
+
+        jsonGenerator.writeEndObject();
+    }
+
+    private void serialize(
+            RexNodeExpression expression,
+            JsonGenerator jsonGenerator,
+            SerializerProvider serializerProvider)
+            throws IOException {
+        jsonGenerator.writeStringField(TYPE, TYPE_REX_NODE_EXPRESSION);
+        serializerProvider.defaultSerializeField(REX_NODE, expression.getRexNode(), jsonGenerator);
+        serializerProvider.defaultSerializeField(

Review comment:
       Can't we derive the type from the `RexNode`? I have the feeling we are duplicating a lot. `serializableString` and `rexNode` are also somewhat duplicated, given that they are no longer required once they are persisted in the operator following the plan.

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/spec/DynamicTableSinkSpec.java
##########
@@ -36,32 +37,56 @@
 import javax.annotation.Nullable;
 
 import java.util.List;
+import java.util.Objects;
 
 /**
  * {@link DynamicTableSinkSpec} describes how to serialize/deserialize a dynamic table sink
  * and create {@link DynamicTableSink} from the deserialization result.
  */
 @JsonIgnoreProperties(ignoreUnknown = true)
 @JsonInclude(JsonInclude.Include.NON_EMPTY)
-public class DynamicTableSinkSpec extends CatalogTableSpecBase {
+public class DynamicTableSinkSpec {
 
+    public static final String FIELD_NAME_CATALOG_TABLE_SPEC = "catalogTable";
     public static final String FIELD_NAME_SINK_ABILITY_SPECS = "sinkAbilitySpecs";
 
-    @JsonIgnore private DynamicTableSink tableSink;
-
-    @JsonProperty(FIELD_NAME_SINK_ABILITY_SPECS)
+    private final ContextResolvedTable contextResolvedTable;
     private final @Nullable List<SinkAbilitySpec> sinkAbilitySpecs;
 
+    @JsonIgnore private DynamicTableSink tableSink;
+    @JsonIgnore private ClassLoader classLoader;

Review comment:
       Why do we need the class loader and configuration here? Can't we access them from other contexts? It seems wrong to me that every instance holds references to those.
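       A minimal sketch of what I have in mind (hypothetical names, not this PR's API): the spec keeps only its serializable state, and the environment is supplied through a context object at the single point where it is needed.

       ```java
       // Hypothetical context object owned by the planner, not by each spec.
       final class PlannerContext {
           private final ClassLoader classLoader;

           PlannerContext(ClassLoader classLoader) {
               this.classLoader = classLoader;
           }

           ClassLoader getClassLoader() {
               return classLoader;
           }
       }

       final class SinkSpec {
           private final String identifier;

           SinkSpec(String identifier) {
               this.identifier = identifier;
           }

           // The environment is a parameter, not a field, so the spec stays a
           // plain value object that is easy to serialize and compare.
           String instantiate(PlannerContext context) {
               return identifier
                       + " via "
                       + (context.getClassLoader() != null ? "context class loader" : "bootstrap");
           }
       }
       ```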

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/ColumnJsonSerializer.java
##########
@@ -0,0 +1,98 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.table.catalog.Column;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonGenerator;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.SerializerProvider;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ser.std.StdSerializer;
+
+import java.io.IOException;
+
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.JsonSerdeUtil.serializeOptionalField;
+
+class ColumnJsonSerializer extends StdSerializer<Column> {
+
+    public static final String COLUMN_TYPE = "type";
+    public static final String COLUMN_TYPE_PHYSICAL = "physical";
+    public static final String COLUMN_TYPE_COMPUTED = "computed";
+    public static final String COLUMN_TYPE_METADATA = "metadata";
+    public static final String NAME = "name";
+    public static final String DATA_TYPE = "dataType";
+    public static final String COMMENT = "comment";
+    public static final String EXPRESSION = "expression";
+    public static final String METADATA_KEY = "metadataKey";
+    public static final String IS_VIRTUAL = "isVirtual";
+
+    public ColumnJsonSerializer() {
+        super(Column.class);
+    }
+
+    @Override
+    public void serialize(
+            Column column, JsonGenerator jsonGenerator, SerializerProvider serializerProvider)
+            throws IOException {
+        jsonGenerator.writeStartObject();
+
+        // Common fields
+        jsonGenerator.writeStringField(NAME, column.getName());
+        serializeOptionalField(jsonGenerator, COMMENT, column.getComment(), serializerProvider);
+
+        if (column instanceof Column.PhysicalColumn) {
+            serialize((Column.PhysicalColumn) column, jsonGenerator, serializerProvider);
+        } else if (column instanceof Column.MetadataColumn) {
+            serialize((Column.MetadataColumn) column, jsonGenerator, serializerProvider);
+        } else {
+            serialize((Column.ComputedColumn) column, jsonGenerator, serializerProvider);
+        }
+
+        jsonGenerator.writeEndObject();
+    }
+
+    private void serialize(

Review comment:
       `static`

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/ColumnJsonDeserializer.java
##########
@@ -0,0 +1,117 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.table.api.ValidationException;
+import org.apache.flink.table.catalog.Column;
+import org.apache.flink.table.expressions.ResolvedExpression;
+import org.apache.flink.table.types.DataType;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonParser;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.ObjectCodec;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.StdDeserializer;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ObjectNode;
+
+import java.io.IOException;
+import java.util.Arrays;
+
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.ColumnJsonSerializer.COLUMN_TYPE;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.ColumnJsonSerializer.COLUMN_TYPE_COMPUTED;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.ColumnJsonSerializer.COLUMN_TYPE_METADATA;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.ColumnJsonSerializer.COLUMN_TYPE_PHYSICAL;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.ColumnJsonSerializer.COMMENT;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.ColumnJsonSerializer.DATA_TYPE;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.ColumnJsonSerializer.EXPRESSION;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.ColumnJsonSerializer.IS_VIRTUAL;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.ColumnJsonSerializer.METADATA_KEY;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.ColumnJsonSerializer.NAME;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.JsonSerdeUtil.deserializeOptionalField;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.JsonSerdeUtil.traverse;
+
+class ColumnJsonDeserializer extends StdDeserializer<Column> {
+
+    private static final String[] SUPPORTED_COLUMN_TYPES =
+            new String[] {COLUMN_TYPE_PHYSICAL, COLUMN_TYPE_COMPUTED, COLUMN_TYPE_METADATA};
+
+    public ColumnJsonDeserializer() {
+        super(Column.class);
+    }
+
+    @Override
+    public Column deserialize(JsonParser jsonParser, DeserializationContext ctx)
+            throws IOException {
+        ObjectNode jsonNode = jsonParser.readValueAsTree();
+        String columnName = jsonNode.required(NAME).asText();
+        String columnType = jsonNode.required(COLUMN_TYPE).asText();
+
+        Column column;
+        switch (columnType) {
+            case COLUMN_TYPE_PHYSICAL:
+                column =
+                        deserializePhysicalColumn(columnName, jsonNode, jsonParser.getCodec(), ctx);
+                break;
+            case COLUMN_TYPE_COMPUTED:
+                column =
+                        deserializeComputedColumn(columnName, jsonNode, jsonParser.getCodec(), ctx);
+                break;
+            case COLUMN_TYPE_METADATA:
+                column =
+                        deserializeMetadataColumn(columnName, jsonNode, jsonParser.getCodec(), ctx);
+                break;
+            default:
+                throw new ValidationException(
+                        String.format(
+                                "Cannot recognize column type '%s'. Allowed types: %s.",
+                                columnType, Arrays.toString(SUPPORTED_COLUMN_TYPES)));
+        }
+        return column.withComment(
+                deserializeOptionalField(
+                                jsonNode, COMMENT, String.class, jsonParser.getCodec(), ctx)
+                        .orElse(null));
+    }
+
+    public Column.PhysicalColumn deserializePhysicalColumn(

Review comment:
       `private static`

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/spec/DynamicTableSourceSpec.java
##########
@@ -136,13 +137,74 @@ public LookupTableSource getLookupTableSource(FlinkContext flinkContext) {
         }
     }
 
-    public void setTableSource(DynamicTableSource tableSource) {
-        this.tableSource = tableSource;
+    @JsonGetter(FIELD_NAME_CATALOG_TABLE_SPEC)
+    public ContextResolvedTable getContextResolvedTable() {
+        return contextResolvedTable;
     }
 
-    @JsonIgnore
+    @JsonGetter(FIELD_NAME_SOURCE_ABILITY_SPECS)
     @Nullable
     public List<SourceAbilitySpec> getSourceAbilitySpecs() {
         return sourceAbilitySpecs;
     }
+
+    @JsonIgnore
+    public ClassLoader getClassLoader() {
+        return classLoader;
+    }
+
+    @JsonIgnore
+    public ReadableConfig getReadableConfig() {
+        return configuration;
+    }
+
+    public void setTableSource(DynamicTableSource tableSource) {
+        this.tableSource = tableSource;
+    }
+
+    public void setClassLoader(ClassLoader classLoader) {

Review comment:
       we should definitely try to make specs immutable.
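       Something along these lines (a simplified sketch with made-up names, ignoring the Jackson annotations): all fields final, set once in the constructor, collections defensively copied, no setters.

       ```java
       import java.util.List;
       import java.util.Objects;

       // Hypothetical immutable spec.
       final class ImmutableSinkSpec {
           private final String identifier;
           private final List<String> sinkAbilitySpecs;

           ImmutableSinkSpec(String identifier, List<String> sinkAbilitySpecs) {
               this.identifier = Objects.requireNonNull(identifier);
               // List.copyOf returns an unmodifiable copy, so later mutations of
               // the caller's list cannot leak into the spec.
               this.sinkAbilitySpecs = List.copyOf(sinkAbilitySpecs);
           }

           String getIdentifier() {
               return identifier;
           }

           List<String> getSinkAbilitySpecs() {
               return sinkAbilitySpecs;
           }
       }
       ```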

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/ObjectIdentifierJsonDeserializer.java
##########
@@ -42,15 +38,28 @@ public ObjectIdentifierJsonDeserializer() {
 
     @Override
     public ObjectIdentifier deserialize(JsonParser jsonParser, DeserializationContext ctx)
-            throws IOException, JsonProcessingException {
-        final JsonNode identifierNode = jsonParser.readValueAsTree();
-        return deserialize(identifierNode);
+            throws IOException {
+        return deserialize(jsonParser.getValueAsString(), SerdeContext.get(ctx));
     }
 
-    public static ObjectIdentifier deserialize(JsonNode identifierNode) {
+    static ObjectIdentifier deserialize(String identifierStr, SerdeContext ctx) {

Review comment:
       This could have been a JIRA issue and PR on its own. It is better to split out separate PRs instead of having these 13K-line PRs that are hard to review.

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/ResolvedCatalogTableJsonSerializer.java
##########
@@ -0,0 +1,86 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.table.api.ValidationException;
+import org.apache.flink.table.catalog.CatalogBaseTable;
+import org.apache.flink.table.catalog.ExternalCatalogTable;
+import org.apache.flink.table.catalog.ResolvedCatalogTable;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonGenerator;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.SerializerProvider;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ser.std.StdSerializer;
+
+import java.io.IOException;
+
+/**
+ * This serializer can be configured to include or omit the options and comment by setting the
+ * attribute {@link #SERIALIZE_OPTIONS} to {@code true} or {@code false}.
+ */
+class ResolvedCatalogTableJsonSerializer extends StdSerializer<ResolvedCatalogTable> {
+    private static final long serialVersionUID = 1L;
+
+    static final String SERIALIZE_OPTIONS = "serialize_options";
+
+    public static final String RESOLVED_SCHEMA = "resolvedSchema";
+    public static final String PARTITION_KEYS = "partitionKeys";
+    public static final String OPTIONS = "options";
+    public static final String COMMENT = "comment";
+
+    public ResolvedCatalogTableJsonSerializer() {
+        super(ResolvedCatalogTable.class);
+    }
+
+    @Override
+    public void serialize(
+            ResolvedCatalogTable resolvedCatalogTable,
+            JsonGenerator jsonGenerator,
+            SerializerProvider serializerProvider)
+            throws IOException {
+        // This should never happen anyway, but we keep this assertion as a sanity check
+        assert resolvedCatalogTable.getTableKind() == CatalogBaseTable.TableKind.TABLE;
+
+        boolean serializeOptions =
+                serializerProvider.getAttribute(SERIALIZE_OPTIONS) == null
+                        || (boolean) serializerProvider.getAttribute(SERIALIZE_OPTIONS);
+
+        jsonGenerator.writeStartObject();
+
+        if (resolvedCatalogTable.getOrigin() instanceof ExternalCatalogTable) {
+            throw new ValidationException(
+                    "Cannot serialize the table as it's an external inline table. "
+                            + "This might be caused by a usage of "
+                            + "StreamTableEnvironment#fromDataStream or TableResult#collect, "
+                            + "which are not supported by the persisted plan");

Review comment:
       nit: dot at the end of exceptions

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/spec/DynamicTableSourceSpec.java
##########
@@ -81,19 +84,19 @@ private DynamicTableSource getTableSource(FlinkContext flinkContext) {
 
             tableSource =
                     FactoryUtil.createDynamicTableSource(
-                            // TODO Support creating from a catalog
                             factory,
-                            objectIdentifier,
-                            catalogTable,
+                            contextResolvedTable.getIdentifier(),
+                            contextResolvedTable.getResolvedTable(),
+                            SpecUtil.loadOptionsFromCatalogTable(

Review comment:
       We should not add too many utils. I think this method would be a good candidate for an upper class `DynamicTableSpec`?
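A minimal sketch of what such a parent class could look like. The class name `DynamicTableSpecBase` and the method signature are hypothetical; the real `SpecUtil.loadOptionsFromCatalogTable` signature is not visible in this diff, so the merge semantics (plan options overriding catalog options) are an assumption:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of a shared parent class for DynamicTableSourceSpec and
// DynamicTableSinkSpec, so option-loading helpers live in one place instead of
// a growing *Util class. Names and merge semantics are illustrative only.
public abstract class DynamicTableSpecBase {

    // Merges the options persisted in the plan on top of the options stored in
    // the catalog table; plan options take precedence on key collisions.
    public static Map<String, String> loadOptions(
            Map<String, String> catalogOptions, Map<String, String> planOptions) {
        Map<String, String> merged = new HashMap<>(catalogOptions);
        merged.putAll(planOptions);
        return merged;
    }
}
```

Both specs could then extend this class and drop the `SpecUtil` dependency.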

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/ResolvedExpressionJsonSerializer.java
##########
@@ -0,0 +1,75 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.table.api.ValidationException;
+import org.apache.flink.table.expressions.ResolvedExpression;
+import org.apache.flink.table.planner.expressions.RexNodeExpression;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonGenerator;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.SerializerProvider;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ser.std.StdSerializer;
+
+import java.io.IOException;
+
+class ResolvedExpressionJsonSerializer extends StdSerializer<ResolvedExpression> {
+
+    public static final String TYPE = "type";
+    public static final String TYPE_REX_NODE_EXPRESSION = "rexNodeExpression";
+    public static final String REX_NODE = "rexNode";
+    public static final String OUTPUT_DATA_TYPE = "outputDataType";
+    public static final String SERIALIZABLE_STRING = "serializableString";
+
+    protected ResolvedExpressionJsonSerializer() {
+        super(ResolvedExpression.class);
+    }
+
+    @Override
+    public void serialize(
+            ResolvedExpression resolvedExpression,
+            JsonGenerator jsonGenerator,
+            SerializerProvider serializerProvider)
+            throws IOException {
+        jsonGenerator.writeStartObject();
+
+        if (resolvedExpression instanceof RexNodeExpression) {
+            serialize((RexNodeExpression) resolvedExpression, jsonGenerator, serializerProvider);
+        } else {
+            throw new ValidationException(
+                    String.format(
+                            "Expression '%s' cannot be serialized. "
+                                    + "Currently, only SQL expressions can be serialized in the persisted plan.",
+                            resolvedExpression.asSummaryString()));
+        }
+
+        jsonGenerator.writeEndObject();
+    }
+
+    private void serialize(
+            RexNodeExpression expression,
+            JsonGenerator jsonGenerator,
+            SerializerProvider serializerProvider)
+            throws IOException {
+        jsonGenerator.writeStringField(TYPE, TYPE_REX_NODE_EXPRESSION);
+        serializerProvider.defaultSerializeField(REX_NODE, expression.getRexNode(), jsonGenerator);
+        serializerProvider.defaultSerializeField(

Review comment:
       Only serialize the logical type; this should be enough. It might have been a mistake to let `Expression` return `DataType`, but we should not let that mistake bubble into the persisted plan.

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/UniqueConstraintMixin.java
##########
@@ -0,0 +1,55 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.table.catalog.Constraint.ConstraintType;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.annotation.JsonCreator;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.annotation.JsonInclude;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.annotation.JsonProperty;
+
+import java.util.List;
+
+abstract class UniqueConstraintMixin {

Review comment:
       link to the class that this mixin references in the JavaDocs

##########
File path: flink-table/flink-table-planner/src/test/resources/jsonplan/testGetJsonPlan.out
##########
@@ -1,76 +1,110 @@
 {
-   "flinkVersion":"",
-   "nodes":[
-      {
-         "class":"org.apache.flink.table.planner.plan.nodes.exec.stream.StreamExecTableSourceScan",
-         "scanTableSource":{
-            "identifier":{
-               "catalogName":"default_catalog",
-               "databaseName":"default_database",
-               "tableName":"MyTable"
+  "flinkVersion": "",
+  "nodes": [
+    {
+      "class": "org.apache.flink.table.planner.plan.nodes.exec.stream.StreamExecTableSourceScan",
+      "scanTableSource": {
+        "catalogTable": {
+          "identifier": "`default_catalog`.`default_database`.`MyTable`",
+          "catalogTable": {
+            "resolvedSchema": {
+              "columns": [
+                {
+                  "name": "a",
+                  "type": "physical",
+                  "dataType": "BIGINT"
+                },
+                {
+                  "name": "b",
+                  "type": "physical",

Review comment:
       nit: the `computed` type is implicit if there is an expression, and `metadata` if there is a metadata key; the `type` field only needs to be serialized when it cannot be inferred.
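A hypothetical compact form of the schema above, omitting the inferable `type` fields (the field names `expression` and `metadataKey` are illustrative, not taken from this diff):

```json
{
  "columns": [
    { "name": "a", "dataType": "BIGINT" },
    { "name": "c", "expression": "a + 1" },
    { "name": "ts", "metadataKey": "timestamp" }
  ]
}
```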

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/ContextResolvedTableJsonDeserializer.java
##########
@@ -0,0 +1,220 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.api.java.tuple.Tuple3;
+import org.apache.flink.table.api.ValidationException;
+import org.apache.flink.table.api.config.TableConfigOptions;
+import org.apache.flink.table.api.config.TableConfigOptions.CatalogPlanRestore;
+import org.apache.flink.table.catalog.CatalogManager;
+import org.apache.flink.table.catalog.ContextResolvedTable;
+import org.apache.flink.table.catalog.ObjectIdentifier;
+import org.apache.flink.table.catalog.ResolvedCatalogTable;
+import org.apache.flink.table.catalog.ResolvedSchema;
+import org.apache.flink.table.types.DataType;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonParser;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.StdDeserializer;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ObjectNode;
+
+import java.io.IOException;
+import java.util.List;
+import java.util.Objects;
+import java.util.Optional;
+import java.util.stream.Collectors;
+
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.ContextResolvedTableJsonSerializer.FIELD_NAME_CATALOG_TABLE;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.ContextResolvedTableJsonSerializer.FIELD_NAME_IDENTIFIER;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.ResolvedCatalogTableJsonSerializer.OPTIONS;
+
+class ContextResolvedTableJsonDeserializer extends StdDeserializer<ContextResolvedTable> {
+    private static final long serialVersionUID = 1L;
+
+    public ContextResolvedTableJsonDeserializer() {
+        super(ContextResolvedTable.class);
+    }
+
+    @Override
+    public ContextResolvedTable deserialize(JsonParser jsonParser, DeserializationContext ctx)
+            throws IOException {
+        final CatalogPlanRestore planRestoreOption =
+                SerdeContext.get(ctx)
+                        .getConfiguration()
+                        .get(TableConfigOptions.PLAN_RESTORE_CATALOG_OBJECTS);
+        final CatalogManager catalogManager =
+                SerdeContext.get(ctx).getFlinkContext().getCatalogManager();
+        final ObjectNode objectNode = jsonParser.readValueAsTree();
+
+        // Deserialize the two fields, if available
+        final ObjectIdentifier identifier =
+                JsonSerdeUtil.deserializeOptionalField(
+                                objectNode,
+                                FIELD_NAME_IDENTIFIER,
+                                ObjectIdentifier.class,
+                                jsonParser.getCodec(),
+                                ctx)
+                        .orElse(null);
+        ResolvedCatalogTable resolvedCatalogTable =
+                JsonSerdeUtil.deserializeOptionalField(
+                                objectNode,
+                                FIELD_NAME_CATALOG_TABLE,
+                                ResolvedCatalogTable.class,
+                                jsonParser.getCodec(),
+                                ctx)
+                        .orElse(null);
+
+        if (identifier == null && resolvedCatalogTable == null) {
+            throw new ValidationException(
+                    String.format(
+                            "The input json is invalid because it doesn't contain '%s', nor the '%s'.",
+                            FIELD_NAME_IDENTIFIER, FIELD_NAME_CATALOG_TABLE));
+        }
+
+        if (identifier == null) {
+            if (isLookupForced(planRestoreOption)) {
+                throw missingIdentifier();
+            }
+            return ContextResolvedTable.anonymous(resolvedCatalogTable);
+        }
+
+        Optional<ContextResolvedTable> contextResolvedTableFromCatalog =
+                isLookupEnabled(planRestoreOption)
+                        ? catalogManager.getTable(identifier)
+                        : Optional.empty();
+
+        // If we have a schema from the plan and from the catalog, we need to check they match.
+        if (contextResolvedTableFromCatalog.isPresent() && resolvedCatalogTable != null) {
+            ResolvedSchema schemaFromPlan = resolvedCatalogTable.getResolvedSchema();
+            ResolvedSchema schemaFromCatalog =
+                    contextResolvedTableFromCatalog.get().getResolvedSchema();
+            if (!areResolvedSchemasEqual(schemaFromPlan, schemaFromCatalog)) {
+                throw schemaNotMatching(identifier, schemaFromPlan, schemaFromCatalog);
+            }
+        }
+
+        if (resolvedCatalogTable == null || isLookupForced(planRestoreOption)) {
+            if (!isLookupEnabled(planRestoreOption)) {
+                throw lookupDisabled(identifier);
+            }
+            // We use what is stored inside the catalog
+            return contextResolvedTableFromCatalog.orElseThrow(
+                    () -> missingTableFromCatalog(identifier));
+        }
+
+        if (contextResolvedTableFromCatalog.isPresent()) {
+            // If no config map is present, then the ContextResolvedTable was serialized with
+            // SCHEMA, so we just need to return the catalog query result
+            if (objectNode.at("/" + FIELD_NAME_CATALOG_TABLE + "/" + OPTIONS).isMissingNode()) {
+                return contextResolvedTableFromCatalog.get();
+            }
+
+            return contextResolvedTableFromCatalog
+                    .flatMap(ContextResolvedTable::getCatalog)
+                    .map(c -> ContextResolvedTable.permanent(identifier, c, resolvedCatalogTable))
+                    .orElseGet(
+                            () -> ContextResolvedTable.temporary(identifier, resolvedCatalogTable));
+        }
+
+        return ContextResolvedTable.temporary(identifier, resolvedCatalogTable);
+    }
+
+    private boolean areResolvedSchemasEqual(
+            ResolvedSchema schemaFromPlan, ResolvedSchema schemaFromCatalog) {
+        // For schema equality we check:
+        //  * Columns size and order
+        //  * For each column: name, kind (class) and type
+        //  * Check partition keys set equality
+        @SuppressWarnings("rawtypes")
+        List<Tuple3<String, Class, DataType>> columnsFromPlan =

Review comment:
       Can we avoid using `Tuple3`? It is a class from the DataStream API, and there is a reason why Java has no tuples; usually there is a better alternative. In this case, a `for` loop with an early out.
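A self-contained sketch of the suggested `for` loop with early out. The `Column` model here is a simplified stand-in for `org.apache.flink.table.catalog.Column` (comparing name, kind, and data type, as the comment block in the deserializer describes); it is not the actual Flink class:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Objects;

// Schema-equality check without Tuple3: compare column count, order, and
// per-column name/kind/type directly, returning early on the first mismatch.
public class SchemaEqualityCheck {

    // Minimal column model standing in for org.apache.flink.table.catalog.Column.
    public static final class Column {
        final String name;
        final Class<?> kind;   // e.g. the concrete Column subclass
        final String dataType; // serializable type string

        public Column(String name, Class<?> kind, String dataType) {
            this.name = name;
            this.kind = kind;
            this.dataType = dataType;
        }
    }

    public static boolean areColumnsEqual(List<Column> fromPlan, List<Column> fromCatalog) {
        if (fromPlan.size() != fromCatalog.size()) {
            return false;
        }
        for (int i = 0; i < fromPlan.size(); i++) {
            Column plan = fromPlan.get(i);
            Column catalog = fromCatalog.get(i);
            if (!Objects.equals(plan.name, catalog.name)
                    || plan.kind != catalog.kind
                    || !Objects.equals(plan.dataType, catalog.dataType)) {
                return false; // early out on the first mismatch
            }
        }
        return true;
    }

    public static void main(String[] args) {
        List<Column> a = Arrays.asList(new Column("a", String.class, "BIGINT"));
        List<Column> b = Arrays.asList(new Column("a", String.class, "BIGINT"));
        System.out.println(areColumnsEqual(a, b)); // true
    }
}
```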

##########
File path: flink-table/flink-table-planner/src/test/resources/jsonplan/testGetJsonPlan.out
##########
@@ -1,76 +1,110 @@
 {
-   "flinkVersion":"",
-   "nodes":[
-      {
-         "class":"org.apache.flink.table.planner.plan.nodes.exec.stream.StreamExecTableSourceScan",
-         "scanTableSource":{
-            "identifier":{
-               "catalogName":"default_catalog",
-               "databaseName":"default_database",
-               "tableName":"MyTable"
+  "flinkVersion": "",
+  "nodes": [
+    {
+      "class": "org.apache.flink.table.planner.plan.nodes.exec.stream.StreamExecTableSourceScan",
+      "scanTableSource": {
+        "catalogTable": {
+          "identifier": "`default_catalog`.`default_database`.`MyTable`",
+          "catalogTable": {
+            "resolvedSchema": {
+              "columns": [
+                {
+                  "name": "a",
+                  "type": "physical",
+                  "dataType": "BIGINT"
+                },
+                {
+                  "name": "b",
+                  "type": "physical",
+                  "dataType": "INT"
+                },
+                {
+                  "name": "c",
+                  "type": "physical",
+                  "dataType": {
+                    "logicalType": "VARCHAR(2147483647)",

Review comment:
       let's only serialize the logical type
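For example, column `c` above could be persisted with just the logical type string, as columns `a` and `b` already are, instead of a nested `dataType` object (a sketch of the suggested compact form):

```json
{
  "name": "c",
  "type": "physical",
  "dataType": "VARCHAR(2147483647)"
}
```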

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/ConverterDelegatingDeserializer.java
##########
@@ -0,0 +1,98 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonParser;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.BeanDescription;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationConfig;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.JsonDeserializer;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerModifier;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.DelegatingDeserializer;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.module.SimpleModule;
+
+import java.io.IOException;
+import java.util.Arrays;
+import java.util.Map;
+import java.util.function.Function;
+import java.util.stream.Collectors;
+
+/**
+ * Deserializer which delegates to the default {@link BeanDeserializer} and then executes custom
+ * code to perform a conversion to another final value.
+ *
+ * <p>Use the {@link Converter} when you want to use Jackson annotations for defining serializers
+ * and deserializers, but after the deserialization you need to perform an additional transformation
+ * step that doesn't depend on the original JSON, e.g. enrich the output value with info from {@link
+ * SerdeContext}.
+ */
+class ConverterDelegatingDeserializer<T, R> extends DelegatingDeserializer {

Review comment:
       I'm pretty sure we don't need this class. Let's have an offline chat about it.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] slinkydeveloper commented on a change in pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
slinkydeveloper commented on a change in pull request #18427:
URL: https://github.com/apache/flink/pull/18427#discussion_r793682035



##########
File path: flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/serde/DynamicTableSourceSpecSerdeTest.java
##########
@@ -238,6 +231,135 @@ public void testDynamicTableSourceSpecSerde() throws IOException {
                                                         put("p", "B");
                                                     }
                                                 }))));
-        return Arrays.asList(spec1, spec2);
+        return Stream.of(spec1, spec2);
+    }
+
+    @ParameterizedTest
+    @MethodSource("testDynamicTableSinkSpecSerde")
+    public void testDynamicTableSourceSpecSerde(DynamicTableSourceSpec spec) throws IOException {
+        TableEnvironmentImpl tableEnv =
+                (TableEnvironmentImpl) TableEnvironment.create(inStreamingMode());
+
+        CatalogManager catalogManager = tableEnv.getCatalogManager();

Review comment:
       I tried to unify some of the mocking, but there are small differences between the two. Abstracting those away would just make the tests unreadable.







[GitHub] [flink] flinkbot edited a comment on pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18427:
URL: https://github.com/apache/flink/pull/18427#issuecomment-1017644617


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812",
       "triggerID" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "triggerType" : "PUSH"
     }, {
       "hash" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29876",
       "triggerID" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29885",
       "triggerID" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "07cec81a669917e2eb1531b2c054e04110781dfe",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29892",
       "triggerID" : "07cec81a669917e2eb1531b2c054e04110781dfe",
       "triggerType" : "PUSH"
     }, {
       "hash" : "7005798448a0cf4e9571c6695569b61e7e4641af",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30162",
       "triggerID" : "7005798448a0cf4e9571c6695569b61e7e4641af",
       "triggerType" : "PUSH"
     }, {
       "hash" : "228aec25fa06571e6e00e886da22e2598ca632b5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30173",
       "triggerID" : "228aec25fa06571e6e00e886da22e2598ca632b5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4b1fa7a01306c01a597b30f45c49d220fdc59dc7",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30311",
       "triggerID" : "4b1fa7a01306c01a597b30f45c49d220fdc59dc7",
       "triggerType" : "PUSH"
     }, {
       "hash" : "dfd7ce9e194c47511c26ee1030e2db044f128a12",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30318",
       "triggerID" : "dfd7ce9e194c47511c26ee1030e2db044f128a12",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a8365736913376f03c3bb64e2a927e46d9dee10a",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30327",
       "triggerID" : "a8365736913376f03c3bb64e2a927e46d9dee10a",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * a8365736913376f03c3bb64e2a927e46d9dee10a Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30327) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18427:
URL: https://github.com/apache/flink/pull/18427#issuecomment-1017644617


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29812",
       "triggerID" : "a69e8836becd5bbdedd183376c67dae35afc2960",
       "triggerType" : "PUSH"
     }, {
       "hash" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29876",
       "triggerID" : "b83fd153ef3df414e6a7766e26ccd37f788d728e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29885",
       "triggerID" : "367e6ab469bb0c69923e73dc458b74650bef48b1",
       "triggerType" : "PUSH"
     }, {
       "hash" : "07cec81a669917e2eb1531b2c054e04110781dfe",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29892",
       "triggerID" : "07cec81a669917e2eb1531b2c054e04110781dfe",
       "triggerType" : "PUSH"
     }, {
       "hash" : "7005798448a0cf4e9571c6695569b61e7e4641af",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30162",
       "triggerID" : "7005798448a0cf4e9571c6695569b61e7e4641af",
       "triggerType" : "PUSH"
     }, {
       "hash" : "228aec25fa06571e6e00e886da22e2598ca632b5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30173",
       "triggerID" : "228aec25fa06571e6e00e886da22e2598ca632b5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "4b1fa7a01306c01a597b30f45c49d220fdc59dc7",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30311",
       "triggerID" : "4b1fa7a01306c01a597b30f45c49d220fdc59dc7",
       "triggerType" : "PUSH"
     }, {
       "hash" : "dfd7ce9e194c47511c26ee1030e2db044f128a12",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30318",
       "triggerID" : "dfd7ce9e194c47511c26ee1030e2db044f128a12",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a8365736913376f03c3bb64e2a927e46d9dee10a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30327",
       "triggerID" : "a8365736913376f03c3bb64e2a927e46d9dee10a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "82350d3f5d0dbb119914bd9fc96cba87bea89433",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30365",
       "triggerID" : "82350d3f5d0dbb119914bd9fc96cba87bea89433",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 82350d3f5d0dbb119914bd9fc96cba87bea89433 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30365) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] twalthr commented on a change in pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
twalthr commented on a change in pull request #18427:
URL: https://github.com/apache/flink/pull/18427#discussion_r793593197



##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/ContextResolvedTableJsonDeserializer.java
##########
@@ -0,0 +1,238 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.table.api.ValidationException;
+import org.apache.flink.table.api.config.TableConfigOptions.CatalogPlanCompilation;
+import org.apache.flink.table.api.config.TableConfigOptions.CatalogPlanRestore;
+import org.apache.flink.table.catalog.CatalogManager;
+import org.apache.flink.table.catalog.Column;
+import org.apache.flink.table.catalog.ContextResolvedTable;
+import org.apache.flink.table.catalog.ObjectIdentifier;
+import org.apache.flink.table.catalog.ResolvedCatalogTable;
+import org.apache.flink.table.catalog.ResolvedSchema;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonParser;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.StdDeserializer;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ObjectNode;
+
+import java.io.IOException;
+import java.util.List;
+import java.util.Objects;
+import java.util.Optional;
+
+import static org.apache.flink.table.api.config.TableConfigOptions.CatalogPlanRestore.IDENTIFIER;
+import static org.apache.flink.table.api.config.TableConfigOptions.PLAN_COMPILE_CATALOG_OBJECTS;
+import static org.apache.flink.table.api.config.TableConfigOptions.PLAN_RESTORE_CATALOG_OBJECTS;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.ContextResolvedTableJsonSerializer.FIELD_NAME_CATALOG_TABLE;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.ContextResolvedTableJsonSerializer.FIELD_NAME_IDENTIFIER;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.ResolvedCatalogTableJsonSerializer.OPTIONS;
+
+class ContextResolvedTableJsonDeserializer extends StdDeserializer<ContextResolvedTable> {
+    private static final long serialVersionUID = 1L;
+
+    public ContextResolvedTableJsonDeserializer() {
+        super(ContextResolvedTable.class);
+    }
+
+    @Override
+    public ContextResolvedTable deserialize(JsonParser jsonParser, DeserializationContext ctx)
+            throws IOException {
+        final CatalogPlanRestore planRestoreOption =
+                SerdeContext.get(ctx).getConfiguration().get(PLAN_RESTORE_CATALOG_OBJECTS);
+        final CatalogManager catalogManager =
+                SerdeContext.get(ctx).getFlinkContext().getCatalogManager();
+        final ObjectNode objectNode = jsonParser.readValueAsTree();
+
+        // Deserialize the two fields, if available
+        final ObjectIdentifier identifier =
+                JsonSerdeUtil.deserializeOptionalField(
+                                objectNode,
+                                FIELD_NAME_IDENTIFIER,
+                                ObjectIdentifier.class,
+                                jsonParser.getCodec(),
+                                ctx)
+                        .orElse(null);
+        ResolvedCatalogTable resolvedCatalogTable =
+                JsonSerdeUtil.deserializeOptionalField(
+                                objectNode,
+                                FIELD_NAME_CATALOG_TABLE,
+                                ResolvedCatalogTable.class,
+                                jsonParser.getCodec(),
+                                ctx)
+                        .orElse(null);
+
+        if (identifier == null && resolvedCatalogTable == null) {
+            throw new ValidationException(
+                    String.format(
+                            "The input JSON is invalid because it contains neither '%s' nor '%s'.",

Review comment:
       (at multiple locations)
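The deserializer above reads two independently optional fields (`identifier` and `catalogTable`) and only rejects the input when both are absent. A minimal, self-contained sketch of that lookup pattern is shown below; the names (`readOptionalField`, `resolve`, the `Map` stand-in for a JSON node) are illustrative and are not the actual Flink or Jackson API.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// Simplified sketch of the optional-field pattern used by
// ContextResolvedTableJsonDeserializer: each field may be missing, and
// only the case where BOTH are missing is an error.
public class OptionalFieldSketch {
    static final String FIELD_IDENTIFIER = "identifier";
    static final String FIELD_CATALOG_TABLE = "catalogTable";

    // Stand-in for JsonSerdeUtil.deserializeOptionalField: returns the
    // value if the key is present, otherwise Optional.empty().
    static Optional<String> readOptionalField(Map<String, String> node, String field) {
        return Optional.ofNullable(node.get(field));
    }

    static String resolve(Map<String, String> node) {
        String identifier = readOptionalField(node, FIELD_IDENTIFIER).orElse(null);
        String catalogTable = readOptionalField(node, FIELD_CATALOG_TABLE).orElse(null);
        if (identifier == null && catalogTable == null) {
            throw new IllegalArgumentException(
                    String.format(
                            "The input JSON is invalid because it contains neither '%s' nor '%s'.",
                            FIELD_IDENTIFIER, FIELD_CATALOG_TABLE));
        }
        // Prefer the identifier (catalog lookup) path when present;
        // otherwise fall back to the inline table payload.
        return identifier != null ? "lookup:" + identifier : "inline:" + catalogTable;
    }

    public static void main(String[] args) {
        Map<String, String> node = new HashMap<>();
        node.put(FIELD_IDENTIFIER, "cat.db.table");
        System.out.println(resolve(node)); // prints "lookup:cat.db.table"
    }
}
```

In the real deserializer the branch taken also depends on the `table.plan.restore.catalog-objects` option; this sketch only shows the shared "two optional fields, fail when both missing" validation.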







[GitHub] [flink] flinkbot edited a comment on pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18427:
URL: https://github.com/apache/flink/pull/18427#issuecomment-1017644617


   ## CI report:
   
   * 228aec25fa06571e6e00e886da22e2598ca632b5 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30173) 
   * 4b1fa7a01306c01a597b30f45c49d220fdc59dc7 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18427:
URL: https://github.com/apache/flink/pull/18427#issuecomment-1017644617


   ## CI report:
   
   * 228aec25fa06571e6e00e886da22e2598ca632b5 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30173) 
   * 4b1fa7a01306c01a597b30f45c49d220fdc59dc7 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30311) 
   * dfd7ce9e194c47511c26ee1030e2db044f128a12 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18427: [FLINK-25386][table] Harden table persisted plan

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18427:
URL: https://github.com/apache/flink/pull/18427#issuecomment-1017644617


   ## CI report:
   
   * dfd7ce9e194c47511c26ee1030e2db044f128a12 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30318) 
   * a8365736913376f03c3bb64e2a927e46d9dee10a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30327) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>

