Posted to issues@flink.apache.org by GitBox <gi...@apache.org> on 2020/05/19 12:41:11 UTC

[GitHub] [flink] SteNicholas opened a new pull request #12246: [FLINK-17303][python] Return TableResult for Python TableEnvironment

SteNicholas opened a new pull request #12246:
URL: https://github.com/apache/flink/pull/12246


   ## What is the purpose of the change
   
   *This pull request introduces a Python `TableResult` class for the Python `TableEnvironment`, keeping it consistent with the Java API. A `TableResult` exposes the `JobClient` (which is associated with the submitted Flink job) and can print the execution result.*
   
   ## Brief change log
   
     - *Add a `TableResult` class consistent with the Java API, including the `get_job_client`, `get_table_schema`, `get_result_kind` and `print` methods for representing the result of statement execution.*
     - *Add a `JobClient` class that is scoped to a specific job, including the `get_job_id`, `get_job_status`, `cancel`, `stop_with_savepoint`, `trigger_savepoint`, `get_accumulators` and `get_job_execution_result` methods for interacting with the submitted job and retrieving its execution result (a short usage sketch follows below).*
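
   A minimal usage sketch of the API described above, kept close to the test changes reviewed in this thread (the `ResultKind` import path and the pre-existing `t_env` environment are assumptions used only for illustration):

   ```python
   # Sketch only: demonstrates the intended TableResult/JobClient surface,
   # assuming an existing StreamTableEnvironment named `t_env`.
   from pyflink.table import ResultKind  # import path assumed

   table_result = t_env.execute_sql(
       "create table tbl(a bigint, b int, c varchar) "
       "with ('connector' = 'COLLECTION', 'is-bounded' = 'false')")

   # DDL statements return a result that is not attached to a Flink job.
   assert table_result.get_job_client() is None
   assert table_result.get_result_kind() == ResultKind.SUCCESS
   print(table_result.get_table_schema().get_field_names())  # ['result']
   table_result.print()

   # DML statements (e.g. INSERT) additionally expose a JobClient for the submitted job.
   insert_result = t_env.execute_sql("insert into sinks select * from tbl")
   job_client = insert_result.get_job_client()
   if job_client is not None:
       print(job_client.get_job_id())
       # job_client also offers get_job_status(), cancel(), stop_with_savepoint(),
       # trigger_savepoint(), get_accumulators() and get_job_execution_result().
   ```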
   
   ## Verifying this change
   
      - *Add a `test_execute_sql` method to `StreamSqlTests` to verify the `TableResult` returned from DDL and DML statement execution.*
   
   ## Does this pull request potentially affect one of the following parts:
   
     - Dependencies (does it add or upgrade a dependency): (yes / **no**)
     - The public API, i.e., is any changed class annotated with `@Public(Evolving)`: (yes / **no**)
     - The serializers: (yes / **no** / don't know)
     - The runtime per-record code paths (performance sensitive): (yes / **no** / don't know)
     - Anything that affects deployment or recovery: JobManager (and its components), Checkpointing, Kubernetes/Yarn/Mesos, ZooKeeper: (yes / **no** / don't know)
     - The S3 file system connector: (yes / **no** / don't know)
   
   ## Documentation
   
     - Does this pull request introduce a new feature? (yes / **no**)
     - If yes, how is the feature documented? (**not applicable** / docs / JavaDocs / not documented)


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] hequn8128 commented on pull request #12246: [FLINK-17303][python] Return TableResult for Python TableEnvironment

Posted by GitBox <gi...@apache.org>.
hequn8128 commented on pull request #12246:
URL: https://github.com/apache/flink/pull/12246#issuecomment-632431669


   @flinkbot run azure





[GitHub] [flink] flinkbot edited a comment on pull request #12246: [FLINK-17303][python] Return TableResult for Python TableEnvironment

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12246:
URL: https://github.com/apache/flink/pull/12246#issuecomment-630803193


   ## CI report:
   
   * 911e459fe53b61aa74ce3bc3d0761651eb7f61fb Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=1893) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #12246: [FLINK-17303][python] Return TableResult for Python TableEnvironment

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12246:
URL: https://github.com/apache/flink/pull/12246#issuecomment-630803193


   ## CI report:
   
   * 2ce447aa89adaab2738a5235b41c289626877a09 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=1850) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] hequn8128 commented on a change in pull request #12246: [FLINK-17303][python] Return TableResult for Python TableEnvironment

Posted by GitBox <gi...@apache.org>.
hequn8128 commented on a change in pull request #12246:
URL: https://github.com/apache/flink/pull/12246#discussion_r428525745



##########
File path: flink-python/pyflink/table/tests/test_sql.py
##########
@@ -58,6 +56,67 @@ def test_sql_query(self):
         expected = ['2,Hi,Hello', '3,Hello,Hello']
         self.assert_equals(actual, expected)
 
+    def test_execute_sql(self):
+        t_env = self.t_env
+        table_result = t_env.execute_sql("create table tbl"
+                                         "("
+                                         "   a bigint,"
+                                         "   b int,"
+                                         "   c varchar"
+                                         ") with ("
+                                         "  'connector' = 'COLLECTION',"
+                                         "   'is-bounded' = 'false'"
+                                         ")")
+        self.assertIsNotNone(table_result)

Review comment:
       Remove this line. If table_result is None, the following tests would fail anyway.
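
       For illustration, a sketch of how this first block of assertions might read once the redundant None checks are dropped (assuming the rest of the test stays as written):

   ```python
   # Sketch of the suggested simplification: the assertions below already fail
   # if table_result or its schema were None, so separate None checks add nothing.
   table_result = t_env.execute_sql(
       "create table tbl(a bigint, b int, c varchar) "
       "with ('connector' = 'COLLECTION', 'is-bounded' = 'false')")
   self.assertIsNone(table_result.get_job_client())
   self.assert_equals(table_result.get_table_schema().get_field_names(), ["result"])
   self.assertEqual(table_result.get_result_kind(), ResultKind.SUCCESS)
   table_result.print()
   ```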

##########
File path: flink-python/pyflink/table/tests/test_sql.py
##########
@@ -58,6 +56,67 @@ def test_sql_query(self):
         expected = ['2,Hi,Hello', '3,Hello,Hello']
         self.assert_equals(actual, expected)
 
+    def test_execute_sql(self):
+        t_env = self.t_env
+        table_result = t_env.execute_sql("create table tbl"
+                                         "("
+                                         "   a bigint,"
+                                         "   b int,"
+                                         "   c varchar"
+                                         ") with ("
+                                         "  'connector' = 'COLLECTION',"
+                                         "   'is-bounded' = 'false'"
+                                         ")")
+        self.assertIsNotNone(table_result)
+        self.assertIsNone(table_result.get_job_client())
+        self.assertIsNotNone(table_result.get_table_schema())
+        self.assert_equals(table_result.get_table_schema().get_field_names(), ["result"])
+        self.assertIsNotNone(table_result.get_result_kind())
+        self.assertEqual(table_result.get_result_kind(), ResultKind.SUCCESS)
+        table_result.print()
+
+        table_result = t_env.execute_sql("alter table tbl set ('k1' = 'a', 'k2' = 'b')")
+        self.assertIsNotNone(table_result)
+        self.assertIsNone(table_result.get_job_client())
+        self.assertIsNotNone(table_result.get_table_schema())
+        self.assert_equals(table_result.get_table_schema().get_field_names(), ["result"])
+        self.assertIsNotNone(table_result.get_result_kind())
+        self.assertEqual(table_result.get_result_kind(), ResultKind.SUCCESS)
+        table_result.print()
+
+        field_names = ["k1", "k2", "c"]
+        field_types = [DataTypes.BIGINT(), DataTypes.INT(), DataTypes.STRING()]
+        t_env.register_table_sink(
+            "sinks",
+            source_sink_utils.TestAppendSink(field_names, field_types))
+        table_result = t_env.execute_sql("insert into sinks select * from tbl")
+        self.assertIsNotNone(table_result)
+        self.assertIsNotNone(table_result.get_job_client())
+        job_status_feature = table_result.get_job_client().get_job_status()
+        job_execution_result_feature = table_result.get_job_client().get_job_execution_result(
+            get_gateway().jvm.Thread.currentThread().getContextClassLoader())
+        job_execution_result = job_execution_result_feature.result()
+        self.assertIsNotNone(job_execution_result)
+        self.assertIsNotNone(job_execution_result.get_job_id())
+        self.assertIsNotNone(job_execution_result.get_job_execution_result())
+        job_status = job_status_feature.result()
+        self.assertIsNotNone(job_status)
+        self.assertIsNotNone(table_result.get_table_schema())
+        self.assert_equals(table_result.get_table_schema().get_field_names(),
+                           ["default_catalog.default_database.sinks"])
+        self.assertIsNotNone(table_result.get_result_kind())
+        self.assertEqual(table_result.get_result_kind(), ResultKind.SUCCESS_WITH_CONTENT)
+        table_result.print()
+
+        table_result = t_env.execute_sql("drop table tbl")
+        self.assertIsNotNone(table_result)
+        self.assertIsNone(table_result.get_job_client())
+        self.assertIsNotNone(table_result.get_table_schema())

Review comment:
       Remove this line

##########
File path: flink-python/pyflink/table/tests/test_sql.py
##########
@@ -58,6 +56,67 @@ def test_sql_query(self):
         expected = ['2,Hi,Hello', '3,Hello,Hello']
         self.assert_equals(actual, expected)
 
+    def test_execute_sql(self):
+        t_env = self.t_env
+        table_result = t_env.execute_sql("create table tbl"
+                                         "("
+                                         "   a bigint,"
+                                         "   b int,"
+                                         "   c varchar"
+                                         ") with ("
+                                         "  'connector' = 'COLLECTION',"
+                                         "   'is-bounded' = 'false'"
+                                         ")")
+        self.assertIsNotNone(table_result)
+        self.assertIsNone(table_result.get_job_client())
+        self.assertIsNotNone(table_result.get_table_schema())
+        self.assert_equals(table_result.get_table_schema().get_field_names(), ["result"])
+        self.assertIsNotNone(table_result.get_result_kind())
+        self.assertEqual(table_result.get_result_kind(), ResultKind.SUCCESS)
+        table_result.print()
+
+        table_result = t_env.execute_sql("alter table tbl set ('k1' = 'a', 'k2' = 'b')")
+        self.assertIsNotNone(table_result)
+        self.assertIsNone(table_result.get_job_client())
+        self.assertIsNotNone(table_result.get_table_schema())
+        self.assert_equals(table_result.get_table_schema().get_field_names(), ["result"])
+        self.assertIsNotNone(table_result.get_result_kind())

Review comment:
       Remove this line.

##########
File path: flink-python/pyflink/table/tests/test_sql.py
##########
@@ -58,6 +56,67 @@ def test_sql_query(self):
         expected = ['2,Hi,Hello', '3,Hello,Hello']
         self.assert_equals(actual, expected)
 
+    def test_execute_sql(self):
+        t_env = self.t_env
+        table_result = t_env.execute_sql("create table tbl"
+                                         "("
+                                         "   a bigint,"
+                                         "   b int,"
+                                         "   c varchar"
+                                         ") with ("
+                                         "  'connector' = 'COLLECTION',"
+                                         "   'is-bounded' = 'false'"
+                                         ")")
+        self.assertIsNotNone(table_result)
+        self.assertIsNone(table_result.get_job_client())
+        self.assertIsNotNone(table_result.get_table_schema())
+        self.assert_equals(table_result.get_table_schema().get_field_names(), ["result"])
+        self.assertIsNotNone(table_result.get_result_kind())
+        self.assertEqual(table_result.get_result_kind(), ResultKind.SUCCESS)
+        table_result.print()
+
+        table_result = t_env.execute_sql("alter table tbl set ('k1' = 'a', 'k2' = 'b')")
+        self.assertIsNotNone(table_result)
+        self.assertIsNone(table_result.get_job_client())
+        self.assertIsNotNone(table_result.get_table_schema())
+        self.assert_equals(table_result.get_table_schema().get_field_names(), ["result"])
+        self.assertIsNotNone(table_result.get_result_kind())
+        self.assertEqual(table_result.get_result_kind(), ResultKind.SUCCESS)
+        table_result.print()
+
+        field_names = ["k1", "k2", "c"]
+        field_types = [DataTypes.BIGINT(), DataTypes.INT(), DataTypes.STRING()]
+        t_env.register_table_sink(
+            "sinks",
+            source_sink_utils.TestAppendSink(field_names, field_types))
+        table_result = t_env.execute_sql("insert into sinks select * from tbl")
+        self.assertIsNotNone(table_result)
+        self.assertIsNotNone(table_result.get_job_client())
+        job_status_feature = table_result.get_job_client().get_job_status()
+        job_execution_result_feature = table_result.get_job_client().get_job_execution_result(
+            get_gateway().jvm.Thread.currentThread().getContextClassLoader())
+        job_execution_result = job_execution_result_feature.result()
+        self.assertIsNotNone(job_execution_result)
+        self.assertIsNotNone(job_execution_result.get_job_id())
+        self.assertIsNotNone(job_execution_result.get_job_execution_result())
+        job_status = job_status_feature.result()
+        self.assertIsNotNone(job_status)
+        self.assertIsNotNone(table_result.get_table_schema())
+        self.assert_equals(table_result.get_table_schema().get_field_names(),
+                           ["default_catalog.default_database.sinks"])
+        self.assertIsNotNone(table_result.get_result_kind())
+        self.assertEqual(table_result.get_result_kind(), ResultKind.SUCCESS_WITH_CONTENT)
+        table_result.print()
+
+        table_result = t_env.execute_sql("drop table tbl")
+        self.assertIsNotNone(table_result)

Review comment:
       Remove this line

##########
File path: flink-python/pyflink/table/tests/test_sql.py
##########
@@ -58,6 +56,67 @@ def test_sql_query(self):
         expected = ['2,Hi,Hello', '3,Hello,Hello']
         self.assert_equals(actual, expected)
 
+    def test_execute_sql(self):
+        t_env = self.t_env
+        table_result = t_env.execute_sql("create table tbl"
+                                         "("
+                                         "   a bigint,"
+                                         "   b int,"
+                                         "   c varchar"
+                                         ") with ("
+                                         "  'connector' = 'COLLECTION',"
+                                         "   'is-bounded' = 'false'"
+                                         ")")
+        self.assertIsNotNone(table_result)
+        self.assertIsNone(table_result.get_job_client())
+        self.assertIsNotNone(table_result.get_table_schema())
+        self.assert_equals(table_result.get_table_schema().get_field_names(), ["result"])
+        self.assertIsNotNone(table_result.get_result_kind())
+        self.assertEqual(table_result.get_result_kind(), ResultKind.SUCCESS)
+        table_result.print()
+
+        table_result = t_env.execute_sql("alter table tbl set ('k1' = 'a', 'k2' = 'b')")
+        self.assertIsNotNone(table_result)
+        self.assertIsNone(table_result.get_job_client())
+        self.assertIsNotNone(table_result.get_table_schema())
+        self.assert_equals(table_result.get_table_schema().get_field_names(), ["result"])
+        self.assertIsNotNone(table_result.get_result_kind())
+        self.assertEqual(table_result.get_result_kind(), ResultKind.SUCCESS)
+        table_result.print()
+
+        field_names = ["k1", "k2", "c"]
+        field_types = [DataTypes.BIGINT(), DataTypes.INT(), DataTypes.STRING()]
+        t_env.register_table_sink(
+            "sinks",
+            source_sink_utils.TestAppendSink(field_names, field_types))
+        table_result = t_env.execute_sql("insert into sinks select * from tbl")
+        self.assertIsNotNone(table_result)
+        self.assertIsNotNone(table_result.get_job_client())
+        job_status_feature = table_result.get_job_client().get_job_status()
+        job_execution_result_feature = table_result.get_job_client().get_job_execution_result(
+            get_gateway().jvm.Thread.currentThread().getContextClassLoader())
+        job_execution_result = job_execution_result_feature.result()
+        self.assertIsNotNone(job_execution_result)

Review comment:
       Remove this line.

##########
File path: flink-python/pyflink/table/tests/test_sql.py
##########
@@ -58,6 +56,67 @@ def test_sql_query(self):
         expected = ['2,Hi,Hello', '3,Hello,Hello']
         self.assert_equals(actual, expected)
 
+    def test_execute_sql(self):
+        t_env = self.t_env
+        table_result = t_env.execute_sql("create table tbl"
+                                         "("
+                                         "   a bigint,"
+                                         "   b int,"
+                                         "   c varchar"
+                                         ") with ("
+                                         "  'connector' = 'COLLECTION',"
+                                         "   'is-bounded' = 'false'"
+                                         ")")
+        self.assertIsNotNone(table_result)
+        self.assertIsNone(table_result.get_job_client())
+        self.assertIsNotNone(table_result.get_table_schema())

Review comment:
       Remove this line. If table_schema is None, the test on the next line would fail anyway.

##########
File path: flink-python/pyflink/table/tests/test_sql.py
##########
@@ -58,6 +56,67 @@ def test_sql_query(self):
         expected = ['2,Hi,Hello', '3,Hello,Hello']
         self.assert_equals(actual, expected)
 
+    def test_execute_sql(self):
+        t_env = self.t_env
+        table_result = t_env.execute_sql("create table tbl"
+                                         "("
+                                         "   a bigint,"
+                                         "   b int,"
+                                         "   c varchar"
+                                         ") with ("
+                                         "  'connector' = 'COLLECTION',"
+                                         "   'is-bounded' = 'false'"
+                                         ")")
+        self.assertIsNotNone(table_result)
+        self.assertIsNone(table_result.get_job_client())
+        self.assertIsNotNone(table_result.get_table_schema())
+        self.assert_equals(table_result.get_table_schema().get_field_names(), ["result"])
+        self.assertIsNotNone(table_result.get_result_kind())

Review comment:
       Remove this line. The test on the next line already checks the value of the result kind.

##########
File path: flink-python/pyflink/table/tests/test_sql.py
##########
@@ -58,6 +56,67 @@ def test_sql_query(self):
         expected = ['2,Hi,Hello', '3,Hello,Hello']
         self.assert_equals(actual, expected)
 
+    def test_execute_sql(self):
+        t_env = self.t_env
+        table_result = t_env.execute_sql("create table tbl"
+                                         "("
+                                         "   a bigint,"
+                                         "   b int,"
+                                         "   c varchar"
+                                         ") with ("
+                                         "  'connector' = 'COLLECTION',"
+                                         "   'is-bounded' = 'false'"
+                                         ")")
+        self.assertIsNotNone(table_result)
+        self.assertIsNone(table_result.get_job_client())
+        self.assertIsNotNone(table_result.get_table_schema())
+        self.assert_equals(table_result.get_table_schema().get_field_names(), ["result"])
+        self.assertIsNotNone(table_result.get_result_kind())
+        self.assertEqual(table_result.get_result_kind(), ResultKind.SUCCESS)
+        table_result.print()
+
+        table_result = t_env.execute_sql("alter table tbl set ('k1' = 'a', 'k2' = 'b')")
+        self.assertIsNotNone(table_result)

Review comment:
       Remove this line. If table_result is None, the following tests would fail anyway.

##########
File path: flink-python/pyflink/table/tests/test_sql.py
##########
@@ -58,6 +56,67 @@ def test_sql_query(self):
         expected = ['2,Hi,Hello', '3,Hello,Hello']
         self.assert_equals(actual, expected)
 
+    def test_execute_sql(self):
+        t_env = self.t_env
+        table_result = t_env.execute_sql("create table tbl"
+                                         "("
+                                         "   a bigint,"
+                                         "   b int,"
+                                         "   c varchar"
+                                         ") with ("
+                                         "  'connector' = 'COLLECTION',"
+                                         "   'is-bounded' = 'false'"
+                                         ")")
+        self.assertIsNotNone(table_result)
+        self.assertIsNone(table_result.get_job_client())
+        self.assertIsNotNone(table_result.get_table_schema())
+        self.assert_equals(table_result.get_table_schema().get_field_names(), ["result"])
+        self.assertIsNotNone(table_result.get_result_kind())
+        self.assertEqual(table_result.get_result_kind(), ResultKind.SUCCESS)
+        table_result.print()
+
+        table_result = t_env.execute_sql("alter table tbl set ('k1' = 'a', 'k2' = 'b')")
+        self.assertIsNotNone(table_result)
+        self.assertIsNone(table_result.get_job_client())
+        self.assertIsNotNone(table_result.get_table_schema())

Review comment:
       Remove this line.

##########
File path: flink-python/pyflink/table/tests/test_sql.py
##########
@@ -58,6 +56,67 @@ def test_sql_query(self):
         expected = ['2,Hi,Hello', '3,Hello,Hello']
         self.assert_equals(actual, expected)
 
+    def test_execute_sql(self):
+        t_env = self.t_env
+        table_result = t_env.execute_sql("create table tbl"
+                                         "("
+                                         "   a bigint,"
+                                         "   b int,"
+                                         "   c varchar"
+                                         ") with ("
+                                         "  'connector' = 'COLLECTION',"
+                                         "   'is-bounded' = 'false'"
+                                         ")")
+        self.assertIsNotNone(table_result)
+        self.assertIsNone(table_result.get_job_client())
+        self.assertIsNotNone(table_result.get_table_schema())
+        self.assert_equals(table_result.get_table_schema().get_field_names(), ["result"])
+        self.assertIsNotNone(table_result.get_result_kind())
+        self.assertEqual(table_result.get_result_kind(), ResultKind.SUCCESS)
+        table_result.print()
+
+        table_result = t_env.execute_sql("alter table tbl set ('k1' = 'a', 'k2' = 'b')")
+        self.assertIsNotNone(table_result)
+        self.assertIsNone(table_result.get_job_client())
+        self.assertIsNotNone(table_result.get_table_schema())
+        self.assert_equals(table_result.get_table_schema().get_field_names(), ["result"])
+        self.assertIsNotNone(table_result.get_result_kind())
+        self.assertEqual(table_result.get_result_kind(), ResultKind.SUCCESS)
+        table_result.print()
+
+        field_names = ["k1", "k2", "c"]
+        field_types = [DataTypes.BIGINT(), DataTypes.INT(), DataTypes.STRING()]
+        t_env.register_table_sink(
+            "sinks",
+            source_sink_utils.TestAppendSink(field_names, field_types))
+        table_result = t_env.execute_sql("insert into sinks select * from tbl")
+        self.assertIsNotNone(table_result)
+        self.assertIsNotNone(table_result.get_job_client())
+        job_status_feature = table_result.get_job_client().get_job_status()
+        job_execution_result_feature = table_result.get_job_client().get_job_execution_result(
+            get_gateway().jvm.Thread.currentThread().getContextClassLoader())
+        job_execution_result = job_execution_result_feature.result()
+        self.assertIsNotNone(job_execution_result)
+        self.assertIsNotNone(job_execution_result.get_job_id())
+        self.assertIsNotNone(job_execution_result.get_job_execution_result())
+        job_status = job_status_feature.result()
+        self.assertIsNotNone(job_status)
+        self.assertIsNotNone(table_result.get_table_schema())
+        self.assert_equals(table_result.get_table_schema().get_field_names(),
+                           ["default_catalog.default_database.sinks"])
+        self.assertIsNotNone(table_result.get_result_kind())
+        self.assertEqual(table_result.get_result_kind(), ResultKind.SUCCESS_WITH_CONTENT)
+        table_result.print()
+
+        table_result = t_env.execute_sql("drop table tbl")
+        self.assertIsNotNone(table_result)
+        self.assertIsNone(table_result.get_job_client())
+        self.assertIsNotNone(table_result.get_table_schema())
+        self.assert_equals(table_result.get_table_schema().get_field_names(), ["result"])
+        self.assertIsNotNone(table_result.get_result_kind())

Review comment:
       Remove this line

##########
File path: flink-python/pyflink/table/tests/test_sql.py
##########
@@ -58,6 +56,67 @@ def test_sql_query(self):
         expected = ['2,Hi,Hello', '3,Hello,Hello']
         self.assert_equals(actual, expected)
 
+    def test_execute_sql(self):
+        t_env = self.t_env
+        table_result = t_env.execute_sql("create table tbl"
+                                         "("
+                                         "   a bigint,"
+                                         "   b int,"
+                                         "   c varchar"
+                                         ") with ("
+                                         "  'connector' = 'COLLECTION',"
+                                         "   'is-bounded' = 'false'"
+                                         ")")
+        self.assertIsNotNone(table_result)
+        self.assertIsNone(table_result.get_job_client())
+        self.assertIsNotNone(table_result.get_table_schema())
+        self.assert_equals(table_result.get_table_schema().get_field_names(), ["result"])
+        self.assertIsNotNone(table_result.get_result_kind())
+        self.assertEqual(table_result.get_result_kind(), ResultKind.SUCCESS)
+        table_result.print()
+
+        table_result = t_env.execute_sql("alter table tbl set ('k1' = 'a', 'k2' = 'b')")
+        self.assertIsNotNone(table_result)
+        self.assertIsNone(table_result.get_job_client())
+        self.assertIsNotNone(table_result.get_table_schema())
+        self.assert_equals(table_result.get_table_schema().get_field_names(), ["result"])
+        self.assertIsNotNone(table_result.get_result_kind())
+        self.assertEqual(table_result.get_result_kind(), ResultKind.SUCCESS)
+        table_result.print()
+
+        field_names = ["k1", "k2", "c"]
+        field_types = [DataTypes.BIGINT(), DataTypes.INT(), DataTypes.STRING()]
+        t_env.register_table_sink(
+            "sinks",
+            source_sink_utils.TestAppendSink(field_names, field_types))
+        table_result = t_env.execute_sql("insert into sinks select * from tbl")
+        self.assertIsNotNone(table_result)

Review comment:
       Remove this line.

##########
File path: flink-python/pyflink/table/tests/test_sql.py
##########
@@ -58,6 +56,67 @@ def test_sql_query(self):
         expected = ['2,Hi,Hello', '3,Hello,Hello']
         self.assert_equals(actual, expected)
 
+    def test_execute_sql(self):
+        t_env = self.t_env
+        table_result = t_env.execute_sql("create table tbl"
+                                         "("
+                                         "   a bigint,"
+                                         "   b int,"
+                                         "   c varchar"
+                                         ") with ("
+                                         "  'connector' = 'COLLECTION',"
+                                         "   'is-bounded' = 'false'"
+                                         ")")
+        self.assertIsNotNone(table_result)
+        self.assertIsNone(table_result.get_job_client())
+        self.assertIsNotNone(table_result.get_table_schema())
+        self.assert_equals(table_result.get_table_schema().get_field_names(), ["result"])
+        self.assertIsNotNone(table_result.get_result_kind())
+        self.assertEqual(table_result.get_result_kind(), ResultKind.SUCCESS)
+        table_result.print()
+
+        table_result = t_env.execute_sql("alter table tbl set ('k1' = 'a', 'k2' = 'b')")
+        self.assertIsNotNone(table_result)
+        self.assertIsNone(table_result.get_job_client())
+        self.assertIsNotNone(table_result.get_table_schema())
+        self.assert_equals(table_result.get_table_schema().get_field_names(), ["result"])
+        self.assertIsNotNone(table_result.get_result_kind())
+        self.assertEqual(table_result.get_result_kind(), ResultKind.SUCCESS)
+        table_result.print()
+
+        field_names = ["k1", "k2", "c"]
+        field_types = [DataTypes.BIGINT(), DataTypes.INT(), DataTypes.STRING()]
+        t_env.register_table_sink(
+            "sinks",
+            source_sink_utils.TestAppendSink(field_names, field_types))
+        table_result = t_env.execute_sql("insert into sinks select * from tbl")
+        self.assertIsNotNone(table_result)
+        self.assertIsNotNone(table_result.get_job_client())

Review comment:
       Remove this line

##########
File path: flink-python/pyflink/table/tests/test_sql.py
##########
@@ -58,6 +56,67 @@ def test_sql_query(self):
         expected = ['2,Hi,Hello', '3,Hello,Hello']
         self.assert_equals(actual, expected)
 
+    def test_execute_sql(self):
+        t_env = self.t_env
+        table_result = t_env.execute_sql("create table tbl"
+                                         "("
+                                         "   a bigint,"
+                                         "   b int,"
+                                         "   c varchar"
+                                         ") with ("
+                                         "  'connector' = 'COLLECTION',"
+                                         "   'is-bounded' = 'false'"
+                                         ")")
+        self.assertIsNotNone(table_result)
+        self.assertIsNone(table_result.get_job_client())
+        self.assertIsNotNone(table_result.get_table_schema())
+        self.assert_equals(table_result.get_table_schema().get_field_names(), ["result"])
+        self.assertIsNotNone(table_result.get_result_kind())
+        self.assertEqual(table_result.get_result_kind(), ResultKind.SUCCESS)
+        table_result.print()
+
+        table_result = t_env.execute_sql("alter table tbl set ('k1' = 'a', 'k2' = 'b')")
+        self.assertIsNotNone(table_result)
+        self.assertIsNone(table_result.get_job_client())
+        self.assertIsNotNone(table_result.get_table_schema())
+        self.assert_equals(table_result.get_table_schema().get_field_names(), ["result"])
+        self.assertIsNotNone(table_result.get_result_kind())
+        self.assertEqual(table_result.get_result_kind(), ResultKind.SUCCESS)
+        table_result.print()
+
+        field_names = ["k1", "k2", "c"]
+        field_types = [DataTypes.BIGINT(), DataTypes.INT(), DataTypes.STRING()]
+        t_env.register_table_sink(
+            "sinks",
+            source_sink_utils.TestAppendSink(field_names, field_types))
+        table_result = t_env.execute_sql("insert into sinks select * from tbl")
+        self.assertIsNotNone(table_result)
+        self.assertIsNotNone(table_result.get_job_client())
+        job_status_feature = table_result.get_job_client().get_job_status()
+        job_execution_result_feature = table_result.get_job_client().get_job_execution_result(
+            get_gateway().jvm.Thread.currentThread().getContextClassLoader())
+        job_execution_result = job_execution_result_feature.result()
+        self.assertIsNotNone(job_execution_result)
+        self.assertIsNotNone(job_execution_result.get_job_id())
+        self.assertIsNotNone(job_execution_result.get_job_execution_result())
+        job_status = job_status_feature.result()
+        self.assertIsNotNone(job_status)

Review comment:
       Remove the tests about job status. The cluster may already have exited before `get_job_status` is called, which leads to test failures.
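
       For illustration, a sketch of the INSERT part of the test without the job-status assertions, kept close to the original test code quoted above:

   ```python
   # Sketch of the suggestion: rely on the job execution result instead of
   # get_job_status(), since the mini cluster may already have shut down.
   table_result = t_env.execute_sql("insert into sinks select * from tbl")
   job_client = table_result.get_job_client()
   self.assertIsNotNone(job_client)
   execution_result = job_client.get_job_execution_result(
       get_gateway().jvm.Thread.currentThread().getContextClassLoader()).result()
   self.assertIsNotNone(execution_result.get_job_id())
   self.assert_equals(table_result.get_table_schema().get_field_names(),
                      ["default_catalog.default_database.sinks"])
   self.assertEqual(table_result.get_result_kind(), ResultKind.SUCCESS_WITH_CONTENT)
   table_result.print()
   ```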







[GitHub] [flink] flinkbot commented on pull request #12246: [FLINK-17303][python] Return TableResult for Python TableEnvironment

Posted by GitBox <gi...@apache.org>.
flinkbot commented on pull request #12246:
URL: https://github.com/apache/flink/pull/12246#issuecomment-630803193


   ## CI report:
   
   * 2ce447aa89adaab2738a5235b41c289626877a09 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #12246: [FLINK-17303][python] Return TableResult for Python TableEnvironment

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12246:
URL: https://github.com/apache/flink/pull/12246#issuecomment-630803193


   ## CI report:
   
   * be75b66a69ad7616f5277cfb6355556e7b135b75 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=1993) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] hequn8128 closed pull request #12246: [FLINK-17303][python] Return TableResult for Python TableEnvironment

Posted by GitBox <gi...@apache.org>.
hequn8128 closed pull request #12246:
URL: https://github.com/apache/flink/pull/12246


   





[GitHub] [flink] SteNicholas commented on pull request #12246: [FLINK-17303][python] Return TableResult for Python TableEnvironment

Posted by GitBox <gi...@apache.org>.
SteNicholas commented on pull request #12246:
URL: https://github.com/apache/flink/pull/12246#issuecomment-632591106


   @flinkbot run azure





[GitHub] [flink] flinkbot edited a comment on pull request #12246: [FLINK-17303][python] Return TableResult for Python TableEnvironment

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12246:
URL: https://github.com/apache/flink/pull/12246#issuecomment-630803193


   ## CI report:
   
   * 2ce447aa89adaab2738a5235b41c289626877a09 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=1850) 
   * 911e459fe53b61aa74ce3bc3d0761651eb7f61fb Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=1893) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #12246: [FLINK-17303][python] Return TableResult for Python TableEnvironment

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12246:
URL: https://github.com/apache/flink/pull/12246#issuecomment-630803193


   ## CI report:
   
   * 2ce447aa89adaab2738a5235b41c289626877a09 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=1850) 
   * 911e459fe53b61aa74ce3bc3d0761651eb7f61fb UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot commented on pull request #12246: [FLINK-17303][python] Return TableResult for Python TableEnvironment

Posted by GitBox <gi...@apache.org>.
flinkbot commented on pull request #12246:
URL: https://github.com/apache/flink/pull/12246#issuecomment-630794628


   Thanks a lot for your contribution to the Apache Flink project. I'm the @flinkbot. I help the community
   to review your pull request. We will use this comment to track the progress of the review.
   
   
   ## Automated Checks
   Last check on commit 2ce447aa89adaab2738a5235b41c289626877a09 (Tue May 19 12:48:41 UTC 2020)
   
   **Warnings:**
    * No documentation files were touched! Remember to keep the Flink docs up to date!
   
   
   <sub>Mention the bot in a comment to re-run the automated checks.</sub>
   ## Review Progress
   
   * ❓ 1. The [description] looks good.
    * ❓ 2. There is [consensus] that the contribution should go into Flink.
   * ❓ 3. Needs [attention] from.
   * ❓ 4. The change fits into the overall [architecture].
   * ❓ 5. Overall code [quality] is good.
   
   Please see the [Pull Request Review Guide](https://flink.apache.org/contributing/reviewing-prs.html) for a full explanation of the review process.<details>
     The Bot is tracking the review progress through labels. Labels are applied according to the order of the review items. For consensus, approval by a Flink committer or PMC member is required. <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot approve description` to approve one or more aspects (aspects: `description`, `consensus`, `architecture` and `quality`)
    - `@flinkbot approve all` to approve all aspects
    - `@flinkbot approve-until architecture` to approve everything until `architecture`
    - `@flinkbot attention @username1 [@username2 ..]` to require somebody's attention
    - `@flinkbot disapprove architecture` to remove an approval you gave earlier
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #12246: [FLINK-17303][python] Return TableResult for Python TableEnvironment

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12246:
URL: https://github.com/apache/flink/pull/12246#issuecomment-630803193


   ## CI report:
   
   * 911e459fe53b61aa74ce3bc3d0761651eb7f61fb Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=1893) 
   * be75b66a69ad7616f5277cfb6355556e7b135b75 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #12246: [FLINK-17303][python] Return TableResult for Python TableEnvironment

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12246:
URL: https://github.com/apache/flink/pull/12246#issuecomment-630803193


   ## CI report:
   
   * 911e459fe53b61aa74ce3bc3d0761651eb7f61fb Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=1893) 
   * be75b66a69ad7616f5277cfb6355556e7b135b75 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=1993) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #12246: [FLINK-17303][python] Return TableResult for Python TableEnvironment

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #12246:
URL: https://github.com/apache/flink/pull/12246#issuecomment-630803193


   ## CI report:
   
   * 2ce447aa89adaab2738a5235b41c289626877a09 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=1850) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>

