Posted to issues@flink.apache.org by GitBox <gi...@apache.org> on 2020/08/18 12:28:06 UTC

[GitHub] [flink] hequn8128 opened a new pull request #13193: [FLINK-18918][python][docs] Add dedicated connector documentation for Python Table API

hequn8128 opened a new pull request #13193:
URL: https://github.com/apache/flink/pull/13193


   ## What is the purpose of the change
   
   This pull request adds dedicated connector documentation for Python Table API.
   
   The reason for writing separate connector documentation for Python users is that using connectors in PyFlink differs slightly from using them in Java/Scala, e.g. how to add the connector jars to a Python program. These documents only cover the Python-specific parts of connector usage.
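   
   For context, a minimal sketch of the Python-only step mentioned above (registering the downloaded connector/format jars with a job), based on the `pipeline.jars` approach used in the new docs; the jar paths are placeholders:
   
   ```python
   from pyflink.datastream import StreamExecutionEnvironment
   from pyflink.table import EnvironmentSettings, StreamTableEnvironment
   
   env = StreamExecutionEnvironment.get_execution_environment()
   env_settings = EnvironmentSettings.Builder().use_blink_planner().build()
   table_env = StreamTableEnvironment.create(
       stream_execution_environment=env, environment_settings=env_settings)
   
   # Python-only step: make the downloaded connector and format jars available
   # to the job. Multiple jar URLs are separated by ';'; paths are placeholders.
   table_env.get_config().get_configuration().set_string(
       "pipeline.jars",
       "file:///my/jar/path/connector.jar;file:///my/jar/path/json.jar")
   ```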
   
   
   ## Brief change log
   
     - Adds dedicated connector documentation for Python Table API
   
   ## Verifying this change
   
   This change is a trivial rework / code cleanup without any test coverage.
   
   ## Does this pull request potentially affect one of the following parts:
   
     - Dependencies (does it add or upgrade a dependency): (no)
     - The public API, i.e., is any changed class annotated with `@Public(Evolving)`: (no)
     - The serializers: (no)
     - The runtime per-record code paths (performance sensitive): (no)
     - Anything that affects deployment or recovery: JobManager (and its components), Checkpointing, Kubernetes/Yarn/Mesos, ZooKeeper: (no)
     - The S3 file system connector: (no)
   
   ## Documentation
   
     - Does this pull request introduce a new feature? (no)
   
   


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #13193: [FLINK-18918][python][docs] Add dedicated connector documentation for Python Table API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13193:
URL: https://github.com/apache/flink/pull/13193#issuecomment-675460565


   ## CI report:
   
   * ec8c4b3b4e25ecbe5e1397e30bd4aef7900c38af Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5726) 
   * 6ca3029fb50b58d12bbfb87fbe79012195ba0b10 UNKNOWN
   
   Bot commands
     The @flinkbot bot supports the following commands:

    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build


----------------------------------------------------------------



[GitHub] [flink] hequn8128 commented on pull request #13193: [FLINK-18918][python][docs] Add dedicated connector documentation for Python Table API

Posted by GitBox <gi...@apache.org>.
hequn8128 commented on pull request #13193:
URL: https://github.com/apache/flink/pull/13193#issuecomment-676854960


   @sjwiesman Thank you for your nice review. All comments have been addressed. 


----------------------------------------------------------------



[GitHub] [flink] flinkbot edited a comment on pull request #13193: [FLINK-18918][python][docs] Add dedicated connector documentation for Python Table API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13193:
URL: https://github.com/apache/flink/pull/13193#issuecomment-675460565


   ## CI report:
   
   * 3218091590ebac41d0351ff99aad0b2e6dc29cb6 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5682) 
   


----------------------------------------------------------------



[GitHub] [flink] hequn8128 commented on a change in pull request #13193: [FLINK-18918][python][docs] Add dedicated connector documentation for Python Table API

Posted by GitBox <gi...@apache.org>.
hequn8128 commented on a change in pull request #13193:
URL: https://github.com/apache/flink/pull/13193#discussion_r475063433



##########
File path: docs/dev/python/user-guide/table/python_table_api_connectors.md
##########
@@ -0,0 +1,194 @@
+---
+title: "Connectors"
+nav-parent_id: python_tableapi
+nav-pos: 130
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+
+This page describes how to use connectors in PyFlink and highlights the different parts between using connectors in PyFlink vs Java/Scala. 
+
+* This will be replaced by the TOC
+{:toc}
+
+<span class="label label-info">Note</span>For general connector information and common configuration, please refer to the corresponding [Java/Scala documentation]({{ site.baseurl }}/dev/table/connectors/index.html). 
+
+## Download connector and format jars
+
+For both connectors and formats, implementations are available as jars that need to be specified as job [dependencies]({{ site.baseurl }}/dev/python/user-guide/table/dependency_management.html).

Review comment:
       Makes sense!




----------------------------------------------------------------



[GitHub] [flink] flinkbot commented on pull request #13193: [FLINK-18918][python][docs] Add dedicated connector documentation for Python Table API

Posted by GitBox <gi...@apache.org>.
flinkbot commented on pull request #13193:
URL: https://github.com/apache/flink/pull/13193#issuecomment-675460565


   ## CI report:
   
   * 3218091590ebac41d0351ff99aad0b2e6dc29cb6 UNKNOWN
   


----------------------------------------------------------------



[GitHub] [flink] flinkbot edited a comment on pull request #13193: [FLINK-18918][python][docs] Add dedicated connector documentation for Python Table API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13193:
URL: https://github.com/apache/flink/pull/13193#issuecomment-675460565


   ## CI report:
   
   * 3218091590ebac41d0351ff99aad0b2e6dc29cb6 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5682) 
   


----------------------------------------------------------------



[GitHub] [flink] hequn8128 commented on pull request #13193: [FLINK-18918][python][docs] Add dedicated connector documentation for Python Table API

Posted by GitBox <gi...@apache.org>.
hequn8128 commented on pull request #13193:
URL: https://github.com/apache/flink/pull/13193#issuecomment-678613335


   @morsapaes Hi, it's nice to also have your suggestions. I have addressed all of your comments and updated the PR.


----------------------------------------------------------------



[GitHub] [flink] flinkbot edited a comment on pull request #13193: [FLINK-18918][python][docs] Add dedicated connector documentation for Python Table API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13193:
URL: https://github.com/apache/flink/pull/13193#issuecomment-675460565


   ## CI report:
   
   * ec8c4b3b4e25ecbe5e1397e30bd4aef7900c38af Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5726) 
   


----------------------------------------------------------------



[GitHub] [flink] dianfu commented on a change in pull request #13193: [FLINK-18918][python][docs] Add dedicated connector documentation for Python Table API

Posted by GitBox <gi...@apache.org>.
dianfu commented on a change in pull request #13193:
URL: https://github.com/apache/flink/pull/13193#discussion_r477978848



##########
File path: docs/dev/python/user-guide/table/python_table_api_connectors.zh.md
##########
@@ -0,0 +1,194 @@
+---
+title: "Connectors"
+nav-parent_id: python_tableapi
+nav-pos: 130
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+
+This page describes how to use connectors in PyFlink and highlights the details to be aware of when using Flink connectors in Python programs.
+
+* This will be replaced by the TOC
+{:toc}
+
+<span class="label label-info">Note</span>For general connector information and common configuration, please refer to the corresponding [Java/Scala documentation]({{ site.baseurl }}/zh/dev/table/connectors/index.html). 
+
+## Download connector and format jars
+
+Since Flink is a Java/Scala-based project, for both connectors and formats, implementations are available as jars that need to be specified as job [dependencies]({{ site.baseurl }}/dev/python/user-guide/table/dependency_management.html).

Review comment:
       This link in the Chinese doc refers to the English doc. I suggest using the new style for links:
   [dependencies]({% link dev/python/user-guide/table/dependency_management.zh.md %})
   
   With the new style, the build will report errors if links in the Chinese docs refer to the English docs.
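   
   For contrast, a minimal sketch of the two link styles (the old-style URL is taken from the line under review; illustrative only):
   
   ```
   <!-- old style: the build does not catch a Chinese doc linking to an English page -->
   [dependencies]({{ site.baseurl }}/dev/python/user-guide/table/dependency_management.html)
   
   <!-- new style: the build reports an error if a Chinese doc links to an English page -->
   [dependencies]({% link dev/python/user-guide/table/dependency_management.zh.md %})
   ```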




----------------------------------------------------------------



[GitHub] [flink] flinkbot edited a comment on pull request #13193: [FLINK-18918][python][docs] Add dedicated connector documentation for Python Table API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13193:
URL: https://github.com/apache/flink/pull/13193#issuecomment-675460565


   ## CI report:
   
   * ec8c4b3b4e25ecbe5e1397e30bd4aef7900c38af Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5726) 
   * 6ca3029fb50b58d12bbfb87fbe79012195ba0b10 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5785) 
   


----------------------------------------------------------------



[GitHub] [flink] hequn8128 commented on pull request #13193: [FLINK-18918][python][docs] Add dedicated connector documentation for Python Table API

Posted by GitBox <gi...@apache.org>.
hequn8128 commented on pull request #13193:
URL: https://github.com/apache/flink/pull/13193#issuecomment-681942105


   @dianfu Thanks a lot for the review. I have addressed the comments and updated the PR. 


----------------------------------------------------------------



[GitHub] [flink] flinkbot edited a comment on pull request #13193: [FLINK-18918][python][docs] Add dedicated connector documentation for Python Table API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13193:
URL: https://github.com/apache/flink/pull/13193#issuecomment-675460565


   ## CI report:
   
   * 6ca3029fb50b58d12bbfb87fbe79012195ba0b10 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5785) 
   * dae6f793cca0d71b2aaaf860759a6f03506f9cc0 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5929) 
   


----------------------------------------------------------------



[GitHub] [flink] flinkbot edited a comment on pull request #13193: [FLINK-18918][python][docs] Add dedicated connector documentation for Python Table API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13193:
URL: https://github.com/apache/flink/pull/13193#issuecomment-675460565


   ## CI report:
   
   * 3218091590ebac41d0351ff99aad0b2e6dc29cb6 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5682) 
   * ec8c4b3b4e25ecbe5e1397e30bd4aef7900c38af UNKNOWN
   


----------------------------------------------------------------



[GitHub] [flink] dianfu commented on pull request #13193: [FLINK-18918][python][docs] Add dedicated connector documentation for Python Table API

Posted by GitBox <gi...@apache.org>.
dianfu commented on pull request #13193:
URL: https://github.com/apache/flink/pull/13193#issuecomment-682331400


   @hequn8128 Thanks for the update. LGTM.
   
   @sjwiesman @morsapaes could you take a further look at the latest PR? Thanks a lot!


----------------------------------------------------------------



[GitHub] [flink] dianfu commented on pull request #13193: [FLINK-18918][python][docs] Add dedicated connector documentation for Python Table API

Posted by GitBox <gi...@apache.org>.
dianfu commented on pull request #13193:
URL: https://github.com/apache/flink/pull/13193#issuecomment-682372101


   Thanks @morsapaes for the detailed review and for sharing the info.


----------------------------------------------------------------



[GitHub] [flink] flinkbot edited a comment on pull request #13193: [FLINK-18918][python][docs] Add dedicated connector documentation for Python Table API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13193:
URL: https://github.com/apache/flink/pull/13193#issuecomment-675460565


   ## CI report:
   
   * 6ca3029fb50b58d12bbfb87fbe79012195ba0b10 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5785) 
   


----------------------------------------------------------------



[GitHub] [flink] dianfu closed pull request #13193: [FLINK-18918][python][docs] Add dedicated connector documentation for Python Table API

Posted by GitBox <gi...@apache.org>.
dianfu closed pull request #13193:
URL: https://github.com/apache/flink/pull/13193


   


----------------------------------------------------------------



[GitHub] [flink] sjwiesman commented on a change in pull request #13193: [FLINK-18918][python][docs] Add dedicated connector documentation for Python Table API

Posted by GitBox <gi...@apache.org>.
sjwiesman commented on a change in pull request #13193:
URL: https://github.com/apache/flink/pull/13193#discussion_r472299531



##########
File path: docs/dev/python/user-guide/table/python_table_api_connectors.md
##########
@@ -0,0 +1,194 @@
+---
+title: "Connectors"
+nav-parent_id: python_tableapi
+nav-pos: 130
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+
+* This will be replaced by the TOC
+{:toc}
+
+This page describes how to use connectors in PyFlink. The main purpose of this page is to highlight the different parts between using connectors in PyFlink and Java/Scala. Below, we will guide you how to use connectors through an explicit example in which Kafka and Json format are used.
+
+<span class="label label-info">Note</span> For the common parts of using connectors between PyFlink and Java/Scala, you can refer to the [Java/Scala document]({{ site.baseurl }}/dev/table/connectors/index.html) for more details. 

Review comment:
       ```suggestion
   This page describes how to use connectors in PyFlink and highlights the different parts between using connectors in PyFlink vs Java/Scala. 
   
   * This will be replaced by the TOC
   {:toc}
   
   <span class="label label-info">Note</span>For general connector information and common configuration, please refer to the corresponding [Java/Scala documentation]({{ site.baseurl }}/dev/table/connectors/index.html). 
   ```

##########
File path: docs/dev/python/user-guide/table/python_table_api_connectors.md
##########
@@ -0,0 +1,194 @@
+---
+title: "Connectors"
+nav-parent_id: python_tableapi
+nav-pos: 130
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+
+* This will be replaced by the TOC
+{:toc}
+
+This page describes how to use connectors in PyFlink. The main purpose of this page is to highlight the different parts between using connectors in PyFlink and Java/Scala. Below, we will guide you how to use connectors through an explicit example in which Kafka and Json format are used.
+
+<span class="label label-info">Note</span> For the common parts of using connectors between PyFlink and Java/Scala, you can refer to the [Java/Scala document]({{ site.baseurl }}/dev/table/connectors/index.html) for more details. 
+
+## Download connector and format jars
+
+Suppose you are using Kafka connector and Json format, you need first download the [Kafka]({{ site.baseurl }}/dev/table/connectors/kafka.html) and [Json](https://repo.maven.apache.org/maven2/org/apache/flink/flink-json/) jars. Once the connector and format jars are downloaded to local, specify them with the [Dependency Management]({{ site.baseurl }}/dev/python/user-guide/table/dependency_management.html) APIs.

Review comment:
       ```suggestion
   For both connectors and formats, implementations are available as jars that need to be specified as job [dependencies]({{ site.baseurl }}/dev/python/user-guide/table/dependency_management.html).
   ```
   
   

##########
File path: docs/dev/python/user-guide/table/python_table_api_connectors.md
##########
@@ -0,0 +1,194 @@
+---
+title: "Connectors"
+nav-parent_id: python_tableapi
+nav-pos: 130
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+
+* This will be replaced by the TOC
+{:toc}
+
+This page describes how to use connectors in PyFlink. The main purpose of this page is to highlight the different parts between using connectors in PyFlink and Java/Scala. Below, we will guide you how to use connectors through an explicit example in which Kafka and Json format are used.
+
+<span class="label label-info">Note</span> For the common parts of using connectors between PyFlink and Java/Scala, you can refer to the [Java/Scala document]({{ site.baseurl }}/dev/table/connectors/index.html) for more details. 
+
+## Download connector and format jars
+
+Suppose you are using Kafka connector and Json format, you need first download the [Kafka]({{ site.baseurl }}/dev/table/connectors/kafka.html) and [Json](https://repo.maven.apache.org/maven2/org/apache/flink/flink-json/) jars. Once the connector and format jars are downloaded to local, specify them with the [Dependency Management]({{ site.baseurl }}/dev/python/user-guide/table/dependency_management.html) APIs.
+
+{% highlight python %}
+
+table_env.get_config().get_configuration().set_string("pipeline.jars", "file:///my/jar/path/connector.jar;file:///my/jar/path/json.jar")
+
+{% endhighlight %}
+
+## How to use connectors
+
+In the Table API of PyFlink, DDL is recommended to define source and sink. You can use the `execute_sql()` method on `TableEnvironment` to register source and sink with DDL. After that, you can select from the source table and insert into the sink table.
+
+{% highlight python %}
+
+source_ddl = """
+        CREATE TABLE source_table(
+            a VARCHAR,
+            b INT
+        ) WITH (
+          'connector.type' = 'kafka',
+          'connector.version' = 'universal',
+          'connector.topic' = 'source_topic',
+          'connector.properties.bootstrap.servers' = 'kafka:9092',
+          'connector.properties.group.id' = 'test_3',
+          'connector.startup-mode' = 'latest-offset',
+          'format.type' = 'json'
+        )
+        """
+
+sink_ddl = """
+        CREATE TABLE sink_table(
+            a VARCHAR
+        ) WITH (
+          'connector.type' = 'kafka',
+          'connector.version' = 'universal',
+          'connector.topic' = 'sink_topic',
+          'connector.properties.bootstrap.servers' = 'kafka:9092',
+          'format.type' = 'json'
+        )
+        """
+
+t_env.execute_sql(source_ddl)
+t_env.execute_sql(sink_ddl)
+
+t_env.sql_query("select a from source_table") \
+    .insert_into("sink_table")
+    
+{% endhighlight %}
+
+Below is a complete example of how to use the Kafka and Json format in PyFlink.
+
+{% highlight python %}
+
+from pyflink.datastream import StreamExecutionEnvironment, TimeCharacteristic
+from pyflink.table import StreamTableEnvironment, EnvironmentSettings
+
+
+def log_processing():
+    env = StreamExecutionEnvironment.get_execution_environment()
+    env_settings = EnvironmentSettings.Builder().use_blink_planner().build()
+    t_env = StreamTableEnvironment.create(stream_execution_environment=env, environment_settings=env_settings)
+    t_env.get_config().get_configuration().set_boolean("python.fn-execution.memory.managed", True)
+    # specify connector and format jars
+    t_env.get_config().get_configuration().set_string("pipeline.jars", "file:///my/jar/path/connector.jar;file:///my/jar/path/json.jar")
+    
+    source_ddl = """
+            CREATE TABLE source_table(
+                a VARCHAR,
+                b INT
+            ) WITH (
+              'connector.type' = 'kafka',
+              'connector.version' = 'universal',
+              'connector.topic' = 'source_topic',
+              'connector.properties.bootstrap.servers' = 'kafka:9092',
+              'connector.properties.group.id' = 'test_3',
+              'connector.startup-mode' = 'latest-offset',
+              'format.type' = 'json'
+            )
+            """
+
+    sink_ddl = """
+            CREATE TABLE sink_table(
+                a VARCHAR
+            ) WITH (
+              'connector.type' = 'kafka',
+              'connector.version' = 'universal',
+              'connector.topic' = 'sink_topic',
+              'connector.properties.bootstrap.servers' = 'kafka:9092',
+              'format.type' = 'json'
+            )
+            """
+
+    t_env.execute_sql(source_ddl)
+    t_env.execute_sql(sink_ddl)
+
+    t_env.sql_query("select a from source_table") \
+        .insert_into("sink_table")
+
+    t_env.execute("payment_demo")
+
+
+if __name__ == '__main__':
+    log_processing()
+{% endhighlight %}
+
+
+## Predefined Sources and Sinks
+
+A few basic data sources and sinks are built into Flink and are always available. The predefined data sources include reading from Pandas DataFrame, or ingesting data from collections. The predefined data sinks support writing to Pandas DataFrame.
+
+### from/to Pandas
+
+It supports to convert between PyFlink Table and Pandas DataFrame.

Review comment:
       ```suggestion
   PyFlink Tables support conversion to and from Pandas DataFrame.
   ```

##########
File path: docs/dev/python/user-guide/table/python_table_api_connectors.md
##########
@@ -0,0 +1,194 @@
+---
+title: "Connectors"
+nav-parent_id: python_tableapi
+nav-pos: 130
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+
+* This will be replaced by the TOC
+{:toc}
+
+This page describes how to use connectors in PyFlink. The main purpose of this page is to highlight the different parts between using connectors in PyFlink and Java/Scala. Below, we will guide you how to use connectors through an explicit example in which Kafka and Json format are used.
+
+<span class="label label-info">Note</span> For the common parts of using connectors between PyFlink and Java/Scala, you can refer to the [Java/Scala document]({{ site.baseurl }}/dev/table/connectors/index.html) for more details. 
+
+## Download connector and format jars
+
+Suppose you are using Kafka connector and Json format, you need first download the [Kafka]({{ site.baseurl }}/dev/table/connectors/kafka.html) and [Json](https://repo.maven.apache.org/maven2/org/apache/flink/flink-json/) jars. Once the connector and format jars are downloaded to local, specify them with the [Dependency Management]({{ site.baseurl }}/dev/python/user-guide/table/dependency_management.html) APIs.
+
+{% highlight python %}
+
+table_env.get_config().get_configuration().set_string("pipeline.jars", "file:///my/jar/path/connector.jar;file:///my/jar/path/json.jar")
+
+{% endhighlight %}
+
+## How to use connectors
+
+In the Table API of PyFlink, DDL is recommended to define source and sink. You can use the `execute_sql()` method on `TableEnvironment` to register source and sink with DDL. After that, you can select from the source table and insert into the sink table.
+
+{% highlight python %}
+
+source_ddl = """
+        CREATE TABLE source_table(
+            a VARCHAR,
+            b INT
+        ) WITH (
+          'connector.type' = 'kafka',
+          'connector.version' = 'universal',
+          'connector.topic' = 'source_topic',
+          'connector.properties.bootstrap.servers' = 'kafka:9092',
+          'connector.properties.group.id' = 'test_3',
+          'connector.startup-mode' = 'latest-offset',
+          'format.type' = 'json'
+        )
+        """
+
+sink_ddl = """
+        CREATE TABLE sink_table(
+            a VARCHAR
+        ) WITH (
+          'connector.type' = 'kafka',
+          'connector.version' = 'universal',
+          'connector.topic' = 'sink_topic',
+          'connector.properties.bootstrap.servers' = 'kafka:9092',
+          'format.type' = 'json'
+        )
+        """
+
+t_env.execute_sql(source_ddl)
+t_env.execute_sql(sink_ddl)
+
+t_env.sql_query("select a from source_table") \
+    .insert_into("sink_table")
+    
+{% endhighlight %}
+
+Below is a complete example of how to use the Kafka and Json format in PyFlink.
+
+{% highlight python %}
+
+from pyflink.datastream import StreamExecutionEnvironment, TimeCharacteristic
+from pyflink.table import StreamTableEnvironment, EnvironmentSettings
+
+
+def log_processing():
+    env = StreamExecutionEnvironment.get_execution_environment()
+    env_settings = EnvironmentSettings.Builder().use_blink_planner().build()
+    t_env = StreamTableEnvironment.create(stream_execution_environment=env, environment_settings=env_settings)
+    t_env.get_config().get_configuration().set_boolean("python.fn-execution.memory.managed", True)
+    # specify connector and format jars
+    t_env.get_config().get_configuration().set_string("pipeline.jars", "file:///my/jar/path/connector.jar;file:///my/jar/path/json.jar")
+    
+    source_ddl = """
+            CREATE TABLE source_table(
+                a VARCHAR,
+                b INT
+            ) WITH (
+              'connector.type' = 'kafka',
+              'connector.version' = 'universal',
+              'connector.topic' = 'source_topic',
+              'connector.properties.bootstrap.servers' = 'kafka:9092',
+              'connector.properties.group.id' = 'test_3',
+              'connector.startup-mode' = 'latest-offset',
+              'format.type' = 'json'
+            )
+            """
+
+    sink_ddl = """
+            CREATE TABLE sink_table(
+                a VARCHAR
+            ) WITH (
+              'connector.type' = 'kafka',
+              'connector.version' = 'universal',
+              'connector.topic' = 'sink_topic',
+              'connector.properties.bootstrap.servers' = 'kafka:9092',
+              'format.type' = 'json'
+            )
+            """
+
+    t_env.execute_sql(source_ddl)
+    t_env.execute_sql(sink_ddl)
+
+    t_env.sql_query("select a from source_table") \
+        .insert_into("sink_table")
+
+    t_env.execute("payment_demo")
+
+
+if __name__ == '__main__':
+    log_processing()
+{% endhighlight %}
+
+
+## Predefined Sources and Sinks
+
+A few basic data sources and sinks are built into Flink and are always available. The predefined data sources include reading from Pandas DataFrame, or ingesting data from collections. The predefined data sinks support writing to Pandas DataFrame.
+
+### from/to Pandas
+
+It supports to convert between PyFlink Table and Pandas DataFrame.
+
+{% highlight python %}
+
+import pandas as pd
+import numpy as np
+
+# Create a PyFlink Table
+pdf = pd.DataFrame(np.random.rand(1000, 2))
+table = t_env.from_pandas(pdf, ["a", "b"]).filter("a > 0.5")
+
+# Convert the PyFlink Table to a Pandas DataFrame
+pdf = table.to_pandas()
+{% endhighlight %}
+
+### from_elements()
+
+`from_elements()` is used to creates a table from a collection of elements. The elements types must be acceptable atomic types or acceptable composite types. 
+
+{% highlight python %}
+
+table_env.from_elements([(1, 'Hi'), (2, 'Hello')])
+
+# use the second parameter to specify custom field names
+table_env.from_elements([(1, 'Hi'), (2, 'Hello')], ['a', 'b'])
+
+# use the second parameter to specify custom table schema
+table_env.from_elements([(1, 'Hi'), (2, 'Hello')],
+                        DataTypes.ROW([DataTypes.FIELD("a", DataTypes.INT()),
+                                       DataTypes.FIELD("b", DataTypes.STRING())]))
+{% endhighlight %}
+
+The above query returns a Table like:
+
+{% highlight python %}
++----+-------+
+| a  |   b   |
++====+=======+
+| 1  |  Hi   |
++----+-------+
+| 2  | Hello |
++----+-------+
+{% endhighlight %}
+
+## User-defined sources & sinks
+
+In some cases, you may want to defined your own sources and sinks. Currently, Python sources and sinks are not supported. However, you can write Java/Scala TableFactory and use your own sources and sinks in DDL. More details can be found in the [Java/Scala document]({{ site.baseurl }}/dev/table/sourceSinks.html).

Review comment:
       ```suggestion
   In some cases, you may want to define custom sources and sinks. Currently, sources and sinks must be implemented in Java/Scala but you can define a TableFactory to support their use via DDL. More details can be found in the [Java/Scala document]({{ site.baseurl }}/dev/table/sourceSinks.html).
   ```

##########
File path: docs/dev/python/user-guide/table/python_table_api_connectors.md
##########
@@ -0,0 +1,194 @@
+---
+title: "Connectors"
+nav-parent_id: python_tableapi
+nav-pos: 130
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+
+* This will be replaced by the TOC
+{:toc}
+
+This page describes how to use connectors in PyFlink. The main purpose of this page is to highlight the different parts between using connectors in PyFlink and Java/Scala. Below, we will guide you how to use connectors through an explicit example in which Kafka and Json format are used.
+
+<span class="label label-info">Note</span> For the common parts of using connectors between PyFlink and Java/Scala, you can refer to the [Java/Scala document]({{ site.baseurl }}/dev/table/connectors/index.html) for more details. 
+
+## Download connector and format jars
+
+Suppose you are using Kafka connector and Json format, you need first download the [Kafka]({{ site.baseurl }}/dev/table/connectors/kafka.html) and [Json](https://repo.maven.apache.org/maven2/org/apache/flink/flink-json/) jars. Once the connector and format jars are downloaded to local, specify them with the [Dependency Management]({{ site.baseurl }}/dev/python/user-guide/table/dependency_management.html) APIs.
+
+{% highlight python %}
+
+table_env.get_config().get_configuration().set_string("pipeline.jars", "file:///my/jar/path/connector.jar;file:///my/jar/path/json.jar")
+
+{% endhighlight %}
+
+## How to use connectors
+
+In the Table API of PyFlink, DDL is recommended to define source and sink. You can use the `execute_sql()` method on `TableEnvironment` to register source and sink with DDL. After that, you can select from the source table and insert into the sink table.

Review comment:
       ```suggestion
   In PyFlink's Table API, DDL is the recommended way to define sources and sinks, executed via the `execute_sql()` method on the `TableEnvironment`. This makes the table available for use by the application.
   ```




----------------------------------------------------------------



[GitHub] [flink] morsapaes commented on pull request #13193: [FLINK-18918][python][docs] Add dedicated connector documentation for Python Table API

Posted by GitBox <gi...@apache.org>.
morsapaes commented on pull request #13193:
URL: https://github.com/apache/flink/pull/13193#issuecomment-682366908


   Looks good, thanks for the changes @hequn8128 and for the additional pair of eyes, @dianfu! Seth is out this week, so it's likely he'll only have a chance to check these PRs from Monday.


----------------------------------------------------------------



[GitHub] [flink] flinkbot edited a comment on pull request #13193: [FLINK-18918][python][docs] Add dedicated connector documentation for Python Table API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13193:
URL: https://github.com/apache/flink/pull/13193#issuecomment-675460565


   ## CI report:
   
   * 3218091590ebac41d0351ff99aad0b2e6dc29cb6 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5682) 
   * ec8c4b3b4e25ecbe5e1397e30bd4aef7900c38af Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5726) 
   


----------------------------------------------------------------



[GitHub] [flink] sjwiesman commented on a change in pull request #13193: [FLINK-18918][python][docs] Add dedicated connector documentation for Python Table API

Posted by GitBox <gi...@apache.org>.
sjwiesman commented on a change in pull request #13193:
URL: https://github.com/apache/flink/pull/13193#discussion_r473290415



##########
File path: docs/dev/python/user-guide/table/python_table_api_connectors.md
##########
@@ -0,0 +1,194 @@
+---
+title: "Connectors"
+nav-parent_id: python_tableapi
+nav-pos: 130
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+
+* This will be replaced by the TOC
+{:toc}
+
+This page describes how to use connectors in PyFlink. The main purpose of this page is to highlight the different parts between using connectors in PyFlink and Java/Scala. Below, we will guide you how to use connectors through an explicit example in which Kafka and Json format are used.
+
+<span class="label label-info">Note</span> For the common parts of using connectors between PyFlink and Java/Scala, you can refer to the [Java/Scala document]({{ site.baseurl }}/dev/table/connectors/index.html) for more details. 
+
+## Download connector and format jars
+
+Suppose you are using Kafka connector and Json format, you need first download the [Kafka]({{ site.baseurl }}/dev/table/connectors/kafka.html) and [Json](https://repo.maven.apache.org/maven2/org/apache/flink/flink-json/) jars. Once the connector and format jars are downloaded to local, specify them with the [Dependency Management]({{ site.baseurl }}/dev/python/user-guide/table/dependency_management.html) APIs.
+
+{% highlight python %}
+
+table_env.get_config().get_configuration().set_string("pipeline.jars", "file:///my/jar/path/connector.jar;file:///my/jar/path/json.jar")
+
+{% endhighlight %}
+
+## How to use connectors
+
+In the Table API of PyFlink, DDL is recommended to define source and sink. You can use the `execute_sql()` method on `TableEnvironment` to register source and sink with DDL. After that, you can select from the source table and insert into the sink table.

Review comment:
       Yes, that's better




----------------------------------------------------------------



[GitHub] [flink] flinkbot edited a comment on pull request #13193: [FLINK-18918][python][docs] Add dedicated connector documentation for Python Table API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13193:
URL: https://github.com/apache/flink/pull/13193#issuecomment-675460565


   ## CI report:
   
   * 6ca3029fb50b58d12bbfb87fbe79012195ba0b10 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5785) 
   * dae6f793cca0d71b2aaaf860759a6f03506f9cc0 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] dianfu commented on a change in pull request #13193: [FLINK-18918][python][docs] Add dedicated connector documentation for Python Table API

Posted by GitBox <gi...@apache.org>.
dianfu commented on a change in pull request #13193:
URL: https://github.com/apache/flink/pull/13193#discussion_r477948856



##########
File path: docs/dev/python/user-guide/table/python_table_api_connectors.md
##########
@@ -0,0 +1,194 @@
+---
+title: "Connectors"
+nav-parent_id: python_tableapi
+nav-pos: 130
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+
+This page describes how to use connectors in PyFlink and highlights the details to be aware of when using Flink connectors in Python programs.
+
+* This will be replaced by the TOC
+{:toc}
+
+<span class="label label-info">Note</span>For general connector information and common configuration, please refer to the corresponding [Java/Scala documentation]({{ site.baseurl }}/dev/table/connectors/index.html). 

Review comment:
       ```suggestion
   <span class="label label-info">Note</span> For general connector information and common configuration, please refer to the corresponding [Java/Scala documentation]({{ site.baseurl }}/dev/table/connectors/index.html). 
   ```

##########
File path: docs/dev/python/user-guide/table/python_table_api_connectors.md
##########
@@ -0,0 +1,194 @@
+---
+title: "Connectors"
+nav-parent_id: python_tableapi
+nav-pos: 130
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+
+This page describes how to use connectors in PyFlink and highlights the details to be aware of when using Flink connectors in Python programs.
+
+* This will be replaced by the TOC
+{:toc}
+
+<span class="label label-info">Note</span>For general connector information and common configuration, please refer to the corresponding [Java/Scala documentation]({{ site.baseurl }}/dev/table/connectors/index.html). 
+
+## Download connector and format jars
+
+Since Flink is a Java/Scala-based project, for both connectors and formats, implementations are available as jars that need to be specified as job [dependencies]({{ site.baseurl }}/dev/python/user-guide/table/dependency_management.html).
+
+{% highlight python %}
+
+table_env.get_config().get_configuration().set_string("pipeline.jars", "file:///my/jar/path/connector.jar;file:///my/jar/path/json.jar")
+
+{% endhighlight %}
+
+## How to use connectors
+
+In PyFlink's Table API, DDL is the recommended way to define sources and sinks, executed via the `execute_sql()` method on the `TableEnvironment`. This makes the table available for use by the application.
+
+{% highlight python %}
+
+source_ddl = """
+        CREATE TABLE source_table(
+            a VARCHAR,
+            b INT
+        ) WITH (
+          'connector.type' = 'kafka',

Review comment:
       [FLIP-122](https://cwiki.apache.org/confluence/display/FLINK/FLIP-122%3A+New+Connector+Property+Keys+for+New+Factory) has introduced a series of new properties for the connectors. For example, for the Kafka connector, 'connector.type' becomes 'connector'. I suggest using the new properties proposed in FLIP-122. What do you think?
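
   For illustration, a sketch of the source DDL above rewritten with the FLIP-122 property keys (option names as proposed in FLIP-122; the topic and server values are the placeholders from this example):

   ```python
   source_ddl = """
           CREATE TABLE source_table(
               a VARCHAR,
               b INT
           ) WITH (
             'connector' = 'kafka',
             'topic' = 'source_topic',
             'properties.bootstrap.servers' = 'kafka:9092',
             'properties.group.id' = 'test_3',
             'scan.startup.mode' = 'latest-offset',
             'format' = 'json'
           )
           """
   ```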

##########
File path: docs/dev/python/user-guide/table/python_table_api_connectors.zh.md
##########
@@ -0,0 +1,194 @@
+---
+title: "Connectors"
+nav-parent_id: python_tableapi
+nav-pos: 130
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+
+This page describes how to use connectors in PyFlink and highlights the details to be aware of when using Flink connectors in Python programs.
+
+* This will be replaced by the TOC
+{:toc}
+
+<span class="label label-info">Note</span>For general connector information and common configuration, please refer to the corresponding [Java/Scala documentation]({{ site.baseurl }}/zh/dev/table/connectors/index.html). 
+
+## Download connector and format jars
+
+Since Flink is a Java/Scala-based project, for both connectors and formats, implementations are available as jars that need to be specified as job [dependencies]({{ site.baseurl }}/dev/python/user-guide/table/dependency_management.html).

Review comment:
       It refers to the English doc from the Chinese doc. I suggest using the new style for links: 
   [dependencies]({% link dev/python/user-guide/table/dependency_management.zh.md %})
   
   With the new style, the build will report errors if links in the Chinese doc refer to the English doc.
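
   For comparison, a minimal sketch of the two styles (paths taken from the lines above; the old form is a plain string, the new form is validated during the build):

   ```
   <!-- old style: resolved against site.baseurl, not validated -->
   [dependencies]({{ site.baseurl }}/dev/python/user-guide/table/dependency_management.html)

   <!-- new style: validated at build time; the Chinese doc must link the .zh.md page -->
   [dependencies]({% link dev/python/user-guide/table/dependency_management.zh.md %})
   ```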







[GitHub] [flink] dianfu commented on pull request #13193: [FLINK-18918][python][docs] Add dedicated connector documentation for Python Table API

Posted by GitBox <gi...@apache.org>.
dianfu commented on pull request #13193:
URL: https://github.com/apache/flink/pull/13193#issuecomment-694591362


   @sjwiesman @morsapaes Thanks a lot for the review and thanks @hequn8128 for the PR. Merging...





[GitHub] [flink] morsapaes commented on a change in pull request #13193: [FLINK-18918][python][docs] Add dedicated connector documentation for Python Table API

Posted by GitBox <gi...@apache.org>.
morsapaes commented on a change in pull request #13193:
URL: https://github.com/apache/flink/pull/13193#discussion_r474666933



##########
File path: docs/dev/python/user-guide/table/python_table_api_connectors.md
##########
@@ -0,0 +1,194 @@
+---
+title: "Connectors"
+nav-parent_id: python_tableapi
+nav-pos: 130
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+
+This page describes how to use connectors in PyFlink and highlights the different parts between using connectors in PyFlink vs Java/Scala. 
+
+* This will be replaced by the TOC
+{:toc}
+
+<span class="label label-info">Note</span>For general connector information and common configuration, please refer to the corresponding [Java/Scala documentation]({{ site.baseurl }}/dev/table/connectors/index.html). 
+
+## Download connector and format jars
+
+For both connectors and formats, implementations are available as jars that need to be specified as job [dependencies]({{ site.baseurl }}/dev/python/user-guide/table/dependency_management.html).

Review comment:
       Here, it would be nice to add a short sentence explaining that Flink is a Java/Scala-based project, just for the sake of completeness and so users who are not familiar with Flink understand why they have to deal with JARs in a Python program.

##########
File path: docs/dev/python/user-guide/table/python_table_api_connectors.md
##########
@@ -0,0 +1,194 @@
+---
+title: "Connectors"
+nav-parent_id: python_tableapi
+nav-pos: 130
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+
+This page describes how to use connectors in PyFlink and highlights the different parts between using connectors in PyFlink vs Java/Scala. 
+
+* This will be replaced by the TOC
+{:toc}
+
+<span class="label label-info">Note</span>For general connector information and common configuration, please refer to the corresponding [Java/Scala documentation]({{ site.baseurl }}/dev/table/connectors/index.html). 
+
+## Download connector and format jars
+
+For both connectors and formats, implementations are available as jars that need to be specified as job [dependencies]({{ site.baseurl }}/dev/python/user-guide/table/dependency_management.html).
+
+{% highlight python %}
+
+table_env.get_config().get_configuration().set_string("pipeline.jars", "file:///my/jar/path/connector.jar;file:///my/jar/path/json.jar")
+
+{% endhighlight %}
+
+## How to use connectors
+
+In PyFlink's Table API, DDL is the recommended way to define sources and sinks, executed via the `execute_sql()` method on the `TableEnvironment`. This makes the table available for use by the application.
+
+{% highlight python %}
+
+source_ddl = """
+        CREATE TABLE source_table(
+            a VARCHAR,
+            b INT
+        ) WITH (
+          'connector.type' = 'kafka',
+          'connector.version' = 'universal',
+          'connector.topic' = 'source_topic',
+          'connector.properties.bootstrap.servers' = 'kafka:9092',
+          'connector.properties.group.id' = 'test_3',
+          'connector.startup-mode' = 'latest-offset',
+          'format.type' = 'json'
+        )
+        """
+
+sink_ddl = """
+        CREATE TABLE sink_table(
+            a VARCHAR
+        ) WITH (
+          'connector.type' = 'kafka',
+          'connector.version' = 'universal',
+          'connector.topic' = 'sink_topic',
+          'connector.properties.bootstrap.servers' = 'kafka:9092',
+          'format.type' = 'json'
+        )
+        """
+
+t_env.execute_sql(source_ddl)
+t_env.execute_sql(sink_ddl)
+
+t_env.sql_query("select a from source_table") \

Review comment:
       ```suggestion
   t_env.sql_query("SELECT a FROM source_table") \
   ```

##########
File path: docs/dev/python/user-guide/table/python_table_api_connectors.md
##########
@@ -0,0 +1,194 @@
+---
+title: "Connectors"
+nav-parent_id: python_tableapi
+nav-pos: 130
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+
+This page describes how to use connectors in PyFlink and highlights the different parts between using connectors in PyFlink vs Java/Scala. 
+
+* This will be replaced by the TOC
+{:toc}
+
+<span class="label label-info">Note</span>For general connector information and common configuration, please refer to the corresponding [Java/Scala documentation]({{ site.baseurl }}/dev/table/connectors/index.html). 
+
+## Download connector and format jars
+
+For both connectors and formats, implementations are available as jars that need to be specified as job [dependencies]({{ site.baseurl }}/dev/python/user-guide/table/dependency_management.html).
+
+{% highlight python %}
+
+table_env.get_config().get_configuration().set_string("pipeline.jars", "file:///my/jar/path/connector.jar;file:///my/jar/path/json.jar")
+
+{% endhighlight %}
+
+## How to use connectors
+
+In PyFlink's Table API, DDL is the recommended way to define sources and sinks, executed via the `execute_sql()` method on the `TableEnvironment`. This makes the table available for use by the application.
+
+{% highlight python %}
+
+source_ddl = """
+        CREATE TABLE source_table(
+            a VARCHAR,
+            b INT
+        ) WITH (
+          'connector.type' = 'kafka',
+          'connector.version' = 'universal',
+          'connector.topic' = 'source_topic',
+          'connector.properties.bootstrap.servers' = 'kafka:9092',
+          'connector.properties.group.id' = 'test_3',
+          'connector.startup-mode' = 'latest-offset',
+          'format.type' = 'json'
+        )
+        """
+
+sink_ddl = """
+        CREATE TABLE sink_table(
+            a VARCHAR
+        ) WITH (
+          'connector.type' = 'kafka',
+          'connector.version' = 'universal',
+          'connector.topic' = 'sink_topic',
+          'connector.properties.bootstrap.servers' = 'kafka:9092',
+          'format.type' = 'json'
+        )
+        """
+
+t_env.execute_sql(source_ddl)
+t_env.execute_sql(sink_ddl)
+
+t_env.sql_query("select a from source_table") \
+    .insert_into("sink_table")
+    
+{% endhighlight %}
+
+Below is a complete example of how to use the Kafka and Json format in PyFlink.
+
+{% highlight python %}
+
+from pyflink.datastream import StreamExecutionEnvironment, TimeCharacteristic
+from pyflink.table import StreamTableEnvironment, EnvironmentSettings
+
+
+def log_processing():
+    env = StreamExecutionEnvironment.get_execution_environment()
+    env_settings = EnvironmentSettings.Builder().use_blink_planner().build()
+    t_env = StreamTableEnvironment.create(stream_execution_environment=env, environment_settings=env_settings)
+    t_env.get_config().get_configuration().set_boolean("python.fn-execution.memory.managed", True)
+    # specify connector and format jars
+    t_env.get_config().get_configuration().set_string("pipeline.jars", "file:///my/jar/path/connector.jar;file:///my/jar/path/json.jar")
+    
+    source_ddl = """
+            CREATE TABLE source_table(
+                a VARCHAR,
+                b INT
+            ) WITH (
+              'connector.type' = 'kafka',
+              'connector.version' = 'universal',
+              'connector.topic' = 'source_topic',
+              'connector.properties.bootstrap.servers' = 'kafka:9092',
+              'connector.properties.group.id' = 'test_3',
+              'connector.startup-mode' = 'latest-offset',
+              'format.type' = 'json'
+            )
+            """
+
+    sink_ddl = """
+            CREATE TABLE sink_table(
+                a VARCHAR
+            ) WITH (
+              'connector.type' = 'kafka',
+              'connector.version' = 'universal',
+              'connector.topic' = 'sink_topic',
+              'connector.properties.bootstrap.servers' = 'kafka:9092',
+              'format.type' = 'json'
+            )
+            """
+
+    t_env.execute_sql(source_ddl)
+    t_env.execute_sql(sink_ddl)
+
+    t_env.sql_query("select a from source_table") \
+        .insert_into("sink_table")
+
+    t_env.execute("payment_demo")
+
+
+if __name__ == '__main__':
+    log_processing()
+{% endhighlight %}
+
+
+## Predefined Sources and Sinks
+
+A few basic data sources and sinks are built into Flink and are always available. The predefined data sources include reading from Pandas DataFrame, or ingesting data from collections. The predefined data sinks support writing to Pandas DataFrame.
+
+### from/to Pandas
+
+PyFlink Tables support conversion to and from Pandas DataFrame.
+
+{% highlight python %}
+
+import pandas as pd
+import numpy as np
+
+# Create a PyFlink Table
+pdf = pd.DataFrame(np.random.rand(1000, 2))
+table = t_env.from_pandas(pdf, ["a", "b"]).filter("a > 0.5")
+
+# Convert the PyFlink Table to a Pandas DataFrame
+pdf = table.to_pandas()
+{% endhighlight %}
+
+### from_elements()
+
+`from_elements()` is used to creates a table from a collection of elements. The elements types must be acceptable atomic types or acceptable composite types. 
+
+{% highlight python %}
+
+table_env.from_elements([(1, 'Hi'), (2, 'Hello')])
+
+# use the second parameter to specify custom field names
+table_env.from_elements([(1, 'Hi'), (2, 'Hello')], ['a', 'b'])
+
+# use the second parameter to specify custom table schema
+table_env.from_elements([(1, 'Hi'), (2, 'Hello')],
+                        DataTypes.ROW([DataTypes.FIELD("a", DataTypes.INT()),
+                                       DataTypes.FIELD("b", DataTypes.STRING())]))
+{% endhighlight %}
+
+The above query returns a Table like:
+
+{% highlight python %}
++----+-------+
+| a  |   b   |
++====+=======+
+| 1  |  Hi   |
++----+-------+
+| 2  | Hello |
++----+-------+
+{% endhighlight %}
+
+## User-defined sources & sinks
+
+In some cases, you may want to define custom sources and sinks. Currently, sources and sinks must be implemented in Java/Scala but you can define a TableFactory to support their use via DDL. More details can be found in the [Java/Scala document]({{ site.baseurl }}/dev/table/sourceSinks.html).

Review comment:
       ```suggestion
   In some cases, you may want to define custom sources and sinks. Currently, sources and sinks must be implemented in Java/Scala, but you can define a `TableFactory` to support their use via DDL. More details can be found in the [Java/Scala documentation]({{ site.baseurl }}/dev/table/sourceSinks.html).
   ```
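
   As a rough sketch of what this looks like from the Python side (the connector identifier `my-custom-connector` and the jar path are hypothetical; the `TableFactory` itself would be implemented and registered in Java/Scala):

   ```python
   # the jar containing the custom source/sink and its TableFactory must be
   # on the pipeline classpath, just like any built-in connector jar
   t_env.get_config().get_configuration().set_string(
       "pipeline.jars", "file:///my/jar/path/my-custom-connector.jar")

   # once the factory is discoverable, the custom connector is used via DDL
   # exactly like a built-in one ('my-custom-connector' is a made-up identifier)
   t_env.execute_sql("""
       CREATE TABLE custom_table(
           a VARCHAR
       ) WITH (
         'connector.type' = 'my-custom-connector'
       )
   """)
   ```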

##########
File path: docs/dev/python/user-guide/table/python_table_api_connectors.md
##########
@@ -0,0 +1,194 @@
+---
+title: "Connectors"
+nav-parent_id: python_tableapi
+nav-pos: 130
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+
+This page describes how to use connectors in PyFlink and highlights the different parts between using connectors in PyFlink vs Java/Scala. 
+
+* This will be replaced by the TOC
+{:toc}
+
+<span class="label label-info">Note</span>For general connector information and common configuration, please refer to the corresponding [Java/Scala documentation]({{ site.baseurl }}/dev/table/connectors/index.html). 
+
+## Download connector and format jars
+
+For both connectors and formats, implementations are available as jars that need to be specified as job [dependencies]({{ site.baseurl }}/dev/python/user-guide/table/dependency_management.html).
+
+{% highlight python %}
+
+table_env.get_config().get_configuration().set_string("pipeline.jars", "file:///my/jar/path/connector.jar;file:///my/jar/path/json.jar")
+
+{% endhighlight %}
+
+## How to use connectors
+
+In PyFlink's Table API, DDL is the recommended way to define sources and sinks, executed via the `execute_sql()` method on the `TableEnvironment`. This makes the table available for use by the application.
+
+{% highlight python %}
+
+source_ddl = """
+        CREATE TABLE source_table(
+            a VARCHAR,
+            b INT
+        ) WITH (
+          'connector.type' = 'kafka',
+          'connector.version' = 'universal',
+          'connector.topic' = 'source_topic',
+          'connector.properties.bootstrap.servers' = 'kafka:9092',
+          'connector.properties.group.id' = 'test_3',
+          'connector.startup-mode' = 'latest-offset',
+          'format.type' = 'json'
+        )
+        """
+
+sink_ddl = """
+        CREATE TABLE sink_table(
+            a VARCHAR
+        ) WITH (
+          'connector.type' = 'kafka',
+          'connector.version' = 'universal',
+          'connector.topic' = 'sink_topic',
+          'connector.properties.bootstrap.servers' = 'kafka:9092',
+          'format.type' = 'json'
+        )
+        """
+
+t_env.execute_sql(source_ddl)
+t_env.execute_sql(sink_ddl)
+
+t_env.sql_query("select a from source_table") \
+    .insert_into("sink_table")
+    
+{% endhighlight %}
+
+Below is a complete example of how to use the Kafka and Json format in PyFlink.
+
+{% highlight python %}
+
+from pyflink.datastream import StreamExecutionEnvironment, TimeCharacteristic
+from pyflink.table import StreamTableEnvironment, EnvironmentSettings
+
+
+def log_processing():
+    env = StreamExecutionEnvironment.get_execution_environment()
+    env_settings = EnvironmentSettings.Builder().use_blink_planner().build()
+    t_env = StreamTableEnvironment.create(stream_execution_environment=env, environment_settings=env_settings)
+    t_env.get_config().get_configuration().set_boolean("python.fn-execution.memory.managed", True)
+    # specify connector and format jars
+    t_env.get_config().get_configuration().set_string("pipeline.jars", "file:///my/jar/path/connector.jar;file:///my/jar/path/json.jar")
+    
+    source_ddl = """
+            CREATE TABLE source_table(
+                a VARCHAR,
+                b INT
+            ) WITH (
+              'connector.type' = 'kafka',
+              'connector.version' = 'universal',
+              'connector.topic' = 'source_topic',
+              'connector.properties.bootstrap.servers' = 'kafka:9092',
+              'connector.properties.group.id' = 'test_3',
+              'connector.startup-mode' = 'latest-offset',
+              'format.type' = 'json'
+            )
+            """
+
+    sink_ddl = """
+            CREATE TABLE sink_table(
+                a VARCHAR
+            ) WITH (
+              'connector.type' = 'kafka',
+              'connector.version' = 'universal',
+              'connector.topic' = 'sink_topic',
+              'connector.properties.bootstrap.servers' = 'kafka:9092',
+              'format.type' = 'json'
+            )
+            """
+
+    t_env.execute_sql(source_ddl)
+    t_env.execute_sql(sink_ddl)
+
+    t_env.sql_query("select a from source_table") \
+        .insert_into("sink_table")
+
+    t_env.execute("payment_demo")
+
+
+if __name__ == '__main__':
+    log_processing()
+{% endhighlight %}
+
+
+## Predefined Sources and Sinks
+
+A few basic data sources and sinks are built into Flink and are always available. The predefined data sources include reading from Pandas DataFrame, or ingesting data from collections. The predefined data sinks support writing to Pandas DataFrame.

Review comment:
       ```suggestion
   Some data sources and sinks are built into Flink and are available out-of-the-box. These predefined data sources include reading from Pandas DataFrame, or ingesting data from collections. The predefined data sinks support writing to Pandas DataFrame.
   ```

##########
File path: docs/dev/python/user-guide/table/python_table_api_connectors.md
##########
@@ -0,0 +1,194 @@
+---
+title: "Connectors"
+nav-parent_id: python_tableapi
+nav-pos: 130
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+
+This page describes how to use connectors in PyFlink and highlights the different parts between using connectors in PyFlink vs Java/Scala. 

Review comment:
       ```suggestion
   This page describes how to use connectors in PyFlink and highlights the details to be aware of when using Flink connectors in Python programs.
   ```

##########
File path: docs/dev/python/user-guide/table/python_table_api_connectors.md
##########
@@ -0,0 +1,194 @@
+---
+title: "Connectors"
+nav-parent_id: python_tableapi
+nav-pos: 130
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+
+This page describes how to use connectors in PyFlink and highlights the different parts between using connectors in PyFlink vs Java/Scala. 
+
+* This will be replaced by the TOC
+{:toc}
+
+<span class="label label-info">Note</span>For general connector information and common configuration, please refer to the corresponding [Java/Scala documentation]({{ site.baseurl }}/dev/table/connectors/index.html). 
+
+## Download connector and format jars
+
+For both connectors and formats, implementations are available as jars that need to be specified as job [dependencies]({{ site.baseurl }}/dev/python/user-guide/table/dependency_management.html).
+
+{% highlight python %}
+
+table_env.get_config().get_configuration().set_string("pipeline.jars", "file:///my/jar/path/connector.jar;file:///my/jar/path/json.jar")
+
+{% endhighlight %}
+
+## How to use connectors
+
+In PyFlink's Table API, DDL is the recommended way to define sources and sinks, executed via the `execute_sql()` method on the `TableEnvironment`. This makes the table available for use by the application.
+
+{% highlight python %}
+
+source_ddl = """
+        CREATE TABLE source_table(
+            a VARCHAR,
+            b INT
+        ) WITH (
+          'connector.type' = 'kafka',
+          'connector.version' = 'universal',
+          'connector.topic' = 'source_topic',
+          'connector.properties.bootstrap.servers' = 'kafka:9092',
+          'connector.properties.group.id' = 'test_3',
+          'connector.startup-mode' = 'latest-offset',
+          'format.type' = 'json'
+        )
+        """
+
+sink_ddl = """
+        CREATE TABLE sink_table(
+            a VARCHAR
+        ) WITH (
+          'connector.type' = 'kafka',
+          'connector.version' = 'universal',
+          'connector.topic' = 'sink_topic',
+          'connector.properties.bootstrap.servers' = 'kafka:9092',
+          'format.type' = 'json'
+        )
+        """
+
+t_env.execute_sql(source_ddl)
+t_env.execute_sql(sink_ddl)
+
+t_env.sql_query("select a from source_table") \
+    .insert_into("sink_table")
+    
+{% endhighlight %}
+
+Below is a complete example of how to use the Kafka and Json format in PyFlink.
+
+{% highlight python %}
+
+from pyflink.datastream import StreamExecutionEnvironment, TimeCharacteristic
+from pyflink.table import StreamTableEnvironment, EnvironmentSettings
+
+
+def log_processing():
+    env = StreamExecutionEnvironment.get_execution_environment()
+    env_settings = EnvironmentSettings.Builder().use_blink_planner().build()
+    t_env = StreamTableEnvironment.create(stream_execution_environment=env, environment_settings=env_settings)
+    t_env.get_config().get_configuration().set_boolean("python.fn-execution.memory.managed", True)
+    # specify connector and format jars
+    t_env.get_config().get_configuration().set_string("pipeline.jars", "file:///my/jar/path/connector.jar;file:///my/jar/path/json.jar")
+    
+    source_ddl = """
+            CREATE TABLE source_table(
+                a VARCHAR,
+                b INT
+            ) WITH (
+              'connector.type' = 'kafka',
+              'connector.version' = 'universal',
+              'connector.topic' = 'source_topic',
+              'connector.properties.bootstrap.servers' = 'kafka:9092',
+              'connector.properties.group.id' = 'test_3',
+              'connector.startup-mode' = 'latest-offset',
+              'format.type' = 'json'
+            )
+            """
+
+    sink_ddl = """
+            CREATE TABLE sink_table(
+                a VARCHAR
+            ) WITH (
+              'connector.type' = 'kafka',
+              'connector.version' = 'universal',
+              'connector.topic' = 'sink_topic',
+              'connector.properties.bootstrap.servers' = 'kafka:9092',
+              'format.type' = 'json'
+            )
+            """
+
+    t_env.execute_sql(source_ddl)
+    t_env.execute_sql(sink_ddl)
+
+    t_env.sql_query("select a from source_table") \
+        .insert_into("sink_table")
+
+    t_env.execute("payment_demo")
+
+
+if __name__ == '__main__':
+    log_processing()
+{% endhighlight %}
+
+
+## Predefined Sources and Sinks
+
+A few basic data sources and sinks are built into Flink and are always available. The predefined data sources include reading from Pandas DataFrame, or ingesting data from collections. The predefined data sinks support writing to Pandas DataFrame.
+
+### from/to Pandas
+
+PyFlink Tables support conversion to and from Pandas DataFrame.
+
+{% highlight python %}
+
+import pandas as pd
+import numpy as np
+
+# Create a PyFlink Table
+pdf = pd.DataFrame(np.random.rand(1000, 2))
+table = t_env.from_pandas(pdf, ["a", "b"]).filter("a > 0.5")
+
+# Convert the PyFlink Table to a Pandas DataFrame
+pdf = table.to_pandas()
+{% endhighlight %}
+
+### from_elements()
+
+`from_elements()` is used to creates a table from a collection of elements. The elements types must be acceptable atomic types or acceptable composite types. 

Review comment:
       ```suggestion
   `from_elements()` is used to create a table from a collection of elements. The element types must be acceptable atomic types or acceptable composite types. 
   ```
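
   As a side note, a self-contained sketch of the schema variant shown in this file (assuming a `table_env` created as elsewhere on this page; `DataTypes` is imported from `pyflink.table`):

   ```python
   from pyflink.table import DataTypes

   # create a table with an explicit schema and inspect it as a Pandas DataFrame
   table = table_env.from_elements(
       [(1, 'Hi'), (2, 'Hello')],
       DataTypes.ROW([DataTypes.FIELD("a", DataTypes.INT()),
                      DataTypes.FIELD("b", DataTypes.STRING())]))
   print(table.to_pandas())
   ```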

##########
File path: docs/dev/python/user-guide/table/python_table_api_connectors.md
##########
@@ -0,0 +1,194 @@
+---
+title: "Connectors"
+nav-parent_id: python_tableapi
+nav-pos: 130
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+
+This page describes how to use connectors in PyFlink and highlights the different parts between using connectors in PyFlink vs Java/Scala. 

Review comment:
       "and highlights the different parts between using connectors in PyFlink vs Java/Scala."
   
   I think this doesn't really reflect what is being explained in this page and would suggest:

##########
File path: docs/dev/python/user-guide/table/python_table_api_connectors.md
##########
@@ -0,0 +1,194 @@
+---
+title: "Connectors"
+nav-parent_id: python_tableapi
+nav-pos: 130
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+
+This page describes how to use connectors in PyFlink and highlights the different parts between using connectors in PyFlink vs Java/Scala. 
+
+* This will be replaced by the TOC
+{:toc}
+
+<span class="label label-info">Note</span>For general connector information and common configuration, please refer to the corresponding [Java/Scala documentation]({{ site.baseurl }}/dev/table/connectors/index.html). 
+
+## Download connector and format jars
+
+For both connectors and formats, implementations are available as jars that need to be specified as job [dependencies]({{ site.baseurl }}/dev/python/user-guide/table/dependency_management.html).
+
+{% highlight python %}
+
+table_env.get_config().get_configuration().set_string("pipeline.jars", "file:///my/jar/path/connector.jar;file:///my/jar/path/json.jar")
+
+{% endhighlight %}
+
+## How to use connectors
+
+In PyFlink's Table API, DDL is the recommended way to define sources and sinks, executed via the `execute_sql()` method on the `TableEnvironment`. This makes the table available for use by the application.
+
+{% highlight python %}
+
+source_ddl = """
+        CREATE TABLE source_table(
+            a VARCHAR,
+            b INT
+        ) WITH (
+          'connector.type' = 'kafka',
+          'connector.version' = 'universal',
+          'connector.topic' = 'source_topic',
+          'connector.properties.bootstrap.servers' = 'kafka:9092',
+          'connector.properties.group.id' = 'test_3',
+          'connector.startup-mode' = 'latest-offset',
+          'format.type' = 'json'
+        )
+        """
+
+sink_ddl = """
+        CREATE TABLE sink_table(
+            a VARCHAR
+        ) WITH (
+          'connector.type' = 'kafka',
+          'connector.version' = 'universal',
+          'connector.topic' = 'sink_topic',
+          'connector.properties.bootstrap.servers' = 'kafka:9092',
+          'format.type' = 'json'
+        )
+        """
+
+t_env.execute_sql(source_ddl)
+t_env.execute_sql(sink_ddl)
+
+t_env.sql_query("select a from source_table") \
+    .insert_into("sink_table")
+    
+{% endhighlight %}
+
+Below is a complete example of how to use the Kafka and Json format in PyFlink.

Review comment:
       ```suggestion
   Below is a complete example of how to use a Kafka source and the JSON format in PyFlink.
   ```

##########
File path: docs/dev/python/user-guide/table/python_table_api_connectors.md
##########
@@ -0,0 +1,194 @@
+---
+title: "Connectors"
+nav-parent_id: python_tableapi
+nav-pos: 130
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+
+This page describes how to use connectors in PyFlink and highlights the different parts between using connectors in PyFlink vs Java/Scala. 
+
+* This will be replaced by the TOC
+{:toc}
+
+<span class="label label-info">Note</span>For general connector information and common configuration, please refer to the corresponding [Java/Scala documentation]({{ site.baseurl }}/dev/table/connectors/index.html). 
+
+## Download connector and format jars
+
+For both connectors and formats, implementations are available as jars that need to be specified as job [dependencies]({{ site.baseurl }}/dev/python/user-guide/table/dependency_management.html).
+
+{% highlight python %}
+
+table_env.get_config().get_configuration().set_string("pipeline.jars", "file:///my/jar/path/connector.jar;file:///my/jar/path/json.jar")
+
+{% endhighlight %}
+
+## How to use connectors
+
+In PyFlink's Table API, DDL is the recommended way to define sources and sinks, executed via the `execute_sql()` method on the `TableEnvironment`. This makes the table available for use by the application.
+
+{% highlight python %}
+
+source_ddl = """
+        CREATE TABLE source_table(
+            a VARCHAR,
+            b INT
+        ) WITH (
+          'connector.type' = 'kafka',
+          'connector.version' = 'universal',
+          'connector.topic' = 'source_topic',
+          'connector.properties.bootstrap.servers' = 'kafka:9092',
+          'connector.properties.group.id' = 'test_3',
+          'connector.startup-mode' = 'latest-offset',
+          'format.type' = 'json'
+        )
+        """
+
+sink_ddl = """
+        CREATE TABLE sink_table(
+            a VARCHAR
+        ) WITH (
+          'connector.type' = 'kafka',
+          'connector.version' = 'universal',
+          'connector.topic' = 'sink_topic',
+          'connector.properties.bootstrap.servers' = 'kafka:9092',
+          'format.type' = 'json'
+        )
+        """
+
+t_env.execute_sql(source_ddl)
+t_env.execute_sql(sink_ddl)
+
+t_env.sql_query("select a from source_table") \

Review comment:
       Just a personal preference/suggestion, for readability.

##########
File path: docs/dev/python/user-guide/table/python_table_api_connectors.md
##########
@@ -0,0 +1,194 @@
+---
+title: "Connectors"
+nav-parent_id: python_tableapi
+nav-pos: 130
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+
+This page describes how to use connectors in PyFlink and highlights the different parts between using connectors in PyFlink vs Java/Scala. 
+
+* This will be replaced by the TOC
+{:toc}
+
+<span class="label label-info">Note</span>For general connector information and common configuration, please refer to the corresponding [Java/Scala documentation]({{ site.baseurl }}/dev/table/connectors/index.html). 
+
+## Download connector and format jars
+
+For both connectors and formats, implementations are available as jars that need to be specified as job [dependencies]({{ site.baseurl }}/dev/python/user-guide/table/dependency_management.html).
+
+{% highlight python %}
+
+table_env.get_config().get_configuration().set_string("pipeline.jars", "file:///my/jar/path/connector.jar;file:///my/jar/path/json.jar")
+
+{% endhighlight %}
+
+## How to use connectors
+
+In PyFlink's Table API, DDL is the recommended way to define sources and sinks, executed via the `execute_sql()` method on the `TableEnvironment`. This makes the table available for use by the application.
+
+{% highlight python %}
+
+source_ddl = """
+        CREATE TABLE source_table(
+            a VARCHAR,
+            b INT
+        ) WITH (
+          'connector.type' = 'kafka',
+          'connector.version' = 'universal',
+          'connector.topic' = 'source_topic',
+          'connector.properties.bootstrap.servers' = 'kafka:9092',
+          'connector.properties.group.id' = 'test_3',
+          'connector.startup-mode' = 'latest-offset',
+          'format.type' = 'json'
+        )
+        """
+
+sink_ddl = """
+        CREATE TABLE sink_table(
+            a VARCHAR
+        ) WITH (
+          'connector.type' = 'kafka',
+          'connector.version' = 'universal',
+          'connector.topic' = 'sink_topic',
+          'connector.properties.bootstrap.servers' = 'kafka:9092',
+          'format.type' = 'json'
+        )
+        """
+
+t_env.execute_sql(source_ddl)
+t_env.execute_sql(sink_ddl)
+
+t_env.sql_query("select a from source_table") \
+    .insert_into("sink_table")
+    
+{% endhighlight %}
+
+Below is a complete example of how to use the Kafka and Json format in PyFlink.
+
+{% highlight python %}
+
+from pyflink.datastream import StreamExecutionEnvironment, TimeCharacteristic
+from pyflink.table import StreamTableEnvironment, EnvironmentSettings
+
+
+def log_processing():
+    env = StreamExecutionEnvironment.get_execution_environment()
+    env_settings = EnvironmentSettings.Builder().use_blink_planner().build()
+    t_env = StreamTableEnvironment.create(stream_execution_environment=env, environment_settings=env_settings)
+    t_env.get_config().get_configuration().set_boolean("python.fn-execution.memory.managed", True)
+    # specify connector and format jars
+    t_env.get_config().get_configuration().set_string("pipeline.jars", "file:///my/jar/path/connector.jar;file:///my/jar/path/json.jar")
+    
+    source_ddl = """
+            CREATE TABLE source_table(
+                a VARCHAR,
+                b INT
+            ) WITH (
+              'connector.type' = 'kafka',
+              'connector.version' = 'universal',
+              'connector.topic' = 'source_topic',
+              'connector.properties.bootstrap.servers' = 'kafka:9092',
+              'connector.properties.group.id' = 'test_3',
+              'connector.startup-mode' = 'latest-offset',
+              'format.type' = 'json'
+            )
+            """
+
+    sink_ddl = """
+            CREATE TABLE sink_table(
+                a VARCHAR
+            ) WITH (
+              'connector.type' = 'kafka',
+              'connector.version' = 'universal',
+              'connector.topic' = 'sink_topic',
+              'connector.properties.bootstrap.servers' = 'kafka:9092',
+              'format.type' = 'json'
+            )
+            """
+
+    t_env.execute_sql(source_ddl)
+    t_env.execute_sql(sink_ddl)
+
+    t_env.sql_query("select a from source_table") \
+        .insert_into("sink_table")
+
+    t_env.execute("payment_demo")
+
+
+if __name__ == '__main__':
+    log_processing()
+{% endhighlight %}
+
+
+## Predefined Sources and Sinks
+
+A few basic data sources and sinks are built into Flink and are always available. The predefined data sources include reading from Pandas DataFrame, or ingesting data from collections. The predefined data sinks support writing to Pandas DataFrame.
+
+### from/to Pandas
+
+PyFlink Tables support conversion to and from Pandas DataFrame.
+
+{% highlight python %}
+
+import pandas as pd
+import numpy as np
+
+# Create a PyFlink Table
+pdf = pd.DataFrame(np.random.rand(1000, 2))
+table = t_env.from_pandas(pdf, ["a", "b"]).filter("a > 0.5")
+
+# Convert the PyFlink Table to a Pandas DataFrame
+pdf = table.to_pandas()
+{% endhighlight %}
+
+### from_elements()
+
+`from_elements()` is used to creates a table from a collection of elements. The elements types must be acceptable atomic types or acceptable composite types. 
+
+{% highlight python %}
+
+table_env.from_elements([(1, 'Hi'), (2, 'Hello')])
+
+# use the second parameter to specify custom field names
+table_env.from_elements([(1, 'Hi'), (2, 'Hello')], ['a', 'b'])
+
+# use the second parameter to specify custom table schema

Review comment:
       ```suggestion
   # use the second parameter to specify a custom table schema
   ```

##########
File path: docs/dev/python/user-guide/table/python_table_api_connectors.md
##########
@@ -0,0 +1,194 @@
+---
+title: "Connectors"
+nav-parent_id: python_tableapi
+nav-pos: 130
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+
+This page describes how to use connectors in PyFlink and highlights the different parts between using connectors in PyFlink vs Java/Scala. 
+
+* This will be replaced by the TOC
+{:toc}
+
+<span class="label label-info">Note</span>For general connector information and common configuration, please refer to the corresponding [Java/Scala documentation]({{ site.baseurl }}/dev/table/connectors/index.html). 
+
+## Download connector and format jars
+
+For both connectors and formats, implementations are available as jars that need to be specified as job [dependencies]({{ site.baseurl }}/dev/python/user-guide/table/dependency_management.html).
+
+{% highlight python %}
+
+table_env.get_config().get_configuration().set_string("pipeline.jars", "file:///my/jar/path/connector.jar;file:///my/jar/path/json.jar")
+
+{% endhighlight %}
+
+## How to use connectors
+
+In PyFlink's Table API, DDL is the recommended way to define sources and sinks, executed via the `execute_sql()` method on the `TableEnvironment`. This makes the table available for use by the application.
+
+{% highlight python %}
+
+source_ddl = """
+        CREATE TABLE source_table(
+            a VARCHAR,
+            b INT
+        ) WITH (
+          'connector.type' = 'kafka',
+          'connector.version' = 'universal',
+          'connector.topic' = 'source_topic',
+          'connector.properties.bootstrap.servers' = 'kafka:9092',
+          'connector.properties.group.id' = 'test_3',
+          'connector.startup-mode' = 'latest-offset',
+          'format.type' = 'json'
+        )
+        """
+
+sink_ddl = """
+        CREATE TABLE sink_table(
+            a VARCHAR
+        ) WITH (
+          'connector.type' = 'kafka',
+          'connector.version' = 'universal',
+          'connector.topic' = 'sink_topic',
+          'connector.properties.bootstrap.servers' = 'kafka:9092',
+          'format.type' = 'json'
+        )
+        """
+
+t_env.execute_sql(source_ddl)
+t_env.execute_sql(sink_ddl)
+
+t_env.sql_query("select a from source_table") \
+    .insert_into("sink_table")
+    
+{% endhighlight %}
+
+Below is a complete example of how to use the Kafka and Json format in PyFlink.
+
+{% highlight python %}
+
+from pyflink.datastream import StreamExecutionEnvironment, TimeCharacteristic
+from pyflink.table import StreamTableEnvironment, EnvironmentSettings
+
+
+def log_processing():
+    env = StreamExecutionEnvironment.get_execution_environment()
+    env_settings = EnvironmentSettings.Builder().use_blink_planner().build()
+    t_env = StreamTableEnvironment.create(stream_execution_environment=env, environment_settings=env_settings)
+    t_env.get_config().get_configuration().set_boolean("python.fn-execution.memory.managed", True)
+    # specify connector and format jars
+    t_env.get_config().get_configuration().set_string("pipeline.jars", "file:///my/jar/path/connector.jar;file:///my/jar/path/json.jar")
+    
+    source_ddl = """
+            CREATE TABLE source_table(
+                a VARCHAR,
+                b INT
+            ) WITH (
+              'connector.type' = 'kafka',
+              'connector.version' = 'universal',
+              'connector.topic' = 'source_topic',
+              'connector.properties.bootstrap.servers' = 'kafka:9092',
+              'connector.properties.group.id' = 'test_3',
+              'connector.startup-mode' = 'latest-offset',
+              'format.type' = 'json'
+            )
+            """
+
+    sink_ddl = """
+            CREATE TABLE sink_table(
+                a VARCHAR
+            ) WITH (
+              'connector.type' = 'kafka',
+              'connector.version' = 'universal',
+              'connector.topic' = 'sink_topic',
+              'connector.properties.bootstrap.servers' = 'kafka:9092',
+              'format.type' = 'json'
+            )
+            """
+
+    t_env.execute_sql(source_ddl)
+    t_env.execute_sql(sink_ddl)
+
+    t_env.sql_query("select a from source_table") \
+        .insert_into("sink_table")
+
+    t_env.execute("payment_demo")
+
+
+if __name__ == '__main__':
+    log_processing()
+{% endhighlight %}
+
+
+## Predefined Sources and Sinks
+
+A few basic data sources and sinks are built into Flink and are always available. The predefined data sources include reading from Pandas DataFrame, or ingesting data from collections. The predefined data sinks support writing to Pandas DataFrame.

Review comment:
       IMO, using "basic" and such words may downplay the technology to the user.

##########
File path: docs/dev/python/user-guide/table/python_table_api_connectors.md
##########
@@ -0,0 +1,194 @@
+---
+title: "Connectors"
+nav-parent_id: python_tableapi
+nav-pos: 130
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+
+This page describes how to use connectors in PyFlink and highlights the different parts between using connectors in PyFlink vs Java/Scala. 
+
+* This will be replaced by the TOC
+{:toc}
+
+<span class="label label-info">Note</span>For general connector information and common configuration, please refer to the corresponding [Java/Scala documentation]({{ site.baseurl }}/dev/table/connectors/index.html). 
+
+## Download connector and format jars
+
+For both connectors and formats, implementations are available as jars that need to be specified as job [dependencies]({{ site.baseurl }}/dev/python/user-guide/table/dependency_management.html).
+
+{% highlight python %}
+
+table_env.get_config().get_configuration().set_string("pipeline.jars", "file:///my/jar/path/connector.jar;file:///my/jar/path/json.jar")
+
+{% endhighlight %}
+
+## How to use connectors
+
+In PyFlink's Table API, DDL is the recommended way to define sources and sinks, executed via the `execute_sql()` method on the `TableEnvironment`. This makes the table available for use by the application.
+
+{% highlight python %}
+
+source_ddl = """
+        CREATE TABLE source_table(
+            a VARCHAR,
+            b INT
+        ) WITH (
+          'connector.type' = 'kafka',
+          'connector.version' = 'universal',
+          'connector.topic' = 'source_topic',
+          'connector.properties.bootstrap.servers' = 'kafka:9092',
+          'connector.properties.group.id' = 'test_3',
+          'connector.startup-mode' = 'latest-offset',
+          'format.type' = 'json'
+        )
+        """
+
+sink_ddl = """
+        CREATE TABLE sink_table(
+            a VARCHAR
+        ) WITH (
+          'connector.type' = 'kafka',
+          'connector.version' = 'universal',
+          'connector.topic' = 'sink_topic',
+          'connector.properties.bootstrap.servers' = 'kafka:9092',
+          'format.type' = 'json'
+        )
+        """
+
+t_env.execute_sql(source_ddl)
+t_env.execute_sql(sink_ddl)
+
+t_env.sql_query("select a from source_table") \
+    .insert_into("sink_table")
+    
+{% endhighlight %}
+
+Below is a complete example of how to use the Kafka and Json format in PyFlink.
+
+{% highlight python %}
+
+from pyflink.datastream import StreamExecutionEnvironment, TimeCharacteristic
+from pyflink.table import StreamTableEnvironment, EnvironmentSettings
+
+
+def log_processing():
+    env = StreamExecutionEnvironment.get_execution_environment()
+    env_settings = EnvironmentSettings.Builder().use_blink_planner().build()
+    t_env = StreamTableEnvironment.create(stream_execution_environment=env, environment_settings=env_settings)
+    t_env.get_config().get_configuration().set_boolean("python.fn-execution.memory.managed", True)
+    # specify connector and format jars
+    t_env.get_config().get_configuration().set_string("pipeline.jars", "file:///my/jar/path/connector.jar;file:///my/jar/path/json.jar")
+    
+    source_ddl = """
+            CREATE TABLE source_table(
+                a VARCHAR,
+                b INT
+            ) WITH (
+              'connector.type' = 'kafka',
+              'connector.version' = 'universal',
+              'connector.topic' = 'source_topic',
+              'connector.properties.bootstrap.servers' = 'kafka:9092',
+              'connector.properties.group.id' = 'test_3',
+              'connector.startup-mode' = 'latest-offset',
+              'format.type' = 'json'
+            )
+            """
+
+    sink_ddl = """
+            CREATE TABLE sink_table(
+                a VARCHAR
+            ) WITH (
+              'connector.type' = 'kafka',
+              'connector.version' = 'universal',
+              'connector.topic' = 'sink_topic',
+              'connector.properties.bootstrap.servers' = 'kafka:9092',
+              'format.type' = 'json'
+            )
+            """
+
+    t_env.execute_sql(source_ddl)
+    t_env.execute_sql(sink_ddl)
+
+    t_env.sql_query("select a from source_table") \

Review comment:
       ```suggestion
       t_env.sql_query("SELECT a FROM source_table") \
       ```
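
       (Flink SQL keywords are case-insensitive, so both spellings parse identically; uppercase keywords are purely a readability convention.)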




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] hequn8128 commented on a change in pull request #13193: [FLINK-18918][python][docs] Add dedicated connector documentation for Python Table API

Posted by GitBox <gi...@apache.org>.
hequn8128 commented on a change in pull request #13193:
URL: https://github.com/apache/flink/pull/13193#discussion_r472592715



##########
File path: docs/dev/python/user-guide/table/python_table_api_connectors.md
##########
@@ -0,0 +1,194 @@
+---
+title: "Connectors"
+nav-parent_id: python_tableapi
+nav-pos: 130
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+
+* This will be replaced by the TOC
+{:toc}
+
+This page describes how to use connectors in PyFlink. Its main purpose is to highlight the differences between using connectors in PyFlink and in Java/Scala. Below, we will walk you through how to use connectors with a concrete example that uses the Kafka connector and the JSON format.
+
+<span class="label label-info">Note</span> For the parts of connector usage that PyFlink shares with Java/Scala, you can refer to the [Java/Scala documentation]({{ site.baseurl }}/dev/table/connectors/index.html) for more details.
+
+## Download connector and format jars
+
+Suppose you are using the Kafka connector and the JSON format. You first need to download the [Kafka]({{ site.baseurl }}/dev/table/connectors/kafka.html) and [JSON](https://repo.maven.apache.org/maven2/org/apache/flink/flink-json/) jars. Once the connector and format jars have been downloaded locally, specify them with the [Dependency Management]({{ site.baseurl }}/dev/python/user-guide/table/dependency_management.html) APIs.
+
+{% highlight python %}
+
+table_env.get_config().get_configuration().set_string("pipeline.jars", "file:///my/jar/path/connector.jar;file:///my/jar/path/json.jar")
+
+{% endhighlight %}
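
For the Kafka-plus-JSON setup above, the call might look like the following (the jar names are illustrative; the exact artifacts depend on your Flink and Scala versions):

{% highlight python %}

# Illustrative artifact names; align the versions with your Flink distribution.
table_env.get_config().get_configuration().set_string("pipeline.jars", "file:///my/jar/path/flink-sql-connector-kafka_2.11-1.11.1.jar;file:///my/jar/path/flink-json-1.11.1.jar")

{% endhighlight %}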
+
+## How to use connectors
+
+In the Table API of PyFlink, DDL is recommended to define sources and sinks. You can use the `execute_sql()` method on the `TableEnvironment` to register sources and sinks with DDL. After that, you can select from the source table and insert into the sink table.

Review comment:
       Maybe `DDL is the recommended way ...`?




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] dianfu commented on pull request #13193: [FLINK-18918][python][docs] Add dedicated connector documentation for Python Table API

Posted by GitBox <gi...@apache.org>.
dianfu commented on pull request #13193:
URL: https://github.com/apache/flink/pull/13193#issuecomment-689245491


   @sjwiesman So sorry to ping you again. Could you take a look at this PR when it's convenient for you?


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] morsapaes commented on a change in pull request #13193: [FLINK-18918][python][docs] Add dedicated connector documentation for Python Table API

Posted by GitBox <gi...@apache.org>.
morsapaes commented on a change in pull request #13193:
URL: https://github.com/apache/flink/pull/13193#discussion_r474700929



##########
File path: docs/dev/python/user-guide/table/python_table_api_connectors.md
##########
@@ -0,0 +1,194 @@
+---
+title: "Connectors"
+nav-parent_id: python_tableapi
+nav-pos: 130
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+
+This page describes how to use connectors in PyFlink and highlights the differences between using connectors in PyFlink and in Java/Scala.
+
+* This will be replaced by the TOC
+{:toc}
+
+<span class="label label-info">Note</span> For general connector information and common configuration, please refer to the corresponding [Java/Scala documentation]({{ site.baseurl }}/dev/table/connectors/index.html).
+
+## Download connector and format jars
+
+For both connectors and formats, implementations are available as jars that need to be specified as job [dependencies]({{ site.baseurl }}/dev/python/user-guide/table/dependency_management.html).
+
+{% highlight python %}
+
+table_env.get_config().get_configuration().set_string("pipeline.jars", "file:///my/jar/path/connector.jar;file:///my/jar/path/json.jar")
+
+{% endhighlight %}
+
+## How to use connectors
+
+In PyFlink's Table API, DDL is the recommended way to define sources and sinks, executed via the `execute_sql()` method on the `TableEnvironment`. This makes the table available for use by the application.
+
+{% highlight python %}
+
+source_ddl = """
+        CREATE TABLE source_table(
+            a VARCHAR,
+            b INT
+        ) WITH (
+          'connector.type' = 'kafka',
+          'connector.version' = 'universal',
+          'connector.topic' = 'source_topic',
+          'connector.properties.bootstrap.servers' = 'kafka:9092',
+          'connector.properties.group.id' = 'test_3',
+          'connector.startup-mode' = 'latest-offset',
+          'format.type' = 'json'
+        )
+        """
+
+sink_ddl = """
+        CREATE TABLE sink_table(
+            a VARCHAR
+        ) WITH (
+          'connector.type' = 'kafka',
+          'connector.version' = 'universal',
+          'connector.topic' = 'sink_topic',
+          'connector.properties.bootstrap.servers' = 'kafka:9092',
+          'format.type' = 'json'
+        )
+        """
+
+t_env.execute_sql(source_ddl)
+t_env.execute_sql(sink_ddl)
+
+t_env.sql_query("select a from source_table") \
+    .insert_into("sink_table")
+    
+{% endhighlight %}
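
The `WITH` options above use the legacy `connector.type` property keys. Flink 1.11 also supports a shorter option style for the Kafka connector; a sketch of an equivalent source definition (same topic and brokers assumed):

{% highlight python %}

# Equivalent source DDL written with the newer option keys (Flink 1.11+).
source_ddl_new_style = """
        CREATE TABLE source_table(
            a VARCHAR,
            b INT
        ) WITH (
          'connector' = 'kafka',
          'topic' = 'source_topic',
          'properties.bootstrap.servers' = 'kafka:9092',
          'properties.group.id' = 'test_3',
          'scan.startup.mode' = 'latest-offset',
          'format' = 'json'
        )
        """

{% endhighlight %}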
+
+Below is a complete example of how to use the Kafka connector and JSON format in PyFlink.
+
+{% highlight python %}
+
+from pyflink.datastream import StreamExecutionEnvironment, TimeCharacteristic
+from pyflink.table import StreamTableEnvironment, EnvironmentSettings
+
+
+def log_processing():
+    env = StreamExecutionEnvironment.get_execution_environment()
+    env_settings = EnvironmentSettings.Builder().use_blink_planner().build()
+    t_env = StreamTableEnvironment.create(stream_execution_environment=env, environment_settings=env_settings)
+    t_env.get_config().get_configuration().set_boolean("python.fn-execution.memory.managed", True)
+    # specify connector and format jars
+    t_env.get_config().get_configuration().set_string("pipeline.jars", "file:///my/jar/path/connector.jar;file:///my/jar/path/json.jar")
+    
+    source_ddl = """
+            CREATE TABLE source_table(
+                a VARCHAR,
+                b INT
+            ) WITH (
+              'connector.type' = 'kafka',
+              'connector.version' = 'universal',
+              'connector.topic' = 'source_topic',
+              'connector.properties.bootstrap.servers' = 'kafka:9092',
+              'connector.properties.group.id' = 'test_3',
+              'connector.startup-mode' = 'latest-offset',
+              'format.type' = 'json'
+            )
+            """
+
+    sink_ddl = """
+            CREATE TABLE sink_table(
+                a VARCHAR
+            ) WITH (
+              'connector.type' = 'kafka',
+              'connector.version' = 'universal',
+              'connector.topic' = 'sink_topic',
+              'connector.properties.bootstrap.servers' = 'kafka:9092',
+              'format.type' = 'json'
+            )
+            """
+
+    t_env.execute_sql(source_ddl)
+    t_env.execute_sql(sink_ddl)
+
+    t_env.sql_query("select a from source_table") \

Review comment:
       Just a personal preference/suggestion, for readability.




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #13193: [FLINK-18918][python][docs] Add dedicated connector documentation for Python Table API

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #13193:
URL: https://github.com/apache/flink/pull/13193#issuecomment-675460565


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "3218091590ebac41d0351ff99aad0b2e6dc29cb6",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5682",
       "triggerID" : "3218091590ebac41d0351ff99aad0b2e6dc29cb6",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ec8c4b3b4e25ecbe5e1397e30bd4aef7900c38af",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5726",
       "triggerID" : "ec8c4b3b4e25ecbe5e1397e30bd4aef7900c38af",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6ca3029fb50b58d12bbfb87fbe79012195ba0b10",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5785",
       "triggerID" : "6ca3029fb50b58d12bbfb87fbe79012195ba0b10",
       "triggerType" : "PUSH"
     }, {
       "hash" : "dae6f793cca0d71b2aaaf860759a6f03506f9cc0",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5929",
       "triggerID" : "dae6f793cca0d71b2aaaf860759a6f03506f9cc0",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * dae6f793cca0d71b2aaaf860759a6f03506f9cc0 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=5929) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run travis` re-run the last Travis build
    - `@flinkbot run azure` re-run the last Azure build
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot commented on pull request #13193: [FLINK-18918][python][docs] Add dedicated connector documentation for Python Table API

Posted by GitBox <gi...@apache.org>.
flinkbot commented on pull request #13193:
URL: https://github.com/apache/flink/pull/13193#issuecomment-675449504


   Thanks a lot for your contribution to the Apache Flink project. I'm the @flinkbot. I help the community
   to review your pull request. We will use this comment to track the progress of the review.
   
   
   ## Automated Checks
   Last check on commit 3218091590ebac41d0351ff99aad0b2e6dc29cb6 (Tue Aug 18 12:30:48 UTC 2020)
   
    ✅ no warnings
   
   <sub>Mention the bot in a comment to re-run the automated checks.</sub>
   ## Review Progress
   
   * ❓ 1. The [description] looks good.
   * ❓ 2. There is [consensus] that the contribution should go into to Flink.
   * ❓ 3. Needs [attention] from.
   * ❓ 4. The change fits into the overall [architecture].
   * ❓ 5. Overall code [quality] is good.
   
   Please see the [Pull Request Review Guide](https://flink.apache.org/contributing/reviewing-prs.html) for a full explanation of the review process.<details>
    The Bot is tracking the review progress through labels. Labels are applied according to the order of the review items. For consensus, approval by a Flink committer or PMC member is required <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot approve description` to approve one or more aspects (aspects: `description`, `consensus`, `architecture` and `quality`)
    - `@flinkbot approve all` to approve all aspects
    - `@flinkbot approve-until architecture` to approve everything until `architecture`
    - `@flinkbot attention @username1 [@username2 ..]` to require somebody's attention
    - `@flinkbot disapprove architecture` to remove an approval you gave earlier
   </details>


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org