Posted to commits@flink.apache.org by ma...@apache.org on 2022/03/09 10:54:48 UTC

[flink] branch master updated: [FLINK-25129][docs] Improvements to the table-planner-loader related docs (#18812)

This is an automated email from the ASF dual-hosted git repository.

martijnvisser pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/master by this push:
     new e35385a  [FLINK-25129][docs] Improvements to the table-planner-loader related docs (#18812)
e35385a is described below

commit e35385a76f9ea67d766ba590c459384dbf02a777
Author: Francesco Guardiani <fr...@gmail.com>
AuthorDate: Wed Mar 9 11:54:07 2022 +0100

    [FLINK-25129][docs] Improvements to the table-planner-loader related docs (#18812)
    
    [FLINK-25129][docs] Improvements to the project configuration docs. This closes #18812
---
 docs/README.md                                   |   7 +-
 docs/content/docs/dev/configuration/advanced.md  |  54 ++++++------
 docs/content/docs/dev/configuration/connector.md |  61 ++++++++------
 docs/content/docs/dev/configuration/maven.md     |   7 +-
 docs/content/docs/dev/configuration/overview.md  |  69 +++++++++++-----
 docs/content/docs/dev/configuration/testing.md   |  64 +++-----------
 docs/content/docs/dev/table/sourcesSinks.md      |  20 +++--
 docs/layouts/shortcodes/artifact_gradle.html     |  56 +++++++++++++
 docs/layouts/shortcodes/artifact_tabs.html       | 101 +++++++++++++++++++++++
 9 files changed, 304 insertions(+), 135 deletions(-)

diff --git a/docs/README.md b/docs/README.md
index a3edac1..e850eed 100644
--- a/docs/README.md
+++ b/docs/README.md
@@ -119,7 +119,12 @@ It includes a number of optional flags:
 
 * withScalaVersion: Includes the scala version suffix to the artifact id
 * withTestScope: Includes `<scope>test</scope>` to the module. Useful for marking test dependencies.
-* withTestClassifier: Includes `<classifier>tests</classifier>`. Useful when users should be pulling in Flinks tests dependencies. This is mostly for the test harnesses and probably not what you want. 
+* withTestClassifier: Includes `<classifier>tests</classifier>`. Useful when users should be pulling in Flink's test dependencies. This is mostly for the test harnesses and probably not what you want.
+
+You can also use the shortcodes (with the same flags) instead:
+
+* `artifact_gradle` to show the Gradle syntax
+* `artifact_tabs` to create a tabbed view, showing both Maven and Gradle syntax
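
For example, embedding the tabbed view in a docs page might look like this (the artifact name is illustrative):

```
{{< artifact_tabs flink-connector-kafka withScalaVersion >}}
```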
 
 #### Back to Top
 
diff --git a/docs/content/docs/dev/configuration/advanced.md b/docs/content/docs/dev/configuration/advanced.md
index 9164d08..ccd192b 100644
--- a/docs/content/docs/dev/configuration/advanced.md
+++ b/docs/content/docs/dev/configuration/advanced.md
@@ -24,33 +24,29 @@ under the License.
 
 # Advanced Configuration Topics
 
-## Dependencies: Flink Core and User Application
-
-There are two broad categories of dependencies and libraries in Flink, which are explained below.
-
-### Flink Core Dependencies
+## Anatomy of the Flink distribution
 
 Flink itself consists of a set of classes and dependencies that form the core of Flink's runtime
 and must be present when a Flink application is started. The classes and dependencies needed to run
 the system handle areas such as coordination, networking, checkpointing, failover, APIs,
 operators (such as windowing), resource management, etc.
 
-These core classes and dependencies are packaged in the `flink-dist` jar, are part of Flink's `lib`
-folder, and part of the basic Flink container images. You can think of these dependencies as similar
-to Java's core library, which contains classes like `String` and `List`.
+These core classes and dependencies are packaged in the `flink-dist.jar`, which is available in the `/lib`
+folder in the downloaded distribution and is part of the basic Flink container images. 
+You can think of these dependencies as similar to Java's core library, which contains classes like `String` and `List`.
 
 In order to keep the core dependencies as small as possible and avoid dependency clashes, the
 Flink Core Dependencies do not contain any connectors or libraries (i.e. CEP, SQL, ML) in order to
 avoid having an excessive default number of classes and dependencies in the classpath.
 
-### User Application Dependencies
+The `/lib` directory of the Flink distribution additionally contains various JARs, including commonly used modules, 
+such as all the required [modules to execute Table jobs](#anatomy-of-table-dependencies) and a set of connectors and formats.
+These are loaded by default and can be removed from the classpath simply by deleting them from the `/lib` folder.
 
-These dependencies include all connectors, formats, or libraries that a specific user application
-needs and explicitly do not include the Flink DataStream and Table APIs and runtime dependencies 
-since those are already part of the Flink core dependencies.
+Flink also ships additional optional dependencies under the `/opt` folder, 
+which can be enabled by moving their JARs into the `/lib` folder.
 
-The user application is typically packaged into an *application jar*, which contains the application
-code and the required connector and library dependencies.
+For more information about classloading, refer to the section on [Classloading in Flink]({{< ref "docs/ops/debugging/debugging_classloading.md" >}}).
 
 ## Scala Versions
 
@@ -84,12 +80,14 @@ The Flink distribution contains by default the required JARs to execute Flink SQ
 in particular:
 
 - `flink-table-api-java-uber-{{< version >}}.jar` &#8594; contains all the Java APIs 
-- `flink-table-runtime-{{< version >}}.jar` &#8594; contains the runtime
+- `flink-table-runtime-{{< version >}}.jar` &#8594; contains the table runtime
 - `flink-table-planner-loader-{{< version >}}.jar` &#8594; contains the query planner
 
-**Note:** Previously, these JARs were all packaged into `flink-table.jar`. Since Flink 1.15, this has 
+{{< hint warning >}}
+Previously, these JARs were all packaged into `flink-table.jar`. Since Flink 1.15, this has 
 now been split into three JARs in order to allow users to swap the `flink-table-planner-loader-{{< version >}}.jar` 
 with `flink-table-planner{{< scala_version >}}-{{< version >}}.jar`.
+{{< /hint >}}
 
 While Table Java API artifacts are built into the distribution, Table Scala API artifacts are not 
 included by default. When using formats and connectors with the Flink Scala API, you need to either 
@@ -102,20 +100,28 @@ For more details, check out how to [connect to external systems]({{< ref "docs/c
 
 Starting from Flink 1.15, the distribution contains two planners:
 
--`flink-table-planner{{< scala_version >}}-{{< version >}}.jar`, in `/opt`, contains the query planner
--`flink-table-planner-loader-{{< version >}}.jar`, loaded by default in `/lib`, contains the query planner 
+- `flink-table-planner{{< scala_version >}}-{{< version >}}.jar`, in `/opt`, contains the query planner
+- `flink-table-planner-loader-{{< version >}}.jar`, loaded by default in `/lib`, contains the query planner 
  hidden behind an isolated classpath (you won't be able to address any `org.apache.flink.table.planner` directly)
 
-The planners contain the same code, but they are packaged differently. In one case, you must use the 
-same Scala version of the JAR. In the other, you do not need to make considerations about Scala, since
+The two planner JARs contain the same code, but they are packaged differently. In the first case, the JAR must match 
+your Scala version. In the second case, you do not need to make considerations about Scala, since
 it is hidden inside the JAR.
 
-If you need to access and use the internals of the query planner, you can swap the JARs (copying and
-pasting them in the downloaded distribution). Be aware that you will be constrained to using the Scala 
-version of the Flink distribution that you are using.
+By default, `flink-table-planner-loader` is used by the distribution. If you need to access and use the internals of the query planner, 
+you can swap the JARs (copying `flink-table-planner{{< scala_version >}}.jar` into the distribution's `/lib` folder). 
+Be aware that you will then be constrained to the Scala version of the Flink distribution that you are using.
 
-**Note:** The two planners cannot co-exist at the same time in the classpath. If you load both of them
+{{< hint danger >}}
+The two planners cannot co-exist at the same time in the classpath. If you load both of them
 in `/lib` your Table Jobs will fail.
+{{< /hint >}}
+
+{{< hint warning >}}
+In upcoming Flink versions, we will stop shipping the `flink-table-planner{{< scala_version >}}` artifact in the Flink distribution. 
+We strongly suggest migrating your jobs and your custom connectors/formats to work with the API modules, without relying on planner internals. 
+If you need some functionality from the planner, which is currently not exposed through the API modules, please open a ticket in order to discuss it with the community.
+{{< /hint >}}
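
The swap described above amounts to a couple of file operations. A minimal sketch, assuming default JAR names and a `FLINK_HOME` pointing at the unpacked distribution (a stand-in layout is created here so the sketch is runnable):

```shell
# Sketch only: swap the opaque planner loader for the Scala-versioned planner.
# FLINK_HOME and the JAR names below are assumptions; match them to your distribution.
FLINK_HOME="${FLINK_HOME:-./flink-dist}"
mkdir -p "$FLINK_HOME/lib" "$FLINK_HOME/opt"                     # stand-in layout for the sketch
touch "$FLINK_HOME/lib/flink-table-planner-loader-1.15.0.jar"    # shipped in /lib by default
touch "$FLINK_HOME/opt/flink-table-planner_2.12-1.15.0.jar"      # shipped in /opt

# Remove the loader first: the two planners must never co-exist in /lib.
rm "$FLINK_HOME/lib/flink-table-planner-loader-1.15.0.jar"
cp "$FLINK_HOME/opt/flink-table-planner_2.12-1.15.0.jar" "$FLINK_HOME/lib/"
```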
 
 ## Hadoop Dependencies
 
diff --git a/docs/content/docs/dev/configuration/connector.md b/docs/content/docs/dev/configuration/connector.md
index d6779e1..25be453 100644
--- a/docs/content/docs/dev/configuration/connector.md
+++ b/docs/content/docs/dev/configuration/connector.md
@@ -1,5 +1,5 @@
 ---
-title: "Dependencies: Connectors and Formats"
+title: "Connectors and Formats"
 weight: 5
 type: docs
 ---
@@ -24,39 +24,46 @@ under the License.
 
 # Connectors and Formats
 
-Flink can read from and write to various external systems via connectors and define the format in 
-which to store the data.
+Flink applications can read from and write to various external systems via connectors.
+Flink supports multiple formats for encoding and decoding data to and from its internal data structures.
 
-The way that information is serialized is represented in the external system and that system needs
-to know how to read this data in a format that can be read by Flink.  This is done through format 
-dependencies.
+An overview of available connectors and formats is available for both
+[DataStream]({{< ref "docs/connectors/datastream/overview.md" >}}) and
+[Table API/SQL]({{< ref "docs/connectors/table/overview.md" >}}).
 
-Most applications need specific connectors to run. Flink provides a set of formats that can be used 
-with connectors (with the dependencies for both being fairly unified). These are not part of Flink's 
-core dependencies and must be added as dependencies to the application.
+## Available artifacts
 
-## Adding Dependencies 
+In order to use connectors and formats, you need to make sure Flink has access to the artifacts implementing them. 
+For each connector supported by the Flink community, we publish two artifacts on [Maven Central](https://search.maven.org):
 
-For more information on how to add dependencies, refer to the build tools sections on [Maven]({{< ref "docs/dev/configuration/maven" >}})
-and [Gradle]({{< ref "docs/dev/configuration/gradle" >}}). 
+* `flink-connector-<NAME>`, a thin JAR including only the connector code, but excluding any third-party dependencies
+* `flink-sql-connector-<NAME>`, a ready-to-use uber JAR bundling all of the connector's third-party dependencies
 
-## Packaging Dependencies
+The same applies to formats. Note that some connectors may not have a corresponding 
+`flink-sql-connector-<NAME>` artifact because they do not require third-party dependencies.
 
-We recommend packaging the application code and all its required dependencies into one fat/uber JAR. 
-This job JAR can be submitted to an already running Flink cluster, or added to a Flink application 
-container image.
+{{< hint info >}}
+The uber/fat JARs are mostly intended for use with the [SQL client]({{< ref "docs/dev/table/sqlClient" >}}),
+but you can also use them in any DataStream/Table application.
+{{< /hint >}}
 
-On [Maven Central](https://search.maven.org), we publish connectors named "flink-connector-<NAME>" and
-"flink-sql-connector-<NAME>". The former are thin JARs while the latter are uber JARs.
+## Using artifacts
 
-In order to use the uber JARs, you can shade them (including and renaming dependencies to create a 
-private copy) in the uber JAR of your Flink job, or you can add them to the `/lib` folder of the 
-distribution.
+In order to use a connector/format module, you can either:
 
-If you shade a dependency, you will have more control over the dependency version in the job JAR. 
-In case of shading the thin JAR, you will have even more control over the transitive dependencies, 
-since you can change the versions without changing the connector version (binary compatibility permitting).
+* Shade the thin JAR and its transitive dependencies in your job JAR
+* Shade the uber JAR in your job JAR
+* Copy the uber JAR directly into the `/lib` folder of the Flink distribution
+
+For shading dependencies, check out the specific [Maven]({{< ref "docs/dev/configuration/maven" >}}) 
+and [Gradle]({{< ref "docs/dev/configuration/gradle" >}}) guides. 
+For a reference about the Flink distribution, check [Anatomy of the Flink distribution]({{< ref "docs/dev/configuration/overview" >}}#anatomy-of-the-flink-distribution).
 
-If you include uber JARs directly in the distribution, this can simplify the management of dependencies 
-in a shared multi-job Flink cluster, but it also means that you will lock in a specific version of the 
-dependency.
+{{< hint info >}}
+Whether to shade the thin JAR, shade the uber JAR, or include the dependency in the distribution depends on your use case.
+If you shade a dependency, you will have more control over the dependency version in the job JAR.
+If you shade the thin JAR, you will have even more control over the transitive dependencies,
+since you can change their versions without changing the connector version (binary compatibility permitting).
+If you embed the connector uber JAR directly in the Flink distribution's `/lib` folder,
+you will be able to control the connector version for all jobs in one place.
+{{< /hint >}}
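
To make the first two options concrete, a hypothetical Maven sketch (connector name and version are placeholders):

```xml
<!-- Thin JAR: connector code only; your build resolves and shades
     the transitive third-party dependencies itself. -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-connector-kafka</artifactId>
    <version>1.15.0</version>
</dependency>

<!-- Uber JAR alternative: all third-party dependencies come bundled. -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-sql-connector-kafka</artifactId>
    <version>1.15.0</version>
</dependency>
```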
diff --git a/docs/content/docs/dev/configuration/maven.md b/docs/content/docs/dev/configuration/maven.md
index cf6a9f4..ec11599 100644
--- a/docs/content/docs/dev/configuration/maven.md
+++ b/docs/content/docs/dev/configuration/maven.md
@@ -24,10 +24,9 @@ under the License.
 
 # How to use Maven to configure your project
 
-You will likely need a build tool to configure your Flink project. This guide will show you how to 
-do so with [Maven](https://maven.apache.org), an open-source build automation tool developed by the 
-Apache Group that enables you to build, publish, and deploy projects. You can use it to manage the 
-entire lifecycle of your software project.
+This guide will show you how to configure a Flink job project with [Maven](https://maven.apache.org), 
+an open-source build automation tool developed by the Apache Software Foundation that enables you to build, 
+publish, and deploy projects. You can use it to manage the entire lifecycle of your software project.
 
 ## Requirements
 
diff --git a/docs/content/docs/dev/configuration/overview.md b/docs/content/docs/dev/configuration/overview.md
index 4ee3a6b..4ba7f83 100644
--- a/docs/content/docs/dev/configuration/overview.md
+++ b/docs/content/docs/dev/configuration/overview.md
@@ -38,15 +38,15 @@ under the License.
 
 # Project Configuration
 
-Every Flink application depends on a set of Flink libraries. At a minimum, the application depends
-on the Flink APIs and, in addition, on certain connector libraries (i.e. Kafka, Cassandra).
-When running Flink applications in the IDE, add a provided dependency to the [Flink runtime library](https://mvnrepository.com/artifact/org.apache.flink/flink-runtime).
-
 The guides in this section will show you how to configure your projects via popular build tools
 ([Maven]({{< ref "docs/dev/configuration/maven" >}}), [Gradle]({{< ref "docs/dev/configuration/gradle" >}})),
-add the necessary dependencies (i.e. [connectors and formats]({{< ref "docs/dev/configuration/connector" >}}), 
-[testing]({{< ref "docs/dev/configuration/testing" >}})), and cover some 
-[advanced]({{< ref "docs/dev/configuration/advanced" >}}) configuration topics. 
+add the necessary dependencies (i.e. [connectors and formats]({{< ref "docs/dev/configuration/connector" >}}),
+[testing]({{< ref "docs/dev/configuration/testing" >}})), and cover some
+[advanced]({{< ref "docs/dev/configuration/advanced" >}}) configuration topics.
+
+Every Flink application depends on a set of Flink libraries. At a minimum, the application depends
+on the Flink APIs and, in addition, on certain connector libraries (i.e. Kafka, Cassandra) and 
+the third-party dependencies the user needs to develop custom functions for processing the data.
 
 ## Getting started
 
@@ -130,7 +130,7 @@ configurations {
 dependencies {
     // --------------------------------------------------------------
     // Compile-time dependencies that should NOT be part of the
-    // shadow jar and are provided in the lib folder of Flink
+    // shadow (uber) jar and are provided in the lib folder of Flink
     // --------------------------------------------------------------
     implementation "org.apache.flink:flink-streaming-java:${flinkVersion}"
     implementation "org.apache.flink:flink-clients:${flinkVersion}"
@@ -177,19 +177,46 @@ bash -c "$(curl https://flink.apache.org/q/gradle-quickstart.sh)" -- {{< version
 
 ## Which dependencies do you need?
 
-Depending on what you want to achieve, you are going to choose a combination of our available APIs, 
-which will require different dependencies. 
+To start working on a Flink job, you usually need the following dependencies:
+
+* Flink APIs, in order to develop your job
+* [Connectors and formats]({{< ref "docs/dev/configuration/connector" >}}), in order to integrate your job with external systems
+* [Testing utilities]({{< ref "docs/dev/configuration/testing" >}}), in order to test your job
+
+In addition to these, you might want to add the third-party dependencies you need to develop custom functions.
+
+### Flink APIs
+
+Flink offers two major APIs: the [DataStream API]({{< ref "docs/dev/datastream/overview" >}}) and the [Table API & SQL]({{< ref "docs/dev/table/overview" >}}). 
+They can be used separately, or they can be mixed, depending on your use cases:
+
+| APIs you want to use                                                              | Dependency you need to add                          |
+|-----------------------------------------------------------------------------------|-----------------------------------------------------|
+| [DataStream]({{< ref "docs/dev/datastream/overview" >}})                          | `flink-streaming-java`                              |  
+| [DataStream with Scala]({{< ref "docs/dev/datastream/scala_api_extensions" >}})   | `flink-streaming-scala{{< scala_version >}}`        |   
+| [Table API]({{< ref "docs/dev/table/common" >}})                                  | `flink-table-api-java`                              |   
+| [Table API with Scala]({{< ref "docs/dev/table/common" >}})                       | `flink-table-api-scala{{< scala_version >}}`        |
+| [Table API + DataStream]({{< ref "docs/dev/table/data_stream_api" >}})            | `flink-table-api-java-bridge`                       |
+| [Table API + DataStream with Scala]({{< ref "docs/dev/table/data_stream_api" >}}) | `flink-table-api-scala-bridge{{< scala_version >}}` |
+
+Just include them in your build tool script/descriptor, and you can start developing your job!
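
For instance, a minimal Maven sketch of pulling in the Table API (the version is a placeholder):

```xml
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-api-java</artifactId>
    <version>1.15.0</version>
</dependency>
```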
+
+## Running and packaging
+
+If you want to run your job by simply executing the main class, you will need `flink-runtime` in your classpath.
+In case of Table API programs, you will also need `flink-table-runtime` and `flink-table-planner-loader`.
 
-Here is a table of artifact/dependency names:
+As a rule of thumb, we **suggest** packaging the application code and all its required dependencies into one fat/uber JAR.
+This includes packaging connectors, formats, and third-party dependencies of your job.
+This rule **does not apply** to the Flink APIs (Java or Scala) and the aforementioned runtime modules, 
+which are already provided by Flink itself and **should not** be included in a job uber JAR.
+This job JAR can be submitted to an already running Flink cluster, or added to a Flink application
+container image, without modifying the distribution.
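
Under Maven, the `provided` scope keeps such a module out of the uber JAR while still making it available at compile time. A minimal sketch (the version is a placeholder):

```xml
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-java</artifactId>
    <version>1.15.0</version>
    <!-- provided: on the compile classpath, supplied by the Flink
         distribution at runtime, excluded from the uber JAR -->
    <scope>provided</scope>
</dependency>
```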
 
-| APIs you want to use              | Dependency you need to add    |
-|-----------------------------------|-------------------------------|
-| DataStream                        | flink-streaming-java          |  
-| DataStream with Scala             | flink-streaming-scala{{< scala_version >}}         |   
-| Table API                         | flink-table-api-java          |   
-| Table API with Scala              | flink-table-api-scala{{< scala_version >}}         |
-| Table API + DataStream            | flink-table-api-java-bridge   |
-| Table API + DataStream with Scala | flink-table-api-scala-bridge{{< scala_version >}}  |
+## What's next?
 
-Check out the sections on [Datastream API]({{< ref "docs/dev/datastream/overview" >}}) and 
-[Table API & SQL]({{< ref "docs/dev/table/overview" >}}) to learn more.
+* To start developing your job, check out [DataStream API]({{< ref "docs/dev/datastream/overview" >}}) and [Table API & SQL]({{< ref "docs/dev/table/overview" >}}).
+* For more details on how to package your job depending on the build tools, check out the following specific guides:
+  * [Maven]({{< ref "docs/dev/configuration/maven" >}})
+  * [Gradle]({{< ref "docs/dev/configuration/gradle" >}})
+* For more advanced topics about project configuration, check out the section on [advanced topics]({{< ref "docs/dev/configuration/advanced" >}}).
diff --git a/docs/content/docs/dev/configuration/testing.md b/docs/content/docs/dev/configuration/testing.md
index a64c972..44782ad 100644
--- a/docs/content/docs/dev/configuration/testing.md
+++ b/docs/content/docs/dev/configuration/testing.md
@@ -26,65 +26,27 @@ under the License.
 
 Flink provides utilities for testing your job that you can add as dependencies.
 
-## DataStream API Test Dependencies
+## DataStream API Testing
 
-You need to add the following dependencies if you want to develop tests for a job built with the 
+You need to add the following dependencies if you want to develop tests for a job built with the
 DataStream API:
 
-{{< tabs "datastream test" >}}
+{{< artifact_tabs flink-test-utils withTestScope >}}
 
-{{< tab "Maven" >}}
-Open the `pom.xml` file in your project directory and add these dependencies in between the dependencies tab.
-{{< artifact flink-test-utils withTestScope >}}
-{{< artifact flink-runtime withTestScope >}}
-{{< /tab >}}
-
-{{< tab "Gradle" >}}
-Open the `build.gradle` file in your project directory and add the following in the dependencies block.
-```gradle
-...
-dependencies {
-    ...  
-    testImplementation "org.apache.flink:flink-test-utils:${flinkVersion}"
-    testImplementation "org.apache.flink:flink-runtime:${flinkVersion}"
-    ...
-}
-...
-```
-**Note:** This assumes that you have created your project using our Gradle build script or quickstart script.
-{{< /tab >}}
-
-{{< /tabs >}}
+Among the various test utilities, this module provides `MiniCluster`, a lightweight, configurable Flink cluster that runs inside a JUnit test and can execute jobs directly.
 
 For more information on how to use these utilities, check out the section on [DataStream API testing]({{< ref "docs/dev/datastream/testing" >}})
 
-## Table Program Test Dependencies
-
-If you want to test the Table API & SQL programs locally within your IDE, you can add the following 
-dependency:
+## Table API Testing
 
-{{< tabs "table test" >}}
+If you want to test the Table API & SQL programs locally within your IDE, you can add the following
+dependency, in addition to the aforementioned `flink-test-utils`:
 
-{{< tab "Maven" >}}
-Open the `pom.xml` file in your project directory and add this dependency in between the dependencies tab.
-{{< artifact flink-table-test-utils withTestScope >}}
-{{< /tab >}}
+{{< artifact_tabs flink-table-test-utils withTestScope >}}
 
-{{< tab "Gradle" >}}
-Open the `build.gradle` file in your project directory and add the following in the dependencies block.
-```gradle
-...
-dependencies {
-    ...  
-    testImplementation "org.apache.flink:flink-table-test-utils:${flinkVersion}"
-    ...
-}
-...
-```
-**Note:** This assumes that you have created your project using our Gradle build script or quickstart script.
-{{< /tab >}}
-
-{{< /tabs >}}
-
-This will automatically bring in the query planner and the runtime, required respectively to plan 
+This will automatically bring in the query planner and the runtime, required respectively to plan
 and execute the queries.
+
+{{< hint info >}}
+The module `flink-table-test-utils` was introduced in Flink 1.15 and is considered experimental.
+{{< /hint >}}
diff --git a/docs/content/docs/dev/table/sourcesSinks.md b/docs/content/docs/dev/table/sourcesSinks.md
index 3d3313c..b11bd89 100644
--- a/docs/content/docs/dev/table/sourcesSinks.md
+++ b/docs/content/docs/dev/table/sourcesSinks.md
@@ -113,19 +113,25 @@ Project Configuration
 If you want to implement a custom connector or a custom format, the following dependency is usually 
 sufficient:
 
-{{< artifact flink-table-common withProvidedScope >}}
+{{< artifact_tabs flink-table-common withProvidedScope >}}
 
 If you want to develop a connector that needs to bridge with DataStream APIs (i.e. if you want to adapt
 a DataStream connector to the Table API), you need to add this dependency:
 
-{{< artifact flink-table-api-java-bridge withProvidedScope >}}
+{{< artifact_tabs flink-table-api-java-bridge withProvidedScope >}}
 
-When shipping the connector/format, we suggest providing both a thin JAR and an uber JAR. This way, 
-users can easily load the uber JAR in the SQL client or in the Flink distribution and start using it.
-
-**Note:** None of the table dependencies listed above should be packaged in the uber JAR since they 
-are already provided by the Flink distribution.
+When developing the connector/format, we suggest shipping both a thin JAR and an uber JAR, so users 
+can easily load the uber JAR in the SQL client or in the Flink distribution and start using it.
+The uber JAR should include all the third-party dependencies of the connector, 
+excluding the table dependencies listed above.
 
+{{< hint warning >}}
+You should not depend on `flink-table-planner{{< scala_version >}}` in production code.
+With the new module `flink-table-planner-loader` introduced in Flink 1.15, the 
+application's classpath will not have direct access to `org.apache.flink.table.planner` classes anymore. 
+If you need a feature available only internally within the `org.apache.flink.table.planner` package and subpackages, please open an issue.
+To learn more, check out [Anatomy of Table Dependencies]({{< ref "docs/dev/configuration/advanced" >}}#anatomy-of-table-dependencies).
+{{< /hint >}}
 
 Extension Points
 ----------------
diff --git a/docs/layouts/shortcodes/artifact_gradle.html b/docs/layouts/shortcodes/artifact_gradle.html
new file mode 100644
index 0000000..4ab4dd6
--- /dev/null
+++ b/docs/layouts/shortcodes/artifact_gradle.html
@@ -0,0 +1,56 @@
+{{/*
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+*/}}
+{{/* 
+  Generates the gradle snippet for the gradle artifact.
+  IMPORTANT: the whitespace is relevant. Do not change without looking at the 
+  rendered documentation. 
+*/}}
+{{ $scalaVersion := "" }}
+{{ $testScope := "" }}
+{{ $providedScope := "" }}
+{{ $testClassifier := "" }}
+
+{{ $artifactId := .Get 0 }}
+
+{{ $path := .Page.Path }}
+
+{{ range after 1 .Params }}
+  {{ if eq . "withScalaVersion" }}
+    {{ $scalaVersion = "true" }}
+  {{ else if eq . "withTestScope" }}
+    {{ $testScope = "true" }}
+  {{ else if eq . "withProvidedScope" }}
+    {{ $providedScope = "true" }}
+  {{ else if eq . "withTestClassifier" }}
+    {{ $testClassifier = "true" }}
+  {{ else }}
+    {{ errorf "%q: Invalid use of artifact shortcode. Unknown flag `%s`" $path . }}
+  {{ end }}
+{{ end }}
+
+{{ $hash := md5 now }}
+
+{{ if ne $scalaVersion "" }}
+  {{ $artifactId = printf "%s%s" $artifactId $.Site.Params.ScalaVersion }}
+{{ end }}
+
+<div id="{{ $hash }}" onclick="selectTextAndCopy('{{ $hash }}')" class="highlight"><pre class="chroma"><code class="language-gradle" data-lang="gradle">{{ if ne $testScope "" }}testCompile{{ else if ne $providedScope "" }}runtime{{ else }}flinkShadowJar{{ end }} "org.apache.flink:{{- $artifactId -}}:{{- site.Params.Version -}}{{ if ne $testClassifier "" }}:tests{{ end }}"</code></pre></div>
+<div class="book-hint info" style="text-align:center;display:none" copyable="flink-module" copyattribute="{{ $hash }}">
+  Copied to clipboard!
+</div> 
diff --git a/docs/layouts/shortcodes/artifact_tabs.html b/docs/layouts/shortcodes/artifact_tabs.html
new file mode 100644
index 0000000..fdbff37
--- /dev/null
+++ b/docs/layouts/shortcodes/artifact_tabs.html
@@ -0,0 +1,101 @@
+{{/*
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+*/}}
+{{/* 
+  Generates a tabbed snippet with Maven and Gradle dependencies.
+  IMPORTANT: the whitespace is relevant. Do not change without looking at the 
+  rendered documentation. 
+*/}}
+
+{{ $scalaVersion := "" }}
+{{ $testScope := "" }}
+{{ $providedScope := "" }}
+{{ $testClassifier := "" }}
+
+{{ $artifactId := .Get 0 }}
+{{ $tabGroup := printf "tabs-%s" $artifactId }}
+
+{{ $path := .Page.Path }}
+
+{{ range after 1 .Params }}
+    {{ if eq . "withScalaVersion" }}
+        {{ $scalaVersion = "true" }}
+    {{ else if eq . "withTestScope" }}
+        {{ $testScope = "true" }}
+    {{ else if eq . "withProvidedScope" }}
+        {{ $providedScope = "true" }}
+    {{ else if eq . "withTestClassifier" }}
+        {{ $testClassifier = "true" }}
+    {{ else }}
+        {{ errorf "%q: Invalid use of artifact shortcode. Unknown flag `%s`" $path . }}
+    {{ end }}
+{{ end }}
+
+{{ $hash := md5 now }}
+
+{{ if ne $scalaVersion "" }}
+    {{ $artifactId = printf "%s%s" $artifactId $.Site.Params.ScalaVersion }}
+{{ end }}
+
+<div class="book-tabs">
+    <input
+            type="radio"
+            class="toggle"
+            data-tab-group="flink-tabs"
+            data-tab-item="Maven"
+            name="{{ $tabGroup }}"
+            id="{{ printf "%s-%d" $tabGroup 0 }}"
+            checked="checked"
+            onclick="onSwitch('Maven')"
+            />
+    <label for="{{ printf "%s-%d" $tabGroup 0 }}">Maven</label>
+    <div class="book-tabs-content markdown-inner">
+        Open the <code class="highlighter-rouge">pom.xml</code> file in your project directory and add the following in the dependencies block.
+<div id="{{ $hash }}" onclick="selectTextAndCopy('{{ $hash }}')" class="highlight"><pre class="chroma"><code class="language-xml" data-lang="xml"><span class="nt">&lt;dependency&gt;</span>
+    <span class="nt">&lt;groupId&gt;</span>org.apache.flink<span class="nt">&lt;/groupId&gt;</span>
+    <span class="nt">&lt;artifactId&gt;</span>{{- $artifactId -}}<span class="nt">&lt;/artifactId&gt;</span>
+    <span class="nt">&lt;version&gt;</span>{{- site.Params.Version -}}<span class="nt">&lt;/version&gt;</span>{{ if ne $testScope "" }}
+    <span class="nt">&lt;scope&gt;</span>test<span class="nt">&lt;/scope&gt;</span>{{ end }}{{ if ne $providedScope "" }}
+    <span class="nt">&lt;scope&gt;</span>provided<span class="nt">&lt;/scope&gt;</span>{{ end }}{{ if ne $testClassifier "" }}
+    <span class="nt">&lt;classifier&gt;</span>tests<span class="nt">&lt;/classifier&gt;</span>{{ end }}
+<span class="nt">&lt;/dependency&gt;</span></code></pre></div>
+        <div class="book-hint info" style="text-align:center;display:none" copyable="flink-module" copyattribute="{{ $hash }}">
+            Copied to clipboard!
+        </div>
+        Check out <a href="{{.Site.BaseURL}}{{.Site.LanguagePrefix}}/docs/dev/configuration/overview/">Project configuration</a> for more details.
+    </div>
+    <input
+            type="radio"
+            class="toggle"
+            data-tab-group="flink-tabs"
+            data-tab-item="Gradle"
+            name="{{ $tabGroup }}"
+            id="{{ printf "%s-%d" $tabGroup 1 }}"
+            onclick="onSwitch('Gradle')"
+            />
+    <label for="{{ printf "%s-%d" $tabGroup 1 }}">Gradle</label>
+    <div class="book-tabs-content markdown-inner">
+        Open the <code class="highlighter-rouge">build.gradle</code> file in your project directory and add the following in the dependencies block.
+        <div id="{{ $hash }}" onclick="selectTextAndCopy('{{ $hash }}')" class="highlight"><pre class="chroma"><code class="language-gradle" data-lang="gradle">{{ if ne $testScope "" }}testCompile{{ else if ne $providedScope "" }}runtime{{ else }}flinkShadowJar{{ end }} "org.apache.flink:{{- $artifactId -}}:{{- site.Params.Version -}}{{ if ne $testClassifier "" }}:tests{{ end }}"</code></pre></div>
+        <div class="book-hint info" style="text-align:center;display:none" copyable="flink-module" copyattribute="{{ $hash }}">
+            Copied to clipboard!
+        </div>
+        <b>Note:</b> This assumes that you have created your project using our Gradle build script or quickstart script.<br/>
+        Check out <a href="{{.Site.BaseURL}}{{.Site.LanguagePrefix}}/docs/dev/configuration/overview/">Project configuration</a> for more details.
+    </div>
+</div>