Posted to issues@flink.apache.org by GitBox <gi...@apache.org> on 2022/01/13 14:24:23 UTC

[GitHub] [flink] infoverload opened a new pull request #18353: [FLINK-25129][docs] project configuration changes in docs

infoverload opened a new pull request #18353:
URL: https://github.com/apache/flink/pull/18353


   <!--
   *Thank you very much for contributing to Apache Flink - we are happy that you want to help us improve Flink. To help the community review your contribution in the best possible way, please go through the checklist below, which will get the contribution into a shape in which it can be best reviewed.*
   
   *Please understand that we do not do this to make contributions to Flink a hassle. In order to uphold a high standard of quality for code contributions, while at the same time managing a large number of contributions, we need contributors to prepare the contributions well, and give reviewers enough contextual information for the review. Please also understand that contributions that do not follow this guide will take longer to review and thus typically be picked up with lower priority by the community.*
   
   ## Contribution Checklist
   
     - Make sure that the pull request corresponds to a [JIRA issue](https://issues.apache.org/jira/projects/FLINK/issues). Exceptions are made for typos in JavaDoc or documentation files, which need no JIRA issue.
     
     - Name the pull request in the form "[FLINK-XXXX] [component] Title of the pull request", where *FLINK-XXXX* should be replaced by the actual issue number. Skip *component* if you are unsure about which is the best component.
      Typo fixes that have no associated JIRA issue should be named following this pattern: `[hotfix] [docs] Fix typo in event time introduction` or `[hotfix] [javadocs] Expand JavaDoc for PunctuatedWatermarkGenerator`.
   
     - Fill out the template below to describe the changes contributed by the pull request. That will give reviewers the context they need to do the review.
     
     - Make sure that the change passes the automated tests, i.e., `mvn clean verify` passes. You can set up Azure Pipelines CI to do that following [this guide](https://cwiki.apache.org/confluence/display/FLINK/Azure+Pipelines#AzurePipelines-Tutorial:SettingupAzurePipelinesforaforkoftheFlinkrepository).
   
     - Each pull request should address only one issue, not mix up code from multiple issues.
     
     - Each commit in the pull request has a meaningful commit message (including the JIRA id)
   
     - Once all items of the checklist are addressed, remove the above text and this checklist, leaving only the filled out template below.
   
   
   **(The sections below can be removed for hotfixes of typos)**
   -->
   
   ## What is the purpose of the change
   
    A unified project configuration section for both DataStream and Table API users.
   
   
   ## Brief change log
   
    - The configuration pages have been split into multiple pages and moved into their own section under Application Development.
   
   
   ## Verifying this change
   
    Please make sure both new and modified tests in this PR follow the conventions defined in our code quality guide: https://flink.apache.org/contributing/code-style-and-quality-common.html#testing
   
   This change is already covered by existing tests, such as *(please describe tests)*.
   
   ## Does this pull request potentially affect one of the following parts:
   
     - Dependencies (does it add or upgrade a dependency): (yes / **no**)
     - The public API, i.e., is any changed class annotated with `@Public(Evolving)`: (yes / **no**)
     - The serializers: (yes / **no** / don't know)
     - The runtime per-record code paths (performance sensitive): (yes / **no** / don't know)
     - Anything that affects deployment or recovery: JobManager (and its components), Checkpointing, Kubernetes/Yarn, ZooKeeper: (yes / **no** / don't know)
     - The S3 file system connector: (yes / **no** / don't know)
   
   ## Documentation
   
     - Does this pull request introduce a new feature? (**yes** / no)
     - If yes, how is the feature documented? (not applicable / **docs** / JavaDocs / not documented)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs] project configuration changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 80fd50ad46865be06e2c83b2470fb3eb2d35cd96 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823) 
   * d0b2b188c37443b7bbda39af499398326cd56979 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs] project configuration changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 1bc80c1579db9e7f2d57112fdb31f0190f71db67 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277) 
   * 274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312) 
   * 26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] MartijnVisser commented on a change in pull request #18353: [FLINK-25129][docs] project configuration changes in docs

Posted by GitBox <gi...@apache.org>.
MartijnVisser commented on a change in pull request #18353:
URL: https://github.com/apache/flink/pull/18353#discussion_r799240257



##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -181,22 +171,6 @@ rootProject.name = 'quickstart'
 bash -c "$(curl https://flink.apache.org/q/gradle-quickstart.sh)" -- {{< version >}} {{< scala_version >}}
 ```
 {{< /tab >}}
-{{< tab "sbt" >}}
-You can scaffold a new Flink project with the following [giter8 template](https://github.com/tillrohrmann/flink-project.g8)
-and the `sbt new` command (which creates new build definitions from a template) or use the provided quickstart bash script.
-
-### sbt template
-
-```bash
-$ sbt new tillrohrmann/flink-project.g8

Review comment:
       We talked to @tillrohrmann about it - The SBT example was still targeting 0.13.13 while SBT 1.6.1 is the latest version. We decided to first remove it and create a follow-up ticket for adding an SBT example later. 




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] matriv commented on a change in pull request #18353: [FLINK-25129][docs] project configuration changes in docs

Posted by GitBox <gi...@apache.org>.
matriv commented on a change in pull request #18353:
URL: https://github.com/apache/flink/pull/18353#discussion_r798349236



##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -0,0 +1,219 @@
+---
+title: "Overview"
+weight: 1
+type: docs
+aliases:
+- /dev/project-configuration.html
+- /start/dependencies.html
+- /getting-started/project-setup/dependencies.html
+- /quickstart/java_api_quickstart.html
+- /dev/projectsetup/java_api_quickstart.html
+- /dev/linking_with_flink.html
+- /dev/linking.html
+- /dev/projectsetup/dependencies.html
+- /dev/projectsetup/java_api_quickstart.html
+- /getting-started/project-setup/java_api_quickstart.html
+- /dev/getting-started/project-setup/scala_api_quickstart.html
+- /getting-started/project-setup/scala_api_quickstart.html
+- /quickstart/scala_api_quickstart.html
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Project Configuration
+
+Every Flink application depends on a set of Flink libraries. At a minimum, the application depends
+on the Flink APIs and, in addition, on certain connector libraries (e.g. Kafka, Cassandra).
+When running Flink applications (either in a distributed deployment or locally for testing),
+the [Flink runtime library](https://ossindex.sonatype.org/component/pkg:maven/org.apache.flink/flink-runtime@1.14.3) 
+must be available.
+
+The guides in this section will show you how to configure your projects via popular build tools
+([Maven]({{< ref "docs/dev/configuration/maven" >}}), [Gradle]({{< ref "docs/dev/configuration/gradle" >}})),
+add the necessary dependencies (e.g. [connectors and formats]({{< ref "docs/dev/configuration/connector" >}}), 
+[testing]({{< ref "docs/dev/configuration/testing" >}})), and cover some 
+[advanced]({{< ref "docs/dev/configuration/advanced" >}}) configuration topics. 
+
+## Getting started
+
+To get started working on your Flink application, use the following commands, scripts, and templates 
+to create a Flink project.  
+
+{{< tabs "creating project" >}}
+{{< tab "Maven" >}}
+
+You can create a project based on an [Archetype](https://maven.apache.org/guides/introduction/introduction-to-archetypes.html)
+with the Maven command below or use the provided quickstart bash script.
+
+### Maven command
+```bash
+$ mvn archetype:generate                \
+  -DarchetypeGroupId=org.apache.flink   \
+  -DarchetypeArtifactId=flink-quickstart-java \
+  -DarchetypeVersion={{< version >}}
+```
+This allows you to name your newly created project and will interactively ask you for the groupId,
+artifactId, and package name.
+
+### Quickstart script
+```bash
+$ curl https://flink.apache.org/q/quickstart.sh | bash -s {{< version >}}
+```
+
+{{< /tab >}}
+{{< tab "Gradle" >}}
+You can create a project with a Gradle build script or use the provided quickstart bash script.
+
+### Gradle build script
+
+To execute these build configuration scripts, run the `gradle` command in the directory with these scripts.
+
+**build.gradle**
+
+```gradle
+plugins {
+    id 'java'
+    id 'application'
+    // shadow plugin to produce fat JARs
+    id 'com.github.johnrengelman.shadow' version '7.1.2'
+}
+// artifact properties
+group = 'org.myorg.quickstart'
+version = '0.1-SNAPSHOT'
+mainClassName = 'org.myorg.quickstart.StreamingJob'
+description = """Flink Quickstart Job"""
+ext {
+    javaVersion = '1.8'
+    flinkVersion = '{{< version >}}'
+    scalaBinaryVersion = '{{< scala_version >}}'
+    slf4jVersion = '1.7.32'
+    log4jVersion = '2.17.1'
+}
+sourceCompatibility = javaVersion
+targetCompatibility = javaVersion
+tasks.withType(JavaCompile) {
+	options.encoding = 'UTF-8'
+}
+applicationDefaultJvmArgs = ["-Dlog4j.configurationFile=log4j2.properties"]
+
+// declare where to find the dependencies of your project
+repositories {
+    mavenCentral()
+    maven { url "https://repository.apache.org/content/repositories/snapshots/" }
+}
+// NOTE: We cannot use "compileOnly" or "shadow" configurations since then we could not run code
+// in the IDE or with "gradle run". We also cannot exclude transitive dependencies from the
+// shadowJar yet (see https://github.com/johnrengelman/shadow/issues/159).
+// -> Explicitly define the libraries we want to be included in the "flinkShadowJar" configuration!
+configurations {
+    flinkShadowJar // dependencies which go into the shadowJar
+    // always exclude these (also from transitive dependencies) since they are provided by Flink
+    flinkShadowJar.exclude group: 'org.apache.flink', module: 'force-shading'
+    flinkShadowJar.exclude group: 'com.google.code.findbugs', module: 'jsr305'
+    flinkShadowJar.exclude group: 'org.slf4j'
+    flinkShadowJar.exclude group: 'org.apache.logging.log4j'
+}
+// declare the dependencies for your production and test code
+dependencies {
+    // --------------------------------------------------------------
+    // Compile-time dependencies that should NOT be part of the
+    // shadow jar and are provided in the lib folder of Flink
+    // --------------------------------------------------------------
+    compile "org.apache.flink:flink-streaming-java:${flinkVersion}"

Review comment:
       Here and next line it should be `implementation` instead of `compile`.
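
       A minimal sketch of what the corrected block could look like, assuming the same `flinkVersion` property defined earlier in this `build.gradle`; the "next line" is not shown in the excerpt, so the second artifact here is purely illustrative:

```gradle
dependencies {
    // Use the non-deprecated 'implementation' configuration instead of 'compile';
    // these core Flink dependencies are provided by the Flink distribution at runtime.
    implementation "org.apache.flink:flink-streaming-java:${flinkVersion}"
    // hypothetical second artifact, standing in for the elided next line of the excerpt
    implementation "org.apache.flink:flink-clients:${flinkVersion}"
}
```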




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] matriv commented on a change in pull request #18353: [FLINK-25129][docs] project configuration changes in docs

Posted by GitBox <gi...@apache.org>.
matriv commented on a change in pull request #18353:
URL: https://github.com/apache/flink/pull/18353#discussion_r798344325



##########
File path: docs/content/docs/dev/configuration/gradle.md
##########
@@ -0,0 +1,120 @@
+---
+title: "Using Gradle"
+weight: 3
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# How to use Gradle to configure your project
+
+You will likely need a build tool to configure your Flink project. This guide will show you how to
+do so with [Gradle](https://gradle.org), an open-source general-purpose build tool that can be used 
+to automate tasks in the development process.
+
+## Requirements
+
+- Gradle 7.x 
+- Java 8.x
+
+## Importing the project into your IDE
+
+Once the project folder and files have been created, we recommend that you import this project into
+your IDE for developing and testing.
+
+IntelliJ IDEA supports Gradle projects via the `Gradle` plugin.
+
+Eclipse does so via the [Eclipse Buildship](https://projects.eclipse.org/projects/tools.buildship)
+plugin (make sure to specify a Gradle version >= 3.0 in the last step of the import wizard; the `shadow`
+plugin requires it). You may also use [Gradle's IDE integration](https://docs.gradle.org/current/userguide/userguide.html#ide-integration)
+to create project files with Gradle.
+
+**Note**: The default JVM heap size for Java may be too small for Flink and you have to manually increase it.
+In Eclipse, choose `Run Configurations -> Arguments` and write into the `VM Arguments` box: `-Xmx800m`.
+In IntelliJ IDEA, the recommended way to change JVM options is via the `Help | Edit Custom VM Options` menu.
+See [this article](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties) for details.
+
+**Note on IntelliJ:** To make the applications run within IntelliJ IDEA, it is necessary to tick the
+`Include dependencies with "Provided" scope` box in the run configuration. If this option is not available
+(possibly due to using an older IntelliJ IDEA version), then a workaround is to create a test that
+calls the application's `main()` method.
+
+## Building the project
+
+If you want to __build/package your project__, go to your project directory and
+run the '`gradle clean shadowJar`' command.
+You will __find a JAR file__ that contains your application, plus connectors and libraries
+that you may have added as dependencies to the application: `build/libs/<project-name>-<version>-all.jar`.
+
+__Note:__ If you use a different class than *StreamingJob* as the application's main class / entry point,
+we recommend you change the `mainClassName` setting in the `build.gradle` file accordingly. That way, Flink
+can run the application from the JAR file without additionally specifying the main class.
+
+## Adding dependencies to the project
+
+Specify a dependency configuration in the dependencies block of your `build.gradle` file.
+
+For example, you can add the Kafka connector as a dependency like this:
+
+**build.gradle**
+
+```gradle
+...
+dependencies {
+    ...  
+    flinkShadowJar "org.apache.flink:flink-connector-kafka:${flinkVersion}"
+    ...
+}
+...
+```
+
+**Important:** Note that all these (core) dependencies should have their scope set to [*provided*](https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html#dependency-scope). This means that
+they are needed to compile against, but that they should not be packaged into the project's resulting
+application JAR file. If not set to *provided*, the best case scenario is that the resulting JAR
+becomes excessively large, because it also contains all Flink core dependencies. The worst case scenario
+is that the Flink core dependencies that are added to the application's JAR file clash with some of
+your own dependency versions (which is normally avoided through inverted classloading).
+
+To correctly package the dependencies into the application JAR, these application dependencies must
+be set to the *compile* scope.
+
+## Packaging the application
+
+Depending on your use case, you may need to package your Flink application in different ways before 
+it gets deployed to a Flink environment.
+
+If you want to create a JAR for a Flink Job and use only Flink dependencies without any third-party
+dependencies (e.g. using the filesystem connector with JSON format), you do not need to create an
+uber/fat JAR or shade any dependencies.
+
+If you want to create a JAR for a Flink Job and use external dependencies not built into the Flink

Review comment:
       And here the installShadowDist
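
       This presumably refers to the shadow plugin's `installShadowDist` task. A minimal sketch of how it is typically invoked, assuming the `application` and `shadow` plugins from the quickstart `build.gradle` shown earlier (the installed directory name depends on the project name):

```bash
# Build and install a runnable distribution that bundles the shadowed (fat) JAR
$ gradle installShadowDist

# The result lands under build/install/, typically something like:
#   build/install/<project-name>-shadow/bin/<project-name>     (start scripts)
#   build/install/<project-name>-shadow/lib/<project-name>-<version>-all.jar
```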




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs] project configuration changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495",
       "triggerID" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516",
       "triggerID" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "triggerType" : "PUSH"
     }, {
       "hash" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528",
       "triggerID" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30537",
       "triggerID" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1845471184d68e8edd89fd19a591030290695cf3",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30593",
       "triggerID" : "1845471184d68e8edd89fd19a591030290695cf3",
       "triggerType" : "PUSH"
     }, {
       "hash" : "e33ad31f47e113b480de1c4f6ac6efc40992084b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30651",
       "triggerID" : "e33ad31f47e113b480de1c4f6ac6efc40992084b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "78c5075dd300c8e74705afbb13b10377da61865a",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30675",
       "triggerID" : "78c5075dd300c8e74705afbb13b10377da61865a",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 78c5075dd300c8e74705afbb13b10377da61865a Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30675) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs] project configuration changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495",
       "triggerID" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516",
       "triggerID" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "triggerType" : "PUSH"
     }, {
       "hash" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528",
       "triggerID" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 50e221c6373f2774eeb8ec134df27caef83b2d40 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495) 
   * 8e5ae4098896a176d49d4b7319f69b6529412549 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516) 
   * 532af8b3b761f7b589500e5ead9b888f2370675a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528) 
   * 3de39d65f624fb6cf075e9e5ddfe005c197ee3dc UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] slinkydeveloper commented on a change in pull request #18353: [FLINK-25129][docs] project configuration changes in docs

Posted by GitBox <gi...@apache.org>.
slinkydeveloper commented on a change in pull request #18353:
URL: https://github.com/apache/flink/pull/18353#discussion_r784028895



##########
File path: docs/content/docs/connectors/table/kafka.md
##########
@@ -38,7 +38,7 @@ Dependencies
 {{< sql_download_table "kafka" >}}
 
 The Kafka connector is not currently part of the binary distribution.
-See how to link with it for cluster execution [here]({{< ref "docs/dev/datastream/project-configuration" >}}).
+See how to link with it for cluster execution [here]({{< ref "docs/dev/configuration" >}}).

Review comment:
       This sentence should be in the other sql connectors as well, except filesystem, which is included by default

##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -0,0 +1,97 @@
+---
+title: "Overview"
+weight: 1
+type: docs
+aliases:
+- /dev/project-configuration.html
+- /start/dependencies.html
+- /getting-started/project-setup/dependencies.html
+- /quickstart/java_api_quickstart.html
+- /dev/projectsetup/java_api_quickstart.html
+- /dev/linking_with_flink.html
+- /dev/linking.html
+- /dev/projectsetup/dependencies.html
+- /dev/projectsetup/java_api_quickstart.html
+- /getting-started/project-setup/java_api_quickstart.html
+- /dev/getting-started/project-setup/scala_api_quickstart.html
+- /getting-started/project-setup/scala_api_quickstart.html
+- /quickstart/scala_api_quickstart.html
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Project Configuration
+
+Every Flink application depends on a set of Flink libraries. At a minimum, the application depends
+on the Flink APIs and, in addition, on certain connector libraries (e.g. Kafka, Cassandra).
+When running Flink applications (either in a distributed deployment or in the IDE for testing),
+the Flink runtime library must be available.
+
+## Setting up a Flink project: Getting started
+
+Every Flink application needs, at a minimum, the API dependencies to develop against. When setting up
+a project manually, you need to add the following dependencies for the Java/Scala API.
+
+In Maven syntax, it would look like:
+
+{{< tabs "a49d57a4-27ee-4dd3-a2b8-a673b99b011e" >}}
+{{< tab "Java" >}}
+```xml
+<dependency>
+  <groupId>org.apache.flink</groupId>
+  <artifactId>flink-streaming-java</artifactId>
+  <version>{{< version >}}</version>
+  <scope>provided</scope>
+</dependency>
+```
+{{< /tab >}}
+{{< tab "Scala" >}}
+```xml
+<dependency>
+  <groupId>org.apache.flink</groupId>
+  <artifactId>flink-streaming-scala{{< scala_version >}}</artifactId>
+  <version>{{< version >}}</version>
+  <scope>provided</scope>
+</dependency>
+```
+{{< /tab >}}
+{{< /tabs >}}
+
+**Important:** Note that all these dependencies have their scope set to *provided*. This means that
+they are needed to compile against, but that they should not be packaged into the project's resulting
+application JAR file. If not set to *provided*, the best case scenario is that the resulting JAR
+becomes excessively large, because it also contains all Flink core dependencies. The worst case scenario
+is that the Flink core dependencies that are added to the application's JAR file clash with some of
+your own dependency versions (which is normally avoided through inverted classloading).
+
+**Note on IntelliJ:** To make the applications run within IntelliJ IDEA, it is necessary to tick the
+`Include dependencies with "Provided" scope` box in the run configuration. If this option is not available
+(possibly due to using an older IntelliJ IDEA version), then a workaround is to create a test that
+calls the application's `main()` method.
+
+## Which dependencies do you need?

Review comment:
       Perhaps this goes before the explanation on provided? Between the maven snippet and the "important" text block

##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -0,0 +1,97 @@
+---
+title: "Overview"
+weight: 1
+type: docs
+aliases:
+- /dev/project-configuration.html
+- /start/dependencies.html
+- /getting-started/project-setup/dependencies.html
+- /quickstart/java_api_quickstart.html
+- /dev/projectsetup/java_api_quickstart.html
+- /dev/linking_with_flink.html
+- /dev/linking.html
+- /dev/projectsetup/dependencies.html
+- /dev/projectsetup/java_api_quickstart.html
+- /getting-started/project-setup/java_api_quickstart.html
+- /dev/getting-started/project-setup/scala_api_quickstart.html
+- /getting-started/project-setup/scala_api_quickstart.html
+- /quickstart/scala_api_quickstart.html
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Project Configuration
+
+Every Flink application depends on a set of Flink libraries. At a minimum, the application depends
+on the Flink APIs and, in addition, on certain connector libraries (e.g. Kafka, Cassandra).
+When running Flink applications (either in a distributed deployment or in the IDE for testing),
+the Flink runtime library must be available.
+
+## Setting up a Flink project: Getting started
+
+Every Flink application needs, at a minimum, the API dependencies to develop against. When setting up
+a project manually, you need to add the following dependencies for the Java/Scala API.
+
+In Maven syntax, it would look like:
+
+{{< tabs "a49d57a4-27ee-4dd3-a2b8-a673b99b011e" >}}
+{{< tab "Java" >}}
+```xml
+<dependency>
+  <groupId>org.apache.flink</groupId>
+  <artifactId>flink-streaming-java</artifactId>
+  <version>{{< version >}}</version>
+  <scope>provided</scope>
+</dependency>
+```
+{{< /tab >}}
+{{< tab "Scala" >}}
+```xml
+<dependency>
+  <groupId>org.apache.flink</groupId>
+  <artifactId>flink-streaming-scala{{< scala_version >}}</artifactId>
+  <version>{{< version >}}</version>
+  <scope>provided</scope>
+</dependency>
+```
+{{< /tab >}}
+{{< /tabs >}}
+
+**Important:** Note that all these dependencies have their scope set to *provided*. This means that
+they are needed to compile against, but that they should not be packaged into the project's resulting
+application JAR file. If not set to *provided*, the best case scenario is that the resulting JAR
+becomes excessively large, because it also contains all Flink core dependencies. The worst case scenario
+is that the Flink core dependencies that are added to the application's JAR file clash with some of
+your own dependency versions (which is normally avoided through inverted classloading).
+
+**Note on IntelliJ:** To make the applications run within IntelliJ IDEA, it is necessary to tick the
+`Include dependencies with "Provided" scope` box in the run configuration. If this option is not available
+(possibly due to using an older IntelliJ IDEA version), then a workaround is to create a test that
+calls the application's `main()` method.
+
+## Which dependencies do you need?
+
+| APIs you want to use              | Dependency you need to add    |
+|-----------------------------------|-------------------------------|
+| DataStream                        | flink-streaming-java          |  
+| DataStream with Scala             | flink-streaming-scala         |   
+| Table API                         | flink-table-api-java          |   
+| Table API with Scala              | flink-table-api-scala         |
+| Table API + DataStream            | flink-table-api-java-bridge   |
+| Table API + DataStream with Scala | flink-table-api-scala-bridge  |

Review comment:
       Perhaps here add the links to maven/gradle/sbt pages?
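
       As a concrete illustration of the table quoted above (a sketch only; it mirrors the earlier Maven snippet in this file and takes the artifact name straight from the table), a Table API + DataStream program would add something like:

```xml
<dependency>
  <groupId>org.apache.flink</groupId>
  <artifactId>flink-table-api-java-bridge</artifactId>
  <version>{{< version >}}</version>
  <scope>provided</scope>
</dependency>
```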

##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -0,0 +1,97 @@
+---
+title: "Overview"
+weight: 1
+type: docs
+aliases:
+- /dev/project-configuration.html
+- /start/dependencies.html
+- /getting-started/project-setup/dependencies.html
+- /quickstart/java_api_quickstart.html
+- /dev/projectsetup/java_api_quickstart.html
+- /dev/linking_with_flink.html
+- /dev/linking.html
+- /dev/projectsetup/dependencies.html
+- /dev/projectsetup/java_api_quickstart.html
+- /getting-started/project-setup/java_api_quickstart.html
+- /dev/getting-started/project-setup/scala_api_quickstart.html
+- /getting-started/project-setup/scala_api_quickstart.html
+- /quickstart/scala_api_quickstart.html
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Project Configuration
+
+Every Flink application depends on a set of Flink libraries. At a minimum, the application depends
+on the Flink APIs and, in addition, on certain connector libraries (e.g. Kafka, Cassandra).
+When running Flink applications (either in a distributed deployment or in the IDE for testing),
+the Flink runtime library must be available.
+
+## Setting up a Flink project: Getting started
+
+Every Flink application needs, at a minimum, the API dependencies to develop against. When setting up
+a project manually, you need to add the following dependencies for the Java/Scala API.
+
+In Maven syntax, it would look like:
+
+{{< tabs "a49d57a4-27ee-4dd3-a2b8-a673b99b011e" >}}
+{{< tab "Java" >}}
+```xml
+<dependency>
+  <groupId>org.apache.flink</groupId>
+  <artifactId>flink-streaming-java</artifactId>
+  <version>{{< version >}}</version>
+  <scope>provided</scope>
+</dependency>
+```
+{{< /tab >}}
+{{< tab "Scala" >}}
+```xml
+<dependency>
+  <groupId>org.apache.flink</groupId>
+  <artifactId>flink-streaming-scala{{< scala_version >}}</artifactId>
+  <version>{{< version >}}</version>
+  <scope>provided</scope>
+</dependency>
+```
+{{< /tab >}}
+{{< /tabs >}}
+
+**Important:** Note that all these dependencies have their scope set to *provided*. This means that
+they are needed to compile against, but that they should not be packaged into the project's resulting
+application JAR file. If not set to *provided*, the best case scenario is that the resulting JAR
+becomes excessively large, because it also contains all Flink core dependencies. The worst case scenario
+is that the Flink core dependencies that are added to the application's JAR file clash with some of
+your own dependency versions (which is normally avoided through inverted classloading).
+
+**Note on IntelliJ:** To make the applications run within IntelliJ IDEA, it is necessary to tick the
+`Include dependencies with "Provided" scope` box in the run configuration. If this option is not available
+(possibly due to using an older IntelliJ IDEA version), then a workaround is to create a test that
+calls the application's `main()` method.

Review comment:
       Actually, this sounds like it is already the next step and might be build-system dependent (not sure how it works with Gradle, honestly)
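
       On the Gradle side, one common approach (sketched here under the assumption that the quickstart-style `flinkShadowJar` configuration from the proposed `build.gradle` is used) is to add that configuration to the main and test classpaths, so the application also compiles and runs from the IDE or via `gradle run`:

```gradle
// Make flinkShadowJar dependencies visible to IDE runs and 'gradle run',
// while still keeping them out of the plain 'jar' artifact.
sourceSets {
    main.compileClasspath += configurations.flinkShadowJar
    main.runtimeClasspath += configurations.flinkShadowJar

    test.compileClasspath += configurations.flinkShadowJar
    test.runtimeClasspath += configurations.flinkShadowJar
}
```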




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] matriv commented on a change in pull request #18353: [FLINK-25129][docs] project configuration changes in docs

Posted by GitBox <gi...@apache.org>.
matriv commented on a change in pull request #18353:
URL: https://github.com/apache/flink/pull/18353#discussion_r798344325



##########
File path: docs/content/docs/dev/configuration/gradle.md
##########
@@ -0,0 +1,120 @@
+---
+title: "Using Gradle"
+weight: 3
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# How to use Gradle to configure your project
+
+You will likely need a build tool to configure your Flink project. This guide will show you how to
+do so with [Gradle](https://gradle.org), an open-source general-purpose build tool that can be used 
+to automate tasks in the development process.
+
+## Requirements
+
+- Gradle 7.x 
+- Java 8.x
+
+## Importing the project into your IDE
+
+Once the project folder and files have been created, we recommend that you import this project into
+your IDE for developing and testing.
+
+IntelliJ IDEA supports Gradle projects via the `Gradle` plugin.
+
+Eclipse does so via the [Eclipse Buildship](https://projects.eclipse.org/projects/tools.buildship)
+plugin (make sure to specify a Gradle version >= 3.0 in the last step of the import wizard; the `shadow`
+plugin requires it). You may also use [Gradle's IDE integration](https://docs.gradle.org/current/userguide/userguide.html#ide-integration)
+to create project files with Gradle.
+
+**Note**: The default JVM heap size for Java may be too small for Flink and you have to manually increase it.
+In Eclipse, choose `Run Configurations -> Arguments` and write into the `VM Arguments` box: `-Xmx800m`.
+In IntelliJ IDEA, the recommended way to change JVM options is via the `Help | Edit Custom VM Options` menu.
+See [this article](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties) for details.
+
+**Note on IntelliJ:** To make the applications run within IntelliJ IDEA, it is necessary to tick the
+`Include dependencies with "Provided" scope` box in the run configuration. If this option is not available
+(possibly due to using an older IntelliJ IDEA version), then a workaround is to create a test that
+calls the application's `main()` method.
+
+## Building the project
+
+If you want to __build/package your project__, go to your project directory and
+run the '`gradle clean shadowJar`' command.
+You will __find a JAR file__ that contains your application, plus connectors and libraries
+that you may have added as dependencies to the application: `build/libs/<project-name>-<version>-all.jar`.
+
+__Note:__ If you use a different class than *StreamingJob* as the application's main class / entry point,
+we recommend you change the `mainClassName` setting in the `build.gradle` file accordingly. That way, Flink
+can run the application from the JAR file without additionally specifying the main class.
+
+## Adding dependencies to the project
+
+Specify a dependency configuration in the dependencies block of your `build.gradle` file.
+
+For example, you can add the Kafka connector as a dependency like this:
+
+**build.gradle**
+
+```gradle
+...
+dependencies {
+    ...  
+    flinkShadowJar "org.apache.flink:flink-connector-kafka:${flinkVersion}"
+    ...
+}
+...
+```
+
+**Important:** Note that all these (core) dependencies should have their scope set to [*provided*](https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html#dependency-scope). This means that
+they are needed to compile against, but that they should not be packaged into the project's resulting
+application JAR file. If not set to *provided*, the best case scenario is that the resulting JAR
+becomes excessively large, because it also contains all Flink core dependencies. The worst case scenario
+is that the Flink core dependencies that are added to the application's JAR file clash with some of
+your own dependency versions (which is normally avoided through inverted classloading).
+
+To correctly package the dependencies into the application JAR, these application dependencies must
+be set to the *compile* scope.
+
+## Packaging the application
+
+Depending on your use case, you may need to package your Flink application in different ways before 
+it gets deployed to a Flink environment.
+
+If you want to create a JAR for a Flink Job and use only Flink dependencies without any third-party
+dependencies (e.g. using the filesystem connector with JSON format), you do not need to create an
+uber/fat JAR or shade any dependencies.
+
+If you want to create a JAR for a Flink Job and use external dependencies not built into the Flink

Review comment:
       And here the `installShadowDist`, so the next paragraph can be removed, thx!




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs] project configuration changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * d0b2b188c37443b7bbda39af499398326cd56979 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs] project configuration changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 1bc80c1579db9e7f2d57112fdb31f0190f71db67 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277) 
   * 274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312) 
   * 26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316) 
   * 6f44bc92aca6fc133b10cefd438f99810ce65293 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 1bc80c1579db9e7f2d57112fdb31f0190f71db67 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277) 
   * 274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312) 
   * 26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316) 
   * 6f44bc92aca6fc133b10cefd438f99810ce65293 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321) 
   * 1a637d30a2332b6c7059be2118a3cb8d40718eb9 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 1a637d30a2332b6c7059be2118a3cb8d40718eb9 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334) 
   * 2e955fd5db3754a069a1ed8c48ce2e581aaef51b Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495",
       "triggerID" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516",
       "triggerID" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "triggerType" : "PUSH"
     }, {
       "hash" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528",
       "triggerID" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 50e221c6373f2774eeb8ec134df27caef83b2d40 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495) 
   * 8e5ae4098896a176d49d4b7319f69b6529412549 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516) 
   * 532af8b3b761f7b589500e5ead9b888f2370675a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528) 
   * 3de39d65f624fb6cf075e9e5ddfe005c197ee3dc UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495",
       "triggerID" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516",
       "triggerID" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "triggerType" : "PUSH"
     }, {
       "hash" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528",
       "triggerID" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 8e5ae4098896a176d49d4b7319f69b6529412549 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516) 
   * 532af8b3b761f7b589500e5ead9b888f2370675a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495",
       "triggerID" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516",
       "triggerID" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "triggerType" : "PUSH"
     }, {
       "hash" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528",
       "triggerID" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30537",
       "triggerID" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1845471184d68e8edd89fd19a591030290695cf3",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30593",
       "triggerID" : "1845471184d68e8edd89fd19a591030290695cf3",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 1845471184d68e8edd89fd19a591030290695cf3 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30593) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] slinkydeveloper commented on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
slinkydeveloper commented on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1028000521


   There are still some unresolved comments, once they're solved we're ready to merge


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495",
       "triggerID" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516",
       "triggerID" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "triggerType" : "PUSH"
     }, {
       "hash" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 50e221c6373f2774eeb8ec134df27caef83b2d40 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495) 
   * 8e5ae4098896a176d49d4b7319f69b6529412549 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516) 
   * 532af8b3b761f7b589500e5ead9b888f2370675a UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] zentol commented on a change in pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
zentol commented on a change in pull request #18353:
URL: https://github.com/apache/flink/pull/18353#discussion_r799237575



##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -94,14 +92,12 @@ plugins {
     id 'com.github.johnrengelman.shadow' version '7.1.2'
 }
 // artifact properties
-group = 'org.myorg.quickstart'
+group = 'org.quickstart'
 version = '0.1-SNAPSHOT'
-mainClassName = 'org.myorg.quickstart.StreamingJob'
+mainClassName = 'org.quickstart.StreamingJob'
+mainClassName = 'org.quickstart.StreamingJob'
 description = """Flink Quickstart Job"""
 ext {
-    javaVersion = '1.8'
-    flinkVersion = '{{< version >}}'

Review comment:
       these 2 properties were used though ;)
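(The two removed `ext` properties are indeed referenced further down in the same quickstart script that is
quoted later in this thread, e.g.:)

```gradle
// javaVersion drives the compiler level ...
sourceCompatibility = javaVersion
targetCompatibility = javaVersion
// ... and flinkVersion is interpolated into the dependency coordinates
compile "org.apache.flink:flink-streaming-java:${flinkVersion}"
```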




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495",
       "triggerID" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516",
       "triggerID" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "triggerType" : "PUSH"
     }, {
       "hash" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528",
       "triggerID" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30537",
       "triggerID" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1845471184d68e8edd89fd19a591030290695cf3",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30593",
       "triggerID" : "1845471184d68e8edd89fd19a591030290695cf3",
       "triggerType" : "PUSH"
     }, {
       "hash" : "e33ad31f47e113b480de1c4f6ac6efc40992084b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30651",
       "triggerID" : "e33ad31f47e113b480de1c4f6ac6efc40992084b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "78c5075dd300c8e74705afbb13b10377da61865a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30675",
       "triggerID" : "78c5075dd300c8e74705afbb13b10377da61865a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ddabc7bb39a84b79407d5c9b85de9c83d0959de2",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30708",
       "triggerID" : "ddabc7bb39a84b79407d5c9b85de9c83d0959de2",
       "triggerType" : "PUSH"
     }, {
       "hash" : "059b1a0df499474c5f672df131abc826660101a8",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30737",
       "triggerID" : "059b1a0df499474c5f672df131abc826660101a8",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ddabc7bb39a84b79407d5c9b85de9c83d0959de2 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30708) 
   * 059b1a0df499474c5f672df131abc826660101a8 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30737) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] matriv commented on a change in pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
matriv commented on a change in pull request #18353:
URL: https://github.com/apache/flink/pull/18353#discussion_r798349546



##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -0,0 +1,219 @@
+---
+title: "Overview"
+weight: 1
+type: docs
+aliases:
+- /dev/project-configuration.html
+- /start/dependencies.html
+- /getting-started/project-setup/dependencies.html
+- /quickstart/java_api_quickstart.html
+- /dev/projectsetup/java_api_quickstart.html
+- /dev/linking_with_flink.html
+- /dev/linking.html
+- /dev/projectsetup/dependencies.html
+- /dev/projectsetup/java_api_quickstart.html
+- /getting-started/project-setup/java_api_quickstart.html
+- /dev/getting-started/project-setup/scala_api_quickstart.html
+- /getting-started/project-setup/scala_api_quickstart.html
+- /quickstart/scala_api_quickstart.html
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Project Configuration
+
+Every Flink application depends on a set of Flink libraries. At a minimum, the application depends
+on the Flink APIs and, in addition, on certain connector libraries (e.g. Kafka, Cassandra).
+When running Flink applications (either in a distributed deployment or locally for testing),
+the [Flink runtime library](https://ossindex.sonatype.org/component/pkg:maven/org.apache.flink/flink-runtime@1.14.3) 
+must be available.
+
+The guides in this section will show you how to configure your projects via popular build tools
+([Maven]({{< ref "docs/dev/configuration/maven" >}}), [Gradle]({{< ref "docs/dev/configuration/gradle" >}})),
+add the necessary dependencies (e.g. [connectors and formats]({{< ref "docs/dev/configuration/connector" >}}), 
+[testing]({{< ref "docs/dev/configuration/testing" >}})), and cover some 
+[advanced]({{< ref "docs/dev/configuration/advanced" >}}) configuration topics. 
+
+## Getting started
+
+To get started working on your Flink application, use the following commands, scripts, and templates 
+to create a Flink project.  
+
+{{< tabs "creating project" >}}
+{{< tab "Maven" >}}
+
+You can create a project based on an [Archetype](https://maven.apache.org/guides/introduction/introduction-to-archetypes.html)
+with the Maven command below or use the provided quickstart bash script.
+
+### Maven command
+```bash
+$ mvn archetype:generate                \
+  -DarchetypeGroupId=org.apache.flink   \
+  -DarchetypeArtifactId=flink-quickstart-java \
+  -DarchetypeVersion={{< version >}}
+```
+This allows you to name your newly created project and will interactively ask you for the groupId,
+artifactId, and package name.
+
+### Quickstart script
+```bash
+$ curl https://flink.apache.org/q/quickstart.sh | bash -s {{< version >}}
+```
+
+{{< /tab >}}
+{{< tab "Gradle" >}}
+You can create a project with a Gradle build script or use the provided quickstart bash script.
+
+### Gradle build script
+
+To execute these build configuration scripts, run the `gradle` command in the directory with these scripts.
+
+**build.gradle**
+
+```gradle
+plugins {
+    id 'java'
+    id 'application'
+    // shadow plugin to produce fat JARs
+    id 'com.github.johnrengelman.shadow' version '7.1.2'
+}
+// artifact properties
+group = 'org.myorg.quickstart'
+version = '0.1-SNAPSHOT'
+mainClassName = 'org.myorg.quickstart.StreamingJob'
+description = """Flink Quickstart Job"""
+ext {
+    javaVersion = '1.8'
+    flinkVersion = '{{< version >}}'
+    scalaBinaryVersion = '{{< scala_version >}}'
+    slf4jVersion = '1.7.32'
+    log4jVersion = '2.17.1'
+}
+sourceCompatibility = javaVersion
+targetCompatibility = javaVersion
+tasks.withType(JavaCompile) {
+	options.encoding = 'UTF-8'
+}
+applicationDefaultJvmArgs = ["-Dlog4j.configurationFile=log4j2.properties"]
+
+// declare where to find the dependencies of your project
+repositories {
+    mavenCentral()
+    maven { url "https://repository.apache.org/content/repositories/snapshots/" }
+}
+// NOTE: We cannot use "compileOnly" or "shadow" configurations since then we could not run code
+// in the IDE or with "gradle run". We also cannot exclude transitive dependencies from the
+// shadowJar yet (see https://github.com/johnrengelman/shadow/issues/159).
+// -> Explicitly define the libraries we want to be included in the "flinkShadowJar" configuration!
+configurations {
+    flinkShadowJar // dependencies which go into the shadowJar
+    // always exclude these (also from transitive dependencies) since they are provided by Flink
+    flinkShadowJar.exclude group: 'org.apache.flink', module: 'force-shading'
+    flinkShadowJar.exclude group: 'com.google.code.findbugs', module: 'jsr305'
+    flinkShadowJar.exclude group: 'org.slf4j'
+    flinkShadowJar.exclude group: 'org.apache.logging.log4j'
+}
+// declare the dependencies for your production and test code
+dependencies {
+    // --------------------------------------------------------------
+    // Compile-time dependencies that should NOT be part of the
+    // shadow jar and are provided in the lib folder of Flink
+    // --------------------------------------------------------------
+    compile "org.apache.flink:flink-streaming-java:${flinkVersion}"
+    compile "org.apache.flink:flink-clients:${flinkVersion}"
+    // --------------------------------------------------------------
+    // Dependencies that should be part of the shadow jar, e.g.
+    // connectors. These must be in the flinkShadowJar configuration!
+    // --------------------------------------------------------------
+    //flinkShadowJar "org.apache.flink:flink-connector-kafka:${flinkVersion}"
+    compile "org.apache.logging.log4j:log4j-api:${log4jVersion}"

Review comment:
       Here and the next 3 lines it should be `runtimeOnly` instead of `compile`.
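(As a sketch of the suggested change for the visible line; the remaining logging artifacts would follow
the same pattern:)

```gradle
// logging dependencies are only needed on the runtime classpath
runtimeOnly "org.apache.logging.log4j:log4j-api:${log4jVersion}"
```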




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495",
       "triggerID" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516",
       "triggerID" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "triggerType" : "PUSH"
     }, {
       "hash" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528",
       "triggerID" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30537",
       "triggerID" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1845471184d68e8edd89fd19a591030290695cf3",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30593",
       "triggerID" : "1845471184d68e8edd89fd19a591030290695cf3",
       "triggerType" : "PUSH"
     }, {
       "hash" : "e33ad31f47e113b480de1c4f6ac6efc40992084b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30651",
       "triggerID" : "e33ad31f47e113b480de1c4f6ac6efc40992084b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "78c5075dd300c8e74705afbb13b10377da61865a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30675",
       "triggerID" : "78c5075dd300c8e74705afbb13b10377da61865a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ddabc7bb39a84b79407d5c9b85de9c83d0959de2",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30708",
       "triggerID" : "ddabc7bb39a84b79407d5c9b85de9c83d0959de2",
       "triggerType" : "PUSH"
     }, {
       "hash" : "059b1a0df499474c5f672df131abc826660101a8",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "059b1a0df499474c5f672df131abc826660101a8",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ddabc7bb39a84b79407d5c9b85de9c83d0959de2 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30708) 
   * 059b1a0df499474c5f672df131abc826660101a8 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] slinkydeveloper commented on a change in pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
slinkydeveloper commented on a change in pull request #18353:
URL: https://github.com/apache/flink/pull/18353#discussion_r789423489



##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -52,46 +52,36 @@ In Maven syntax, it would look like:
 
 {{< tabs "a49d57a4-27ee-4dd3-a2b8-a673b99b011e" >}}
 {{< tab "Java" >}}
-```xml
-<dependency>
-  <groupId>org.apache.flink</groupId>
-  <artifactId>flink-streaming-java</artifactId>
-  <version>{{< version >}}</version>
-  <scope>provided</scope>
-</dependency>
-```
+
+{{< artifact flink-streaming-java withProvidedScope >}}
+
 {{< /tab >}}
 {{< tab "Scala" >}}
-```xml
-<dependency>
-  <groupId>org.apache.flink</groupId>
-  <artifactId>flink-streaming-scala{{< scala_version >}}</artifactId>
-  <version>{{< version >}}</version>
-  <scope>provided</scope>
-</dependency>
-```
-{{< /tab >}}
-{{< /tabs >}}
 
-**Important:** Note that all these dependencies have their scope set to *provided*. This means that
-they are needed to compile against, but that they should not be packaged into the project's resulting
-application JAR file. If not set to *provided*, the best case scenario is that the resulting JAR
-becomes excessively large, because it also contains all Flink core dependencies. The worst case scenario
-is that the Flink core dependencies that are added to the application's JAR file clash with some of
-your own dependency versions (which is normally avoided through inverted classloading).
+{{< artifact flink-streaming-scala withScalaVersion withProvidedScope >}}
 
-**Note on IntelliJ:** To make the applications run within IntelliJ IDEA, it is necessary to tick the
-`Include dependencies with "Provided" scope` box in the run configuration. If this option is not available
-(possibly due to using an older IntelliJ IDEA version), then a workaround is to create a test that
-calls the application's `main()` method.
+{{< /tab >}}
+{{< /tabs >}}

Review comment:
       Please remove these tabs and replace them with simple tabs for maven/gradle/sbt conf like here: https://github.com/slinkydeveloper/flink/commit/5d49dd7a0c0b0b824ed72942136a1857aaea91b9#diff-0bf4db953b94c9b897e098765f0ecf359afb3954363dd8c29574dbe3548c7d01R50
   
   Telling me the syntax for the maven dependencies is not really useful here.

##########
File path: docs/content/docs/dev/table/sourcesSinks.md
##########
@@ -106,6 +106,41 @@ that the planner can handle.
 
 {{< top >}}
 
+
+Project Configuration
+---------------------
+
+If you want to implement a custom format, the following dependency is usually sufficient and can be 
+used for JAR files for the SQL Client:
+
+```xml
+<dependency>
+  <groupId>org.apache.flink</groupId>
+  <artifactId>flink-table-common</artifactId>
+  <version>{{< version >}}</version>
+  <scope>provided</scope>
+</dependency>
+```
+
+If you want to develop a connector that needs to bridge with DataStream APIs (i.e. if you want to adapt
+a DataStream connector to the Table API), you need to add this dependency:
+
+```xml
+<dependency>
+  <groupId>org.apache.flink</groupId>
+  <artifactId>flink-table-api-java-bridge</artifactId>
+  <version>{{< version >}}</version>
+  <scope>provided</scope>
+</dependency>
+```

Review comment:
       Use the artifact docgen tag
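    For reference, and assuming the tag behaves the same way as in the overview page above, the XML hunk being reviewed would presumably collapse to a single line like:

    {{< artifact flink-table-common withProvidedScope >}}

    (The `withProvidedScope` flag is carried over from the `flink-streaming-java` example; the rendered output should match the `<dependency>` block it replaces.)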

##########
File path: docs/content/docs/dev/configuration/connector.md
##########
@@ -0,0 +1,72 @@
+---
+title: "Dependencies: Connectors and Formats"
+weight: 5
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Dependencies: Connectors and Formats
+
+Flink can read from and write to various external systems via [connectors]({{< ref "docs/connectors/table/overview" >}})
+and define the [format]({{< ref "docs/connectors/table/formats/overview" >}}) in which to store the 
+data (i.e. mapping binary data onto table columns).  
+
+The way that the information is serialized is represented in the external system and that system needs
+to know how to read this data in a format that can be read by Flink.  This is done through format dependencies.
+
+Most applications need specific connectors to run. Flink provides a set of table formats that can be 
+used with table connectors (with the dependencies for both being fairly unified). These are not part 
+of Flink's core dependencies and must be added as dependencies to the application.
+
+## Adding Connector Dependencies 
+
+As an example, you can add the Kafka connector as a dependency like this (Maven syntax):
+
+{{< artifact flink-connector-kafka >}}
+
+We recommend packaging the application code and all its required dependencies into one *JAR-with-dependencies* 
+which we refer to as the *application JAR*. The application JAR can be submitted to an already running 
+Flink cluster, or added to a Flink application container image.
+
+Projects created from the `Java Project Template`, the `Scala Project Template`, or Gradle are configured 
+to automatically include the application dependencies into the application JAR when you run `mvn clean package`. 
+For projects that are not set up from those templates, we recommend adding the Maven Shade Plugin to 
+build the application jar with all required dependencies.
+
+**Important:** For Maven (and other build tools) to correctly package the dependencies into the application jar,
+these application dependencies must be specified in scope *compile* (unlike the core dependencies, which
+must be specified in scope *provided*).
+
+## Packaging Dependencies
+
+In the Maven Repository, you will find connectors named "flink-connector-<NAME>" and
+"flink-sql-connector-<NAME>". The former are thin JARs while the latter are uber JARs.
+
+In order to use the uber JARs, you can shade them in the uber JAR of your application, or you can add
+them to the `/lib` folder of the distribution (i.e. SQL client).
+
+[ EXPLAIN PROS and CONS ]
+
+In order to create an uber JAR to run the job, do this:
+
+[ FILL IN ]
+
+**Note:** You do not need to shade Flink API dependencies. You only need to do this for connectors,
+formats and third-party dependencies.

Review comment:
       For Maven specifically, what you have in `advanced.md > Template for building a JAR with Dependencies` is OK to start with (see https://github.com/slinkydeveloper/flink/commit/41ea33a191bc7e8459c3ca6372cdeb675e9f8552).
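    For readers following along, a minimal sketch of what such a Maven Shade Plugin template typically boils down to (the plugin version and main class below are illustrative placeholders, not the values from `advanced.md`):

    ```xml
    <!-- Minimal Maven Shade Plugin sketch: builds an uber JAR during `mvn package`. -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>3.1.1</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
          <configuration>
            <transformers>
              <!-- Entry point of the uber JAR; the class name is a placeholder. -->
              <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                <mainClass>org.example.StreamingJob</mainClass>
              </transformer>
            </transformers>
          </configuration>
        </execution>
      </executions>
    </plugin>
    ```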

##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -52,46 +52,36 @@ In Maven syntax, it would look like:
 
 {{< tabs "a49d57a4-27ee-4dd3-a2b8-a673b99b011e" >}}
 {{< tab "Java" >}}
-```xml
-<dependency>
-  <groupId>org.apache.flink</groupId>
-  <artifactId>flink-streaming-java</artifactId>
-  <version>{{< version >}}</version>
-  <scope>provided</scope>
-</dependency>
-```
+
+{{< artifact flink-streaming-java withProvidedScope >}}
+
 {{< /tab >}}
 {{< tab "Scala" >}}
-```xml
-<dependency>
-  <groupId>org.apache.flink</groupId>
-  <artifactId>flink-streaming-scala{{< scala_version >}}</artifactId>
-  <version>{{< version >}}</version>
-  <scope>provided</scope>
-</dependency>
-```
-{{< /tab >}}
-{{< /tabs >}}
 
-**Important:** Note that all these dependencies have their scope set to *provided*. This means that
-they are needed to compile against, but that they should not be packaged into the project's resulting
-application JAR file. If not set to *provided*, the best case scenario is that the resulting JAR
-becomes excessively large, because it also contains all Flink core dependencies. The worst case scenario
-is that the Flink core dependencies that are added to the application's JAR file clash with some of
-your own dependency versions (which is normally avoided through inverted classloading).
+{{< artifact flink-streaming-scala withScalaVersion withProvidedScope >}}
 
-**Note on IntelliJ:** To make the applications run within IntelliJ IDEA, it is necessary to tick the
-`Include dependencies with "Provided" scope` box in the run configuration. If this option is not available
-(possibly due to using an older IntelliJ IDEA version), then a workaround is to create a test that
-calls the application's `main()` method.
+{{< /tab >}}
+{{< /tabs >}}
 
 ## Which dependencies do you need?
 
+Different APIs will require different dependencies. 
+
 | APIs you want to use              | Dependency you need to add    |
 |-----------------------------------|-------------------------------|
 | DataStream                        | flink-streaming-java          |  
-| DataStream with Scala             | flink-streaming-scala         |   
+| DataStream with Scala             | flink-streaming-scala{{< scala_version >}}         |   
 | Table API                         | flink-table-api-java          |   
-| Table API with Scala              | flink-table-api-scala         |
+| Table API with Scala              | flink-table-api-scala{{< scala_version >}}         |
 | Table API + DataStream            | flink-table-api-java-bridge   |
-| Table API + DataStream with Scala | flink-table-api-scala-bridge  |
+| Table API + DataStream with Scala | flink-table-api-scala-bridge{{< scala_version >}}  |
+
+You can use [Maven]({{< ref "docs/dev/configuration/maven" >}}), [Gradle]({{< ref "docs/dev/configuration/gradle" >}}), 
+or [sbt]({{< ref "docs/dev/configuration/sbt" >}}) to configure your project and add these dependencies.
+
+**Important:** Note that all these dependencies should have their scope set to *provided*. This means that
+they are needed to compile against, but that they should not be packaged into the project's resulting
+application JAR file. If not set to *provided*, the best case scenario is that the resulting JAR
+becomes excessively large, because it also contains all Flink core dependencies. The worst case scenario
+is that the Flink core dependencies that are added to the application's JAR file clash with some of
+your own dependency versions (which is normally avoided through inverted classloading).

Review comment:
       I would move that to the maven/gradle/sbt specific pages

##########
File path: docs/content/docs/dev/table/sourcesSinks.md
##########
@@ -106,6 +106,41 @@ that the planner can handle.
 
 {{< top >}}
 
+
+Project Configuration
+---------------------
+
+If you want to implement a custom format, the following dependency is usually sufficient and can be 
+used for JAR files for the SQL Client:

Review comment:
       Remove the wording after "is sufficient".

##########
File path: docs/content/docs/dev/table/sourcesSinks.md
##########
@@ -106,6 +106,41 @@ that the planner can handle.
 
 {{< top >}}
 
+
+Project Configuration
+---------------------
+
+If you want to implement a custom format, the following dependency is usually sufficient and can be 
+used for JAR files for the SQL Client:
+
+```xml
+<dependency>
+  <groupId>org.apache.flink</groupId>
+  <artifactId>flink-table-common</artifactId>
+  <version>{{< version >}}</version>
+  <scope>provided</scope>
+</dependency>
+```

Review comment:
       Use the artifact docgen tag

##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -52,46 +52,36 @@ In Maven syntax, it would look like:
 
 {{< tabs "a49d57a4-27ee-4dd3-a2b8-a673b99b011e" >}}
 {{< tab "Java" >}}
-```xml
-<dependency>
-  <groupId>org.apache.flink</groupId>
-  <artifactId>flink-streaming-java</artifactId>
-  <version>{{< version >}}</version>
-  <scope>provided</scope>
-</dependency>
-```
+
+{{< artifact flink-streaming-java withProvidedScope >}}
+
 {{< /tab >}}
 {{< tab "Scala" >}}
-```xml
-<dependency>
-  <groupId>org.apache.flink</groupId>
-  <artifactId>flink-streaming-scala{{< scala_version >}}</artifactId>
-  <version>{{< version >}}</version>
-  <scope>provided</scope>
-</dependency>
-```
-{{< /tab >}}
-{{< /tabs >}}
 
-**Important:** Note that all these dependencies have their scope set to *provided*. This means that
-they are needed to compile against, but that they should not be packaged into the project's resulting
-application JAR file. If not set to *provided*, the best case scenario is that the resulting JAR
-becomes excessively large, because it also contains all Flink core dependencies. The worst case scenario
-is that the Flink core dependencies that are added to the application's JAR file clash with some of
-your own dependency versions (which is normally avoided through inverted classloading).
+{{< artifact flink-streaming-scala withScalaVersion withProvidedScope >}}
 
-**Note on IntelliJ:** To make the applications run within IntelliJ IDEA, it is necessary to tick the
-`Include dependencies with "Provided" scope` box in the run configuration. If this option is not available
-(possibly due to using an older IntelliJ IDEA version), then a workaround is to create a test that
-calls the application's `main()` method.
+{{< /tab >}}
+{{< /tabs >}}
 
 ## Which dependencies do you need?
 
+Different APIs will require different dependencies. 

Review comment:
       Try to be less formal on this page? Something like:
   
    > At this point you have a ready-to-go project configuration. Now, depending on what you're trying to achieve, you're going to choose a combination of our available APIs[...] This is a table of artifact names:

##########
File path: docs/content/docs/dev/table/data_stream_api.md
##########
@@ -412,60 +412,7 @@ also the [dedicated batch mode section below](#batch-runtime-mode) for more insi
 
 ### Dependencies and Imports
 
-Projects that combine Table API with DataStream API need to add one of the following bridging modules.
-They include transitive dependencies to `flink-table-api-java` or `flink-table-api-scala` and the
-corresponding language-specific DataStream API module.
-
-{{< tabs "0d2da52a-ee43-4d06-afde-b165517c0617" >}}
-{{< tab "Java" >}}
-```xml
-<dependency>
-  <groupId>org.apache.flink</groupId>
-  <artifactId>flink-table-api-java-bridge{{< scala_version >}}</artifactId>
-  <version>{{< version >}}</version>
-  <scope>provided</scope>
-</dependency>
-```
-{{< /tab >}}
-{{< tab "Scala" >}}
-```xml
-<dependency>
-  <groupId>org.apache.flink</groupId>
-  <artifactId>flink-table-api-scala-bridge{{< scala_version >}}</artifactId>
-  <version>{{< version >}}</version>
-  <scope>provided</scope>
-</dependency>
-```
-{{< /tab >}}
-{{< /tabs >}}
-
-The following imports are required to declare common pipelines using either the Java or Scala version

Review comment:
       The code below is still valid; keep it.

##########
File path: docs/content/docs/dev/table/sourcesSinks.md
##########
@@ -106,6 +106,41 @@ that the planner can handle.
 
 {{< top >}}
 
+
+Project Configuration
+---------------------
+
+If you want to implement a custom format, the following dependency is usually sufficient and can be 

Review comment:
       "a custom connector or a custom format"

##########
File path: docs/content/docs/dev/configuration/connector.md
##########
@@ -0,0 +1,72 @@
+---
+title: "Dependencies: Connectors and Formats"
+weight: 5
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Dependencies: Connectors and Formats
+
+Flink can read from and write to various external systems via [connectors]({{< ref "docs/connectors/table/overview" >}})
+and define the [format]({{< ref "docs/connectors/table/formats/overview" >}}) in which to store the 
+data (i.e. mapping binary data onto table columns).  
+
+The way that the information is serialized is represented in the external system and that system needs
+to know how to read this data in a format that can be read by Flink.  This is done through format dependencies.
+
+Most applications need specific connectors to run. Flink provides a set of table formats that can be 
+used with table connectors (with the dependencies for both being fairly unified). These are not part 
+of Flink's core dependencies and must be added as dependencies to the application.

Review comment:
       This wording is valid and correct, but it is too specific to the Table API; generalize it so it covers the DataStream API as well.

##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -52,46 +52,36 @@ In Maven syntax, it would look like:
 
 {{< tabs "a49d57a4-27ee-4dd3-a2b8-a673b99b011e" >}}
 {{< tab "Java" >}}
-```xml
-<dependency>
-  <groupId>org.apache.flink</groupId>
-  <artifactId>flink-streaming-java</artifactId>
-  <version>{{< version >}}</version>
-  <scope>provided</scope>
-</dependency>
-```
+
+{{< artifact flink-streaming-java withProvidedScope >}}
+
 {{< /tab >}}
 {{< tab "Scala" >}}
-```xml
-<dependency>
-  <groupId>org.apache.flink</groupId>
-  <artifactId>flink-streaming-scala{{< scala_version >}}</artifactId>
-  <version>{{< version >}}</version>
-  <scope>provided</scope>
-</dependency>
-```
-{{< /tab >}}
-{{< /tabs >}}
 
-**Important:** Note that all these dependencies have their scope set to *provided*. This means that
-they are needed to compile against, but that they should not be packaged into the project's resulting
-application JAR file. If not set to *provided*, the best case scenario is that the resulting JAR
-becomes excessively large, because it also contains all Flink core dependencies. The worst case scenario
-is that the Flink core dependencies that are added to the application's JAR file clash with some of
-your own dependency versions (which is normally avoided through inverted classloading).
+{{< artifact flink-streaming-scala withScalaVersion withProvidedScope >}}
 
-**Note on IntelliJ:** To make the applications run within IntelliJ IDEA, it is necessary to tick the
-`Include dependencies with "Provided" scope` box in the run configuration. If this option is not available
-(possibly due to using an older IntelliJ IDEA version), then a workaround is to create a test that
-calls the application's `main()` method.
+{{< /tab >}}
+{{< /tabs >}}
 
 ## Which dependencies do you need?
 
+Different APIs will require different dependencies. 
+
 | APIs you want to use              | Dependency you need to add    |
 |-----------------------------------|-------------------------------|
 | DataStream                        | flink-streaming-java          |  
-| DataStream with Scala             | flink-streaming-scala         |   
+| DataStream with Scala             | flink-streaming-scala{{< scala_version >}}         |   
 | Table API                         | flink-table-api-java          |   
-| Table API with Scala              | flink-table-api-scala         |
+| Table API with Scala              | flink-table-api-scala{{< scala_version >}}         |
 | Table API + DataStream            | flink-table-api-java-bridge   |
-| Table API + DataStream with Scala | flink-table-api-scala-bridge  |
+| Table API + DataStream with Scala | flink-table-api-scala-bridge{{< scala_version >}}  |
+
+You can use [Maven]({{< ref "docs/dev/configuration/maven" >}}), [Gradle]({{< ref "docs/dev/configuration/gradle" >}}), 
+or [sbt]({{< ref "docs/dev/configuration/sbt" >}}) to configure your project and add these dependencies.
+
+**Important:** Note that all these dependencies should have their scope set to *provided*. This means that
+they are needed to compile against, but that they should not be packaged into the project's resulting
+application JAR file. If not set to *provided*, the best case scenario is that the resulting JAR
+becomes excessively large, because it also contains all Flink core dependencies. The worst case scenario
+is that the Flink core dependencies that are added to the application's JAR file clash with some of
+your own dependency versions (which is normally avoided through inverted classloading).

Review comment:
       A "next steps" section linking to the other pages is missing here; see the sketch below.
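    For illustration, such a "Next steps" section could be as small as a few links to the pages that already exist in this PR (the wording is just a suggestion):

    - [Maven]({{< ref "docs/dev/configuration/maven" >}}), [Gradle]({{< ref "docs/dev/configuration/gradle" >}}), or [sbt]({{< ref "docs/dev/configuration/sbt" >}}) for build-tool specifics
    - [Connectors and formats]({{< ref "docs/dev/configuration/connector" >}}) for the dependencies that talk to external systems
    - [Testing]({{< ref "docs/dev/configuration/testing" >}}) and [Advanced]({{< ref "docs/dev/configuration/advanced" >}}) for the remaining configuration topics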

##########
File path: docs/content/docs/dev/configuration/connector.md
##########
@@ -0,0 +1,72 @@
+---
+title: "Dependencies: Connectors and Formats"
+weight: 5
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Dependencies: Connectors and Formats
+
+Flink can read from and write to various external systems via [connectors]({{< ref "docs/connectors/table/overview" >}})
+and define the [format]({{< ref "docs/connectors/table/formats/overview" >}}) in which to store the 
+data (i.e. mapping binary data onto table columns).  
+
+The way that the information is serialized is represented in the external system and that system needs
+to know how to read this data in a format that can be read by Flink.  This is done through format dependencies.
+
+Most applications need specific connectors to run. Flink provides a set of table formats that can be 
+used with table connectors (with the dependencies for both being fairly unified). These are not part 
+of Flink's core dependencies and must be added as dependencies to the application.
+
+## Adding Connector Dependencies 
+
+As an example, you can add the Kafka connector as a dependency like this (Maven syntax):
+
+{{< artifact flink-connector-kafka >}}
+
+We recommend packaging the application code and all its required dependencies into one *JAR-with-dependencies* 
+which we refer to as the *application JAR*. The application JAR can be submitted to an already running 
+Flink cluster, or added to a Flink application container image.
+
+Projects created from the `Java Project Template`, the `Scala Project Template`, or Gradle are configured 
+to automatically include the application dependencies into the application JAR when you run `mvn clean package`. 
+For projects that are not set up from those templates, we recommend adding the Maven Shade Plugin to 
+build the application jar with all required dependencies.
+
+**Important:** For Maven (and other build tools) to correctly package the dependencies into the application jar,
+these application dependencies must be specified in scope *compile* (unlike the core dependencies, which
+must be specified in scope *provided*).

Review comment:
       All of this is very build-tool specific. Please split this wording up, move it into the respective Maven/Gradle pages, and then link to those pages from here.
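    To make the scope distinction from the hunk above concrete, a hedged Maven sketch (the Kafka connector is just an example; each build-tool page should phrase this in its own terms):

    ```xml
    <!-- Core API dependency: compiled against, but provided by the Flink runtime at execution time. -->
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-streaming-java</artifactId>
      <version>{{< version >}}</version>
      <scope>provided</scope>
    </dependency>

    <!-- Application dependency (e.g. a connector): default compile scope, so it is packaged into the application JAR. -->
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-connector-kafka</artifactId>
      <version>{{< version >}}</version>
    </dependency>
    ```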

##########
File path: docs/content/docs/dev/configuration/connector.md
##########
@@ -0,0 +1,72 @@
+---
+title: "Dependencies: Connectors and Formats"
+weight: 5
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Dependencies: Connectors and Formats
+
+Flink can read from and write to various external systems via [connectors]({{< ref "docs/connectors/table/overview" >}})
+and define the [format]({{< ref "docs/connectors/table/formats/overview" >}}) in which to store the 
+data (i.e. mapping binary data onto table columns).  
+
+The way that the information is serialized is represented in the external system and that system needs
+to know how to read this data in a format that can be read by Flink.  This is done through format dependencies.
+
+Most applications need specific connectors to run. Flink provides a set of table formats that can be 
+used with table connectors (with the dependencies for both being fairly unified). These are not part 
+of Flink's core dependencies and must be added as dependencies to the application.
+
+## Adding Connector Dependencies 
+
+As an example, you can add the Kafka connector as a dependency like this (Maven syntax):
+
+{{< artifact flink-connector-kafka >}}
+
+We recommend packaging the application code and all its required dependencies into one *JAR-with-dependencies* 
+which we refer to as the *application JAR*. The application JAR can be submitted to an already running 
+Flink cluster, or added to a Flink application container image.
+
+Projects created from the `Java Project Template`, the `Scala Project Template`, or Gradle are configured 
+to automatically include the application dependencies into the application JAR when you run `mvn clean package`. 
+For projects that are not set up from those templates, we recommend adding the Maven Shade Plugin to 
+build the application jar with all required dependencies.
+
+**Important:** For Maven (and other build tools) to correctly package the dependencies into the application jar,
+these application dependencies must be specified in scope *compile* (unlike the core dependencies, which
+must be specified in scope *provided*).
+
+## Packaging Dependencies
+
+In the Maven Repository, you will find connectors named "flink-connector-<NAME>" and
+"flink-sql-connector-<NAME>". The former are thin JARs while the latter are uber JARs.
+
+In order to use the uber JARs, you can shade them in the uber JAR of your application, or you can add
+them to the `/lib` folder of the distribution (i.e. SQL client).
+
+[ EXPLAIN PROS and CONS ]
+
+In order to create an uber JAR to run the job, do this:
+
+[ FILL IN ]
+
+**Note:** You do not need to shade Flink API dependencies. You only need to do this for connectors,
+formats and third-party dependencies.

Review comment:
       This should not be here; it should instead link to the build-tool specifics. Every build tool handles shading differently, so we need a shading section for both Maven and Gradle (and for sbt too, although I don't know how shading works there, so leave that one empty for now), and then link to those sections from here.
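    As a starting point for the Maven side, a sketch of the kind of exclusion list such a shading section could carry, mirroring what the Gradle `flinkShadowJar` configuration in this PR already excludes (note that the Shade Plugin skips *provided*-scope dependencies by default, so the explicit excludes are mostly about making the intent visible):

    ```xml
    <!-- Inside the maven-shade-plugin <configuration>: keep artifacts that Flink already provides out of the uber JAR. -->
    <artifactSet>
      <excludes>
        <exclude>org.apache.flink:force-shading</exclude>
        <exclude>com.google.code.findbugs:jsr305</exclude>
        <exclude>org.slf4j:*</exclude>
        <exclude>org.apache.logging.log4j:*</exclude>
      </excludes>
    </artifactSet>
    ```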

##########
File path: docs/content/docs/dev/table/sqlClient.md
##########
@@ -417,14 +417,17 @@ When execute queries or insert statements, please enter the interactive mode or
 
 ### Dependencies
 
-The SQL Client does not require to setup a Java project using Maven or SBT. Instead, you can pass the
+The SQL Client does not require setting up a Java project using Maven or sbt. Instead, you can pass the

Review comment:
       "Maven, Gradle or SBT"

##########
File path: docs/content/docs/dev/configuration/connector.md
##########
@@ -0,0 +1,72 @@
+---
+title: "Dependencies: Connectors and Formats"
+weight: 5
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Dependencies: Connectors and Formats
+
+Flink can read from and write to various external systems via [connectors]({{< ref "docs/connectors/table/overview" >}})
+and define the [format]({{< ref "docs/connectors/table/formats/overview" >}}) in which to store the 
+data (i.e. mapping binary data onto table columns).  
+
+The way that the information is serialized is represented in the external system and that system needs
+to know how to read this data in a format that can be read by Flink.  This is done through format dependencies.
+
+Most applications need specific connectors to run. Flink provides a set of table formats that can be 
+used with table connectors (with the dependencies for both being fairly unified). These are not part 
+of Flink's core dependencies and must be added as dependencies to the application.
+
+## Adding Connector Dependencies 
+
+As an example, you can add the Kafka connector as a dependency like this (Maven syntax):
+
+{{< artifact flink-connector-kafka >}}
+
+We recommend packaging the application code and all its required dependencies into one *JAR-with-dependencies* 
+which we refer to as the *application JAR*. The application JAR can be submitted to an already running 
+Flink cluster, or added to a Flink application container image.
+
+Projects created from the `Java Project Template`, the `Scala Project Template`, or Gradle are configured 
+to automatically include the application dependencies into the application JAR when you run `mvn clean package`. 
+For projects that are not set up from those templates, we recommend adding the Maven Shade Plugin to 
+build the application jar with all required dependencies.
+
+**Important:** For Maven (and other build tools) to correctly package the dependencies into the application jar,
+these application dependencies must be specified in scope *compile* (unlike the core dependencies, which
+must be specified in scope *provided*).
+
+## Packaging Dependencies
+
+In the Maven Repository, you will find connectors named "flink-connector-<NAME>" and
+"flink-sql-connector-<NAME>". The former are thin JARs while the latter are uber JARs.
+
+In order to use the uber JARs, you can shade them in the uber JAR of your application, or you can add
+them to the `/lib` folder of the distribution (i.e. SQL client).
+
+[ EXPLAIN PROS and CONS ]

Review comment:
       Let me start by suggesting some wording:
   
    > When loading uber JARs directly into the distribution, you lock in a specific version of the connector, which simplifies the management of a shared multi-job Flink cluster; the downside is that individual job developers have no control over the connector version. Shading the connector JAR, either the thin or the fat one, into your job JAR instead gives each job developer more control over the connector version. When shading the thin JAR, the job developer even controls the transitive dependencies, since they can be bumped without bumping the connector (binary compatibility permitting).
   
    @fapaul @twalthr wanna add some details here on why one should use thin vs uber JARs?
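    To make the thin-vs-uber choice concrete in Maven terms, a sketch with Kafka standing in for any connector (only one of the two would normally be declared; the artifact names follow the `flink-connector-<NAME>` / `flink-sql-connector-<NAME>` convention mentioned in the hunk):

    ```xml
    <!-- Thin JAR: your own shading pulls in (and can override) the connector's transitive dependencies. -->
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-connector-kafka</artifactId>
      <version>{{< version >}}</version>
    </dependency>

    <!-- Uber JAR: a prebuilt fat artifact that already bundles the connector's dependencies;
         alternatively, drop this JAR into the distribution's /lib folder instead of shading it. -->
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-sql-connector-kafka</artifactId>
      <version>{{< version >}}</version>
    </dependency>
    ```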

##########
File path: docs/content/docs/dev/configuration/connector.md
##########
@@ -0,0 +1,72 @@
+---
+title: "Dependencies: Connectors and Formats"
+weight: 5
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Dependencies: Connectors and Formats
+
+Flink can read from and write to various external systems via [connectors]({{< ref "docs/connectors/table/overview" >}})
+and define the [format]({{< ref "docs/connectors/table/formats/overview" >}}) in which to store the 
+data (i.e. mapping binary data onto table columns).  
+
+The way that the information is serialized is represented in the external system and that system needs
+to know how to read this data in a format that can be read by Flink.  This is done through format dependencies.
+
+Most applications need specific connectors to run. Flink provides a set of table formats that can be 
+used with table connectors (with the dependencies for both being fairly unified). These are not part 
+of Flink's core dependencies and must be added as dependencies to the application.
+
+## Adding Connector Dependencies 
+
+As an example, you can add the Kafka connector as a dependency like this (Maven syntax):
+
+{{< artifact flink-connector-kafka >}}
+
+We recommend packaging the application code and all its required dependencies into one *JAR-with-dependencies* 
+which we refer to as the *application JAR*. The application JAR can be submitted to an already running 
+Flink cluster, or added to a Flink application container image.
+
+Projects created from the `Java Project Template`, the `Scala Project Template`, or Gradle are configured 
+to automatically include the application dependencies into the application JAR when you run `mvn clean package`. 
+For projects that are not set up from those templates, we recommend adding the Maven Shade Plugin to 
+build the application jar with all required dependencies.
+
+**Important:** For Maven (and other build tools) to correctly package the dependencies into the application jar,
+these application dependencies must be specified in scope *compile* (unlike the core dependencies, which
+must be specified in scope *provided*).
+
+## Packaging Dependencies
+
+In the Maven Repository, you will find connectors named "flink-connector-<NAME>" and

Review comment:
       > In the Maven Repository, you will find connectors
   
   To
   
   > On Maven Central we publish connectors







[GitHub] [flink] flinkbot commented on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot commented on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012186701


   Thanks a lot for your contribution to the Apache Flink project. I'm the @flinkbot. I help the community
   to review your pull request. We will use this comment to track the progress of the review.
   
   
   ## Automated Checks
   Last check on commit 988fcbec1b60588c4c0cb2249a94f258145f7040 (Thu Jan 13 14:27:46 UTC 2022)
   
   **Warnings:**
    * Documentation files were touched, but no `docs/content.zh/` files: Update Chinese documentation or file Jira ticket.
   
   
   <sub>Mention the bot in a comment to re-run the automated checks.</sub>
   ## Review Progress
   
   * ❓ 1. The [description] looks good.
   * ❓ 2. There is [consensus] that the contribution should go into to Flink.
   * ❓ 3. Needs [attention] from.
   * ❓ 4. The change fits into the overall [architecture].
   * ❓ 5. Overall code [quality] is good.
   
   Please see the [Pull Request Review Guide](https://flink.apache.org/contributing/reviewing-prs.html) for a full explanation of the review process.<details>
 The Bot is tracking the review progress through labels. Labels are applied according to the order of the review items. For consensus, approval by a Flink committer or PMC member is required <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot approve description` to approve one or more aspects (aspects: `description`, `consensus`, `architecture` and `quality`)
    - `@flinkbot approve all` to approve all aspects
    - `@flinkbot approve-until architecture` to approve everything until `architecture`
    - `@flinkbot attention @username1 [@username2 ..]` to require somebody's attention
    - `@flinkbot disapprove architecture` to remove an approval you gave earlier
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 1bc80c1579db9e7f2d57112fdb31f0190f71db67 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277) 
   * 274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 80fd50ad46865be06e2c83b2470fb3eb2d35cd96 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 1bc80c1579db9e7f2d57112fdb31f0190f71db67 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277) 
   * 274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312) 
   * 26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 80fd50ad46865be06e2c83b2470fb3eb2d35cd96 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823) 
   * d0b2b188c37443b7bbda39af499398326cd56979 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] slinkydeveloper commented on a change in pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
slinkydeveloper commented on a change in pull request #18353:
URL: https://github.com/apache/flink/pull/18353#discussion_r795426010



##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -0,0 +1,253 @@
+---
+title: "Overview"
+weight: 1
+type: docs
+aliases:
+- /dev/project-configuration.html
+- /start/dependencies.html
+- /getting-started/project-setup/dependencies.html
+- /quickstart/java_api_quickstart.html
+- /dev/projectsetup/java_api_quickstart.html
+- /dev/linking_with_flink.html
+- /dev/linking.html
+- /dev/projectsetup/dependencies.html
+- /dev/projectsetup/java_api_quickstart.html
+- /getting-started/project-setup/java_api_quickstart.html
+- /dev/getting-started/project-setup/scala_api_quickstart.html
+- /getting-started/project-setup/scala_api_quickstart.html
+- /quickstart/scala_api_quickstart.html
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Project Configuration
+
+Every Flink application depends on a set of Flink libraries. At a minimum, the application depends
+on the Flink APIs and, in addition, on certain connector libraries (i.e. Kafka, Cassandra).
+When running Flink applications (either in a distributed deployment or locally for testing),
+the [Flink runtime library](https://ossindex.sonatype.org/component/pkg:maven/org.apache.flink/flink-runtime@1.14.3) 
+must be available.
+
+The guides in this section will show you how to configure your projects via popular build tools
+([Maven]({{< ref "docs/dev/configuration/maven" >}}), [Gradle]({{< ref "docs/dev/configuration/gradle" >}}),
+[sbt]({{< ref "docs/dev/configuration/sbt" >}})), add the necessary dependencies 
+(i.e. [connectors and formats]({{< ref "docs/dev/configuration/connector" >}}), 
+[testing]({{< ref "docs/dev/configuration/testing" >}})), and cover some 
+[advanced]({{< ref "docs/dev/configuration/advanced" >}}) configuration topics. 
+
+## Getting started
+
+To get started working on your Flink application, use the following commands, scripts, and templates 
+to create a Flink project.  
+
+{{< tabs "creating project" >}}
+{{< tab "Maven" >}}
+
+You can create a project based on an [Archetype](https://maven.apache.org/guides/introduction/introduction-to-archetypes.html)
+with the Maven command below or use the provided quickstart bash script.
+
+### Maven command
+```bash
+$ mvn archetype:generate                \
+  -DarchetypeGroupId=org.apache.flink   \
+  -DarchetypeArtifactId=flink-quickstart-java \
+  -DarchetypeVersion={{< version >}}
+```
+This allows you to name your newly created project and will interactively ask you for the groupId,
+artifactId, and package name.
+
+### Quickstart script
+```bash
+$ curl https://flink.apache.org/q/quickstart.sh | bash -s {{< version >}}
+```
+
+{{< hint info >}}
+For Maven 3.0 or higher, it is no longer possible to specify the repository (-DarchetypeCatalog) via
+the command line. For details about this change, please refer to the <a href="http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html">official Maven document</a> If you wish to use a snapshot repository, you need to add a
+repository entry to your `settings.xml` file. For example:
+
+```xml
+<settings>
+  <activeProfiles>
+    <activeProfile>apache</activeProfile>
+  </activeProfiles>
+  <profiles>
+    <profile>
+      <id>apache</id>
+      <repositories>
+        <repository>
+          <id>apache-snapshots</id>
+          <url>https://repository.apache.org/content/repositories/snapshots/</url>
+        </repository>
+      </repositories>
+    </profile>
+  </profiles>
+</settings>
+```
+{{< /hint >}}

Review comment:
       I don't think this hint is important at this point; perhaps move it to the Maven page?

##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -0,0 +1,253 @@
+---
+title: "Overview"
+weight: 1
+type: docs
+aliases:
+- /dev/project-configuration.html
+- /start/dependencies.html
+- /getting-started/project-setup/dependencies.html
+- /quickstart/java_api_quickstart.html
+- /dev/projectsetup/java_api_quickstart.html
+- /dev/linking_with_flink.html
+- /dev/linking.html
+- /dev/projectsetup/dependencies.html
+- /dev/projectsetup/java_api_quickstart.html
+- /getting-started/project-setup/java_api_quickstart.html
+- /dev/getting-started/project-setup/scala_api_quickstart.html
+- /getting-started/project-setup/scala_api_quickstart.html
+- /quickstart/scala_api_quickstart.html
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Project Configuration
+
+Every Flink application depends on a set of Flink libraries. At a minimum, the application depends
+on the Flink APIs and, in addition, on certain connector libraries (i.e. Kafka, Cassandra).
+When running Flink applications (either in a distributed deployment or locally for testing),
+the [Flink runtime library](https://ossindex.sonatype.org/component/pkg:maven/org.apache.flink/flink-runtime@1.14.3) 
+must be available.
+
+The guides in this section will show you how to configure your projects via popular build tools
+([Maven]({{< ref "docs/dev/configuration/maven" >}}), [Gradle]({{< ref "docs/dev/configuration/gradle" >}}),
+[sbt]({{< ref "docs/dev/configuration/sbt" >}})), add the necessary dependencies 
+(i.e. [connectors and formats]({{< ref "docs/dev/configuration/connector" >}}), 
+[testing]({{< ref "docs/dev/configuration/testing" >}})), and cover some 
+[advanced]({{< ref "docs/dev/configuration/advanced" >}}) configuration topics. 
+
+## Getting started
+
+To get started working on your Flink application, use the following commands, scripts, and templates 
+to create a Flink project.  
+
+{{< tabs "creating project" >}}
+{{< tab "Maven" >}}
+
+You can create a project based on an [Archetype](https://maven.apache.org/guides/introduction/introduction-to-archetypes.html)
+with the Maven command below or use the provided quickstart bash script.
+
+### Maven command
+```bash
+$ mvn archetype:generate                \
+  -DarchetypeGroupId=org.apache.flink   \
+  -DarchetypeArtifactId=flink-quickstart-java \
+  -DarchetypeVersion={{< version >}}
+```
+This allows you to name your newly created project and will interactively ask you for the groupId,
+artifactId, and package name.
+
+### Quickstart script
+```bash
+$ curl https://flink.apache.org/q/quickstart.sh | bash -s {{< version >}}
+```
+
+{{< hint info >}}
+For Maven 3.0 or higher, it is no longer possible to specify the repository (-DarchetypeCatalog) via
+the command line. For details about this change, please refer to the <a href="http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html">official Maven document</a> If you wish to use a snapshot repository, you need to add a
+repository entry to your `settings.xml` file. For example:
+
+```xml
+<settings>
+  <activeProfiles>
+    <activeProfile>apache</activeProfile>
+  </activeProfiles>
+  <profiles>
+    <profile>
+      <id>apache</id>
+      <repositories>
+        <repository>
+          <id>apache-snapshots</id>
+          <url>https://repository.apache.org/content/repositories/snapshots/</url>
+        </repository>
+      </repositories>
+    </profile>
+  </profiles>
+</settings>
+```
+{{< /hint >}}
+
+{{< /tab >}}
+{{< tab "Gradle" >}}
+You can create a project with a Gradle build script or use the provided quickstart bash script.
+
+### Gradle build script
+
+To execute these build configuration scripts, run the `gradle` command in the directory with these scripts.
+
+**build.gradle**
+
+```gradle
+buildscript {
+    repositories {
+        jcenter() // this applies only to the Gradle 'Shadow' plugin
+    }
+    dependencies {
+        classpath 'com.github.jengelman.gradle.plugins:shadow:2.0.4'
+    }
+}
+plugins {
+    id 'java'
+    id 'application'
+    // shadow plugin to produce fat JARs
+    id 'com.github.johnrengelman.shadow' version '2.0.4'
+}
+// artifact properties
+group = 'org.myorg.quickstart'
+version = '0.1-SNAPSHOT'
+mainClassName = 'org.myorg.quickstart.StreamingJob'
+description = """Flink Quickstart Job"""
+ext {
+    javaVersion = '1.8'
+    flinkVersion = '{{< version >}}'
+    scalaBinaryVersion = '{{< scala_version >}}'
+    slf4jVersion = '1.7.32'
+    log4jVersion = '2.17.1'
+}
+sourceCompatibility = javaVersion
+targetCompatibility = javaVersion
+tasks.withType(JavaCompile) {
+	options.encoding = 'UTF-8'
+}
+applicationDefaultJvmArgs = ["-Dlog4j.configurationFile=log4j2.properties"]
+task wrapper(type: Wrapper) {
+    gradleVersion = '3.1'
+}
+// declare where to find the dependencies of your project
+repositories {
+    mavenCentral()
+    maven { url "https://repository.apache.org/content/repositories/snapshots/" }
+}
+// NOTE: We cannot use "compileOnly" or "shadow" configurations since then we could not run code
+// in the IDE or with "gradle run". We also cannot exclude transitive dependencies from the
+// shadowJar yet (see https://github.com/johnrengelman/shadow/issues/159).
+// -> Explicitly define the // libraries we want to be included in the "flinkShadowJar" configuration!
+configurations {
+    flinkShadowJar // dependencies which go into the shadowJar
+    // always exclude these (also from transitive dependencies) since they are provided by Flink
+    flinkShadowJar.exclude group: 'org.apache.flink', module: 'force-shading'
+    flinkShadowJar.exclude group: 'com.google.code.findbugs', module: 'jsr305'
+    flinkShadowJar.exclude group: 'org.slf4j'
+    flinkShadowJar.exclude group: 'org.apache.logging.log4j'
+}
+// declare the dependencies for your production and test code
+dependencies {
+    // --------------------------------------------------------------
+    // Compile-time dependencies that should NOT be part of the
+    // shadow jar and are provided in the lib folder of Flink
+    // --------------------------------------------------------------
+    compile "org.apache.flink:flink-streaming-java:${flinkVersion}"
+    compile "org.apache.flink:flink-clients:${flinkVersion}"
+    // --------------------------------------------------------------
+    // Dependencies that should be part of the shadow jar, e.g.
+    // connectors. These must be in the flinkShadowJar configuration!
+    // --------------------------------------------------------------
+    //flinkShadowJar "org.apache.flink:flink-connector-kafka:${flinkVersion}"
+    compile "org.apache.logging.log4j:log4j-api:${log4jVersion}"
+    compile "org.apache.logging.log4j:log4j-core:${log4jVersion}"
+    compile "org.apache.logging.log4j:log4j-slf4j-impl:${log4jVersion}"
+    compile "org.slf4j:slf4j-log4j12:${slf4jVersion}"
+    // Add test dependencies here.
+    // testCompile "junit:junit:4.12"
+}
+// make compileOnly dependencies available for tests:
+sourceSets {
+    main.compileClasspath += configurations.flinkShadowJar
+    main.runtimeClasspath += configurations.flinkShadowJar
+    test.compileClasspath += configurations.flinkShadowJar
+    test.runtimeClasspath += configurations.flinkShadowJar
+    javadoc.classpath += configurations.flinkShadowJar
+}
+run.classpath = sourceSets.main.runtimeClasspath
+jar {
+    manifest {
+        attributes 'Built-By': System.getProperty('user.name'),
+                'Build-Jdk': System.getProperty('java.version')
+    }
+}
+shadowJar {
+    configurations = [project.configurations.flinkShadowJar]
+}
+```
+
+**settings.gradle**
+```gradle
+rootProject.name = 'quickstart'
+```
+### Quickstart script
+```bash

Review comment:
       newline

##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -0,0 +1,253 @@
+---
+title: "Overview"
+weight: 1
+type: docs
+aliases:
+- /dev/project-configuration.html
+- /start/dependencies.html
+- /getting-started/project-setup/dependencies.html
+- /quickstart/java_api_quickstart.html
+- /dev/projectsetup/java_api_quickstart.html
+- /dev/linking_with_flink.html
+- /dev/linking.html
+- /dev/projectsetup/dependencies.html
+- /dev/projectsetup/java_api_quickstart.html
+- /getting-started/project-setup/java_api_quickstart.html
+- /dev/getting-started/project-setup/scala_api_quickstart.html
+- /getting-started/project-setup/scala_api_quickstart.html
+- /quickstart/scala_api_quickstart.html
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Project Configuration
+
+Every Flink application depends on a set of Flink libraries. At a minimum, the application depends
+on the Flink APIs and, in addition, on certain connector libraries (e.g. Kafka, Cassandra).
+When running Flink applications (either in a distributed deployment or locally for testing),
+the [Flink runtime library](https://ossindex.sonatype.org/component/pkg:maven/org.apache.flink/flink-runtime@1.14.3) 
+must be available.
+
+The guides in this section will show you how to configure your projects via popular build tools
+([Maven]({{< ref "docs/dev/configuration/maven" >}}), [Gradle]({{< ref "docs/dev/configuration/gradle" >}}),
+[sbt]({{< ref "docs/dev/configuration/sbt" >}})), add the necessary dependencies 
+(e.g. [connectors and formats]({{< ref "docs/dev/configuration/connector" >}}), 
+[testing]({{< ref "docs/dev/configuration/testing" >}})), and cover some 
+[advanced]({{< ref "docs/dev/configuration/advanced" >}}) configuration topics. 
+
+## Getting started
+
+To get started working on your Flink application, use the following commands, scripts, and templates 
+to create a Flink project.  
+
+{{< tabs "creating project" >}}
+{{< tab "Maven" >}}
+
+You can create a project based on an [Archetype](https://maven.apache.org/guides/introduction/introduction-to-archetypes.html)
+with the Maven command below or use the provided quickstart bash script.
+
+### Maven command
+```bash
+$ mvn archetype:generate                \
+  -DarchetypeGroupId=org.apache.flink   \
+  -DarchetypeArtifactId=flink-quickstart-java \
+  -DarchetypeVersion={{< version >}}
+```
+This allows you to name your newly created project and will interactively ask you for the groupId,
+artifactId, and package name.
+
+### Quickstart script
+```bash
+$ curl https://flink.apache.org/q/quickstart.sh | bash -s {{< version >}}
+```
+
+{{< hint info >}}
+For Maven 3.0 or higher, it is no longer possible to specify the repository (`-DarchetypeCatalog`) via
+the command line. For details about this change, please refer to the <a href="http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html">official Maven documentation</a>.
+If you wish to use a snapshot repository, you need to add a repository entry to your `settings.xml` file. For example:
+
+```xml
+<settings>
+  <activeProfiles>
+    <activeProfile>apache</activeProfile>
+  </activeProfiles>
+  <profiles>
+    <profile>
+      <id>apache</id>
+      <repositories>
+        <repository>
+          <id>apache-snapshots</id>
+          <url>https://repository.apache.org/content/repositories/snapshots/</url>
+        </repository>
+      </repositories>
+    </profile>
+  </profiles>
+</settings>
+```
+{{< /hint >}}
+
+{{< /tab >}}
+{{< tab "Gradle" >}}
+You can create a project with a Gradle build script or use the provided quickstart bash script.
+
+### Gradle build script
+
+To execute these build configuration scripts, run the `gradle` command in the directory with these scripts.
+
+**build.gradle**
+
+```gradle
+buildscript {
+    repositories {
+        jcenter() // this applies only to the Gradle 'Shadow' plugin
+    }
+    dependencies {
+        classpath 'com.github.jengelman.gradle.plugins:shadow:2.0.4'
+    }
+}
+plugins {
+    id 'java'
+    id 'application'
+    // shadow plugin to produce fat JARs
+    id 'com.github.johnrengelman.shadow' version '2.0.4'
+}
+// artifact properties
+group = 'org.myorg.quickstart'
+version = '0.1-SNAPSHOT'
+mainClassName = 'org.myorg.quickstart.StreamingJob'
+description = """Flink Quickstart Job"""
+ext {
+    javaVersion = '1.8'
+    flinkVersion = '{{< version >}}'
+    scalaBinaryVersion = '{{< scala_version >}}'
+    slf4jVersion = '1.7.32'
+    log4jVersion = '2.17.1'
+}
+sourceCompatibility = javaVersion
+targetCompatibility = javaVersion
+tasks.withType(JavaCompile) {
+	options.encoding = 'UTF-8'
+}
+applicationDefaultJvmArgs = ["-Dlog4j.configurationFile=log4j2.properties"]
+task wrapper(type: Wrapper) {
+    gradleVersion = '3.1'
+}
+// declare where to find the dependencies of your project
+repositories {
+    mavenCentral()
+    maven { url "https://repository.apache.org/content/repositories/snapshots/" }
+}
+// NOTE: We cannot use "compileOnly" or "shadow" configurations since then we could not run code
+// in the IDE or with "gradle run". We also cannot exclude transitive dependencies from the
+// shadowJar yet (see https://github.com/johnrengelman/shadow/issues/159).
+// -> Explicitly define the libraries we want to be included in the "flinkShadowJar" configuration!
+configurations {
+    flinkShadowJar // dependencies which go into the shadowJar
+    // always exclude these (also from transitive dependencies) since they are provided by Flink
+    flinkShadowJar.exclude group: 'org.apache.flink', module: 'force-shading'
+    flinkShadowJar.exclude group: 'com.google.code.findbugs', module: 'jsr305'
+    flinkShadowJar.exclude group: 'org.slf4j'
+    flinkShadowJar.exclude group: 'org.apache.logging.log4j'
+}
+// declare the dependencies for your production and test code
+dependencies {
+    // --------------------------------------------------------------
+    // Compile-time dependencies that should NOT be part of the
+    // shadow jar and are provided in the lib folder of Flink
+    // --------------------------------------------------------------
+    compile "org.apache.flink:flink-streaming-java:${flinkVersion}"
+    compile "org.apache.flink:flink-clients:${flinkVersion}"
+    // --------------------------------------------------------------
+    // Dependencies that should be part of the shadow jar, e.g.
+    // connectors. These must be in the flinkShadowJar configuration!
+    // --------------------------------------------------------------
+    //flinkShadowJar "org.apache.flink:flink-connector-kafka:${flinkVersion}"
+    compile "org.apache.logging.log4j:log4j-api:${log4jVersion}"
+    compile "org.apache.logging.log4j:log4j-core:${log4jVersion}"
+    compile "org.apache.logging.log4j:log4j-slf4j-impl:${log4jVersion}"
+    compile "org.slf4j:slf4j-log4j12:${slf4jVersion}"
+    // Add test dependencies here.
+    // testCompile "junit:junit:4.12"
+}
+// make compileOnly dependencies available for tests:
+sourceSets {
+    main.compileClasspath += configurations.flinkShadowJar
+    main.runtimeClasspath += configurations.flinkShadowJar
+    test.compileClasspath += configurations.flinkShadowJar
+    test.runtimeClasspath += configurations.flinkShadowJar
+    javadoc.classpath += configurations.flinkShadowJar
+}
+run.classpath = sourceSets.main.runtimeClasspath
+jar {
+    manifest {
+        attributes 'Built-By': System.getProperty('user.name'),
+                'Build-Jdk': System.getProperty('java.version')
+    }
+}
+shadowJar {
+    configurations = [project.configurations.flinkShadowJar]
+}
+```
+
+**settings.gradle**
+```gradle
+rootProject.name = 'quickstart'
+```
+### Quickstart script
+```bash
+bash -c "$(curl https://flink.apache.org/q/gradle-quickstart.sh)" -- {{< version >}} {{< scala_version >}}
+```
+{{< /tab >}}
+{{< tab "sbt" >}}
+You can scaffold a new Flink project with the following [giter8 template](https://github.com/tillrohrmann/flink-project.g8)
+and the `sbt new` command (which creates new build definitions from a template) or use the provided quickstart bash script.
+
+### sbt template
+```bash
+$ sbt new tillrohrmann/flink-project.g8
+```
+
+### Quickstart script
+```bash
+$ bash <(curl https://flink.apache.org/q/sbt-quickstart.sh)
+```
+{{< /tab >}}
+{{< /tabs >}}
+
+## Which dependencies do you need?
+
+Depending on what you want to achieve, you are going to choose a combination of our available APIs, 
+which will require different dependencies. 
+
+Here is a table of artifact/dependency names:
+
+| APIs you want to use              | Dependency you need to add    |
+|-----------------------------------|-------------------------------|
+| DataStream                        | flink-streaming-java          |  
+| DataStream with Scala             | flink-streaming-scala{{< scala_version >}}         |   

Review comment:
       Link the text to the datastream api overview page?

##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -0,0 +1,253 @@
+---
+title: "Overview"
+weight: 1
+type: docs
+aliases:
+- /dev/project-configuration.html
+- /start/dependencies.html
+- /getting-started/project-setup/dependencies.html
+- /quickstart/java_api_quickstart.html
+- /dev/projectsetup/java_api_quickstart.html
+- /dev/linking_with_flink.html
+- /dev/linking.html
+- /dev/projectsetup/dependencies.html
+- /dev/projectsetup/java_api_quickstart.html
+- /getting-started/project-setup/java_api_quickstart.html
+- /dev/getting-started/project-setup/scala_api_quickstart.html
+- /getting-started/project-setup/scala_api_quickstart.html
+- /quickstart/scala_api_quickstart.html
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Project Configuration
+
+Every Flink application depends on a set of Flink libraries. At a minimum, the application depends
+on the Flink APIs and, in addition, on certain connector libraries (e.g. Kafka, Cassandra).
+When running Flink applications (either in a distributed deployment or locally for testing),
+the [Flink runtime library](https://ossindex.sonatype.org/component/pkg:maven/org.apache.flink/flink-runtime@1.14.3) 
+must be available.
+
+The guides in this section will show you how to configure your projects via popular build tools
+([Maven]({{< ref "docs/dev/configuration/maven" >}}), [Gradle]({{< ref "docs/dev/configuration/gradle" >}}),
+[sbt]({{< ref "docs/dev/configuration/sbt" >}})), add the necessary dependencies 
+(e.g. [connectors and formats]({{< ref "docs/dev/configuration/connector" >}}), 
+[testing]({{< ref "docs/dev/configuration/testing" >}})), and cover some 
+[advanced]({{< ref "docs/dev/configuration/advanced" >}}) configuration topics. 
+
+## Getting started
+
+To get started working on your Flink application, use the following commands, scripts, and templates 
+to create a Flink project.  
+
+{{< tabs "creating project" >}}
+{{< tab "Maven" >}}
+
+You can create a project based on an [Archetype](https://maven.apache.org/guides/introduction/introduction-to-archetypes.html)
+with the Maven command below or use the provided quickstart bash script.
+
+### Maven command
+```bash
+$ mvn archetype:generate                \
+  -DarchetypeGroupId=org.apache.flink   \
+  -DarchetypeArtifactId=flink-quickstart-java \
+  -DarchetypeVersion={{< version >}}
+```
+This allows you to name your newly created project and will interactively ask you for the groupId,
+artifactId, and package name.
+
+### Quickstart script
+```bash
+$ curl https://flink.apache.org/q/quickstart.sh | bash -s {{< version >}}
+```
+
+{{< hint info >}}
+For Maven 3.0 or higher, it is no longer possible to specify the repository (`-DarchetypeCatalog`) via
+the command line. For details about this change, please refer to the <a href="http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html">official Maven documentation</a>.
+If you wish to use a snapshot repository, you need to add a repository entry to your `settings.xml` file. For example:
+
+```xml
+<settings>
+  <activeProfiles>
+    <activeProfile>apache</activeProfile>
+  </activeProfiles>
+  <profiles>
+    <profile>
+      <id>apache</id>
+      <repositories>
+        <repository>
+          <id>apache-snapshots</id>
+          <url>https://repository.apache.org/content/repositories/snapshots/</url>
+        </repository>
+      </repositories>
+    </profile>
+  </profiles>
+</settings>
+```
+{{< /hint >}}
+
+{{< /tab >}}
+{{< tab "Gradle" >}}
+You can create a project with a Gradle build script or use the provided quickstart bash script.
+
+### Gradle build script
+
+To execute these build configuration scripts, run the `gradle` command in the directory with these scripts.
+
+**build.gradle**
+
+```gradle
+buildscript {
+    repositories {
+        jcenter() // this applies only to the Gradle 'Shadow' plugin
+    }
+    dependencies {
+        classpath 'com.github.jengelman.gradle.plugins:shadow:2.0.4'
+    }
+}
+plugins {
+    id 'java'
+    id 'application'
+    // shadow plugin to produce fat JARs
+    id 'com.github.johnrengelman.shadow' version '2.0.4'
+}
+// artifact properties
+group = 'org.myorg.quickstart'
+version = '0.1-SNAPSHOT'
+mainClassName = 'org.myorg.quickstart.StreamingJob'
+description = """Flink Quickstart Job"""
+ext {
+    javaVersion = '1.8'
+    flinkVersion = '{{< version >}}'
+    scalaBinaryVersion = '{{< scala_version >}}'
+    slf4jVersion = '1.7.32'
+    log4jVersion = '2.17.1'
+}
+sourceCompatibility = javaVersion
+targetCompatibility = javaVersion
+tasks.withType(JavaCompile) {
+	options.encoding = 'UTF-8'
+}
+applicationDefaultJvmArgs = ["-Dlog4j.configurationFile=log4j2.properties"]
+task wrapper(type: Wrapper) {
+    gradleVersion = '3.1'
+}
+// declare where to find the dependencies of your project
+repositories {
+    mavenCentral()
+    maven { url "https://repository.apache.org/content/repositories/snapshots/" }
+}
+// NOTE: We cannot use "compileOnly" or "shadow" configurations since then we could not run code
+// in the IDE or with "gradle run". We also cannot exclude transitive dependencies from the
+// shadowJar yet (see https://github.com/johnrengelman/shadow/issues/159).
+// -> Explicitly define the libraries we want to be included in the "flinkShadowJar" configuration!
+configurations {
+    flinkShadowJar // dependencies which go into the shadowJar
+    // always exclude these (also from transitive dependencies) since they are provided by Flink
+    flinkShadowJar.exclude group: 'org.apache.flink', module: 'force-shading'
+    flinkShadowJar.exclude group: 'com.google.code.findbugs', module: 'jsr305'
+    flinkShadowJar.exclude group: 'org.slf4j'
+    flinkShadowJar.exclude group: 'org.apache.logging.log4j'
+}
+// declare the dependencies for your production and test code
+dependencies {
+    // --------------------------------------------------------------
+    // Compile-time dependencies that should NOT be part of the
+    // shadow jar and are provided in the lib folder of Flink
+    // --------------------------------------------------------------
+    compile "org.apache.flink:flink-streaming-java:${flinkVersion}"
+    compile "org.apache.flink:flink-clients:${flinkVersion}"
+    // --------------------------------------------------------------
+    // Dependencies that should be part of the shadow jar, e.g.
+    // connectors. These must be in the flinkShadowJar configuration!
+    // --------------------------------------------------------------
+    //flinkShadowJar "org.apache.flink:flink-connector-kafka:${flinkVersion}"
+    compile "org.apache.logging.log4j:log4j-api:${log4jVersion}"
+    compile "org.apache.logging.log4j:log4j-core:${log4jVersion}"
+    compile "org.apache.logging.log4j:log4j-slf4j-impl:${log4jVersion}"
+    compile "org.slf4j:slf4j-log4j12:${slf4jVersion}"
+    // Add test dependencies here.
+    // testCompile "junit:junit:4.12"
+}
+// make compileOnly dependencies available for tests:
+sourceSets {
+    main.compileClasspath += configurations.flinkShadowJar
+    main.runtimeClasspath += configurations.flinkShadowJar
+    test.compileClasspath += configurations.flinkShadowJar
+    test.runtimeClasspath += configurations.flinkShadowJar
+    javadoc.classpath += configurations.flinkShadowJar
+}
+run.classpath = sourceSets.main.runtimeClasspath
+jar {
+    manifest {
+        attributes 'Built-By': System.getProperty('user.name'),
+                'Build-Jdk': System.getProperty('java.version')
+    }
+}
+shadowJar {
+    configurations = [project.configurations.flinkShadowJar]
+}
+```
+
+**settings.gradle**
+```gradle
+rootProject.name = 'quickstart'
+```
+### Quickstart script
+```bash
+bash -c "$(curl https://flink.apache.org/q/gradle-quickstart.sh)" -- {{< version >}} {{< scala_version >}}
+```
+{{< /tab >}}
+{{< tab "sbt" >}}
+You can scaffold a new Flink project with the following [giter8 template](https://github.com/tillrohrmann/flink-project.g8)
+and the `sbt new` command (which creates new build definitions from a template) or use the provided quickstart bash script.
+
+### sbt template
+```bash

Review comment:
       newline

##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -0,0 +1,253 @@
+---
+title: "Overview"
+weight: 1
+type: docs
+aliases:
+- /dev/project-configuration.html
+- /start/dependencies.html
+- /getting-started/project-setup/dependencies.html
+- /quickstart/java_api_quickstart.html
+- /dev/projectsetup/java_api_quickstart.html
+- /dev/linking_with_flink.html
+- /dev/linking.html
+- /dev/projectsetup/dependencies.html
+- /dev/projectsetup/java_api_quickstart.html
+- /getting-started/project-setup/java_api_quickstart.html
+- /dev/getting-started/project-setup/scala_api_quickstart.html
+- /getting-started/project-setup/scala_api_quickstart.html
+- /quickstart/scala_api_quickstart.html
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Project Configuration
+
+Every Flink application depends on a set of Flink libraries. At a minimum, the application depends
+on the Flink APIs and, in addition, on certain connector libraries (e.g. Kafka, Cassandra).
+When running Flink applications (either in a distributed deployment or locally for testing),
+the [Flink runtime library](https://ossindex.sonatype.org/component/pkg:maven/org.apache.flink/flink-runtime@1.14.3) 
+must be available.
+
+The guides in this section will show you how to configure your projects via popular build tools
+([Maven]({{< ref "docs/dev/configuration/maven" >}}), [Gradle]({{< ref "docs/dev/configuration/gradle" >}}),
+[sbt]({{< ref "docs/dev/configuration/sbt" >}})), add the necessary dependencies 
+(e.g. [connectors and formats]({{< ref "docs/dev/configuration/connector" >}}), 
+[testing]({{< ref "docs/dev/configuration/testing" >}})), and cover some 
+[advanced]({{< ref "docs/dev/configuration/advanced" >}}) configuration topics. 
+
+## Getting started
+
+To get started working on your Flink application, use the following commands, scripts, and templates 
+to create a Flink project.  
+
+{{< tabs "creating project" >}}
+{{< tab "Maven" >}}
+
+You can create a project based on an [Archetype](https://maven.apache.org/guides/introduction/introduction-to-archetypes.html)
+with the Maven command below or use the provided quickstart bash script.
+
+### Maven command
+```bash
+$ mvn archetype:generate                \
+  -DarchetypeGroupId=org.apache.flink   \
+  -DarchetypeArtifactId=flink-quickstart-java \
+  -DarchetypeVersion={{< version >}}
+```
+This allows you to name your newly created project and will interactively ask you for the groupId,
+artifactId, and package name.
+
+### Quickstart script
+```bash
+$ curl https://flink.apache.org/q/quickstart.sh | bash -s {{< version >}}
+```
+
+{{< hint info >}}
+For Maven 3.0 or higher, it is no longer possible to specify the repository (`-DarchetypeCatalog`) via
+the command line. For details about this change, please refer to the <a href="http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html">official Maven documentation</a>.
+If you wish to use a snapshot repository, you need to add a repository entry to your `settings.xml` file. For example:
+
+```xml
+<settings>
+  <activeProfiles>
+    <activeProfile>apache</activeProfile>
+  </activeProfiles>
+  <profiles>
+    <profile>
+      <id>apache</id>
+      <repositories>
+        <repository>
+          <id>apache-snapshots</id>
+          <url>https://repository.apache.org/content/repositories/snapshots/</url>
+        </repository>
+      </repositories>
+    </profile>
+  </profiles>
+</settings>
+```
+{{< /hint >}}
+
+{{< /tab >}}
+{{< tab "Gradle" >}}
+You can create a project with a Gradle build script or use the provided quickstart bash script.
+
+### Gradle build script
+
+To execute these build configuration scripts, run the `gradle` command in the directory with these scripts.
+
+**build.gradle**
+
+```gradle
+buildscript {
+    repositories {
+        jcenter() // this applies only to the Gradle 'Shadow' plugin
+    }
+    dependencies {
+        classpath 'com.github.jengelman.gradle.plugins:shadow:2.0.4'
+    }
+}
+plugins {
+    id 'java'
+    id 'application'
+    // shadow plugin to produce fat JARs
+    id 'com.github.johnrengelman.shadow' version '2.0.4'
+}
+// artifact properties
+group = 'org.myorg.quickstart'
+version = '0.1-SNAPSHOT'
+mainClassName = 'org.myorg.quickstart.StreamingJob'
+description = """Flink Quickstart Job"""
+ext {
+    javaVersion = '1.8'
+    flinkVersion = '{{< version >}}'
+    scalaBinaryVersion = '{{< scala_version >}}'
+    slf4jVersion = '1.7.32'
+    log4jVersion = '2.17.1'
+}
+sourceCompatibility = javaVersion
+targetCompatibility = javaVersion
+tasks.withType(JavaCompile) {
+	options.encoding = 'UTF-8'
+}
+applicationDefaultJvmArgs = ["-Dlog4j.configurationFile=log4j2.properties"]
+task wrapper(type: Wrapper) {
+    gradleVersion = '3.1'
+}
+// declare where to find the dependencies of your project
+repositories {
+    mavenCentral()
+    maven { url "https://repository.apache.org/content/repositories/snapshots/" }
+}
+// NOTE: We cannot use "compileOnly" or "shadow" configurations since then we could not run code
+// in the IDE or with "gradle run". We also cannot exclude transitive dependencies from the
+// shadowJar yet (see https://github.com/johnrengelman/shadow/issues/159).
+// -> Explicitly define the libraries we want to be included in the "flinkShadowJar" configuration!
+configurations {
+    flinkShadowJar // dependencies which go into the shadowJar
+    // always exclude these (also from transitive dependencies) since they are provided by Flink
+    flinkShadowJar.exclude group: 'org.apache.flink', module: 'force-shading'
+    flinkShadowJar.exclude group: 'com.google.code.findbugs', module: 'jsr305'
+    flinkShadowJar.exclude group: 'org.slf4j'
+    flinkShadowJar.exclude group: 'org.apache.logging.log4j'
+}
+// declare the dependencies for your production and test code
+dependencies {
+    // --------------------------------------------------------------
+    // Compile-time dependencies that should NOT be part of the
+    // shadow jar and are provided in the lib folder of Flink
+    // --------------------------------------------------------------
+    compile "org.apache.flink:flink-streaming-java:${flinkVersion}"
+    compile "org.apache.flink:flink-clients:${flinkVersion}"
+    // --------------------------------------------------------------
+    // Dependencies that should be part of the shadow jar, e.g.
+    // connectors. These must be in the flinkShadowJar configuration!
+    // --------------------------------------------------------------
+    //flinkShadowJar "org.apache.flink:flink-connector-kafka:${flinkVersion}"
+    compile "org.apache.logging.log4j:log4j-api:${log4jVersion}"
+    compile "org.apache.logging.log4j:log4j-core:${log4jVersion}"
+    compile "org.apache.logging.log4j:log4j-slf4j-impl:${log4jVersion}"
+    compile "org.slf4j:slf4j-log4j12:${slf4jVersion}"
+    // Add test dependencies here.
+    // testCompile "junit:junit:4.12"
+}
+// make compileOnly dependencies available for tests:
+sourceSets {
+    main.compileClasspath += configurations.flinkShadowJar
+    main.runtimeClasspath += configurations.flinkShadowJar
+    test.compileClasspath += configurations.flinkShadowJar
+    test.runtimeClasspath += configurations.flinkShadowJar
+    javadoc.classpath += configurations.flinkShadowJar
+}
+run.classpath = sourceSets.main.runtimeClasspath
+jar {
+    manifest {
+        attributes 'Built-By': System.getProperty('user.name'),
+                'Build-Jdk': System.getProperty('java.version')
+    }
+}
+shadowJar {
+    configurations = [project.configurations.flinkShadowJar]
+}
+```
+
+**settings.gradle**
+```gradle
+rootProject.name = 'quickstart'
+```
+### Quickstart script
+```bash
+bash -c "$(curl https://flink.apache.org/q/gradle-quickstart.sh)" -- {{< version >}} {{< scala_version >}}
+```
+{{< /tab >}}
+{{< tab "sbt" >}}
+You can scaffold a new Flink project with the following [giter8 template](https://github.com/tillrohrmann/flink-project.g8)
+and the `sbt new` command (which creates new build definitions from a template) or use the provided quickstart bash script.
+
+### sbt template
+```bash
+$ sbt new tillrohrmann/flink-project.g8
+```
+
+### Quickstart script
+```bash
+$ bash <(curl https://flink.apache.org/q/sbt-quickstart.sh)
+```
+{{< /tab >}}
+{{< /tabs >}}
+
+## Which dependencies do you need?
+
+Depending on what you want to achieve, you are going to choose a combination of our available APIs, 
+which will require different dependencies. 
+
+Here is a table of artifact/dependency names:
+
+| APIs you want to use              | Dependency you need to add    |
+|-----------------------------------|-------------------------------|
+| DataStream                        | flink-streaming-java          |  
+| DataStream with Scala             | flink-streaming-scala{{< scala_version >}}         |   
+| Table API                         | flink-table-api-java          |   
+| Table API with Scala              | flink-table-api-scala{{< scala_version >}}         |

Review comment:
       Link the text to the table api overview page?

##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -0,0 +1,253 @@
+---
+title: "Overview"
+weight: 1
+type: docs
+aliases:
+- /dev/project-configuration.html
+- /start/dependencies.html
+- /getting-started/project-setup/dependencies.html
+- /quickstart/java_api_quickstart.html
+- /dev/projectsetup/java_api_quickstart.html
+- /dev/linking_with_flink.html
+- /dev/linking.html
+- /dev/projectsetup/dependencies.html
+- /dev/projectsetup/java_api_quickstart.html
+- /getting-started/project-setup/java_api_quickstart.html
+- /dev/getting-started/project-setup/scala_api_quickstart.html
+- /getting-started/project-setup/scala_api_quickstart.html
+- /quickstart/scala_api_quickstart.html
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Project Configuration
+
+Every Flink application depends on a set of Flink libraries. At a minimum, the application depends
+on the Flink APIs and, in addition, on certain connector libraries (e.g. Kafka, Cassandra).
+When running Flink applications (either in a distributed deployment or locally for testing),
+the [Flink runtime library](https://ossindex.sonatype.org/component/pkg:maven/org.apache.flink/flink-runtime@1.14.3) 
+must be available.
+
+The guides in this section will show you how to configure your projects via popular build tools
+([Maven]({{< ref "docs/dev/configuration/maven" >}}), [Gradle]({{< ref "docs/dev/configuration/gradle" >}}),
+[sbt]({{< ref "docs/dev/configuration/sbt" >}})), add the necessary dependencies 
+(e.g. [connectors and formats]({{< ref "docs/dev/configuration/connector" >}}), 
+[testing]({{< ref "docs/dev/configuration/testing" >}})), and cover some 
+[advanced]({{< ref "docs/dev/configuration/advanced" >}}) configuration topics. 
+
+## Getting started
+
+To get started working on your Flink application, use the following commands, scripts, and templates 
+to create a Flink project.  
+
+{{< tabs "creating project" >}}
+{{< tab "Maven" >}}
+
+You can create a project based on an [Archetype](https://maven.apache.org/guides/introduction/introduction-to-archetypes.html)
+with the Maven command below or use the provided quickstart bash script.
+
+### Maven command
+```bash
+$ mvn archetype:generate                \
+  -DarchetypeGroupId=org.apache.flink   \
+  -DarchetypeArtifactId=flink-quickstart-java \
+  -DarchetypeVersion={{< version >}}
+```
+This allows you to name your newly created project and will interactively ask you for the groupId,
+artifactId, and package name.
+
+### Quickstart script
+```bash
+$ curl https://flink.apache.org/q/quickstart.sh | bash -s {{< version >}}
+```
+
+{{< hint info >}}
+For Maven 3.0 or higher, it is no longer possible to specify the repository (`-DarchetypeCatalog`) via
+the command line. For details about this change, please refer to the <a href="http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html">official Maven documentation</a>.
+If you wish to use a snapshot repository, you need to add a repository entry to your `settings.xml` file. For example:
+
+```xml
+<settings>
+  <activeProfiles>
+    <activeProfile>apache</activeProfile>
+  </activeProfiles>
+  <profiles>
+    <profile>
+      <id>apache</id>
+      <repositories>
+        <repository>
+          <id>apache-snapshots</id>
+          <url>https://repository.apache.org/content/repositories/snapshots/</url>
+        </repository>
+      </repositories>
+    </profile>
+  </profiles>
+</settings>
+```
+{{< /hint >}}
+
+{{< /tab >}}
+{{< tab "Gradle" >}}
+You can create a project with a Gradle build script or use the provided quickstart bash script.
+
+### Gradle build script
+
+To execute these build configuration scripts, run the `gradle` command in the directory with these scripts.
+
+**build.gradle**
+
+```gradle
+buildscript {
+    repositories {
+        jcenter() // this applies only to the Gradle 'Shadow' plugin
+    }
+    dependencies {
+        classpath 'com.github.jengelman.gradle.plugins:shadow:2.0.4'
+    }
+}
+plugins {
+    id 'java'
+    id 'application'
+    // shadow plugin to produce fat JARs
+    id 'com.github.johnrengelman.shadow' version '2.0.4'
+}
+// artifact properties
+group = 'org.myorg.quickstart'
+version = '0.1-SNAPSHOT'
+mainClassName = 'org.myorg.quickstart.StreamingJob'
+description = """Flink Quickstart Job"""
+ext {
+    javaVersion = '1.8'
+    flinkVersion = '{{< version >}}'
+    scalaBinaryVersion = '{{< scala_version >}}'
+    slf4jVersion = '1.7.32'
+    log4jVersion = '2.17.1'
+}
+sourceCompatibility = javaVersion
+targetCompatibility = javaVersion
+tasks.withType(JavaCompile) {
+	options.encoding = 'UTF-8'
+}
+applicationDefaultJvmArgs = ["-Dlog4j.configurationFile=log4j2.properties"]
+task wrapper(type: Wrapper) {
+    gradleVersion = '3.1'
+}
+// declare where to find the dependencies of your project
+repositories {
+    mavenCentral()
+    maven { url "https://repository.apache.org/content/repositories/snapshots/" }
+}
+// NOTE: We cannot use "compileOnly" or "shadow" configurations since then we could not run code
+// in the IDE or with "gradle run". We also cannot exclude transitive dependencies from the
+// shadowJar yet (see https://github.com/johnrengelman/shadow/issues/159).
+// -> Explicitly define the libraries we want to be included in the "flinkShadowJar" configuration!
+configurations {
+    flinkShadowJar // dependencies which go into the shadowJar
+    // always exclude these (also from transitive dependencies) since they are provided by Flink
+    flinkShadowJar.exclude group: 'org.apache.flink', module: 'force-shading'
+    flinkShadowJar.exclude group: 'com.google.code.findbugs', module: 'jsr305'
+    flinkShadowJar.exclude group: 'org.slf4j'
+    flinkShadowJar.exclude group: 'org.apache.logging.log4j'
+}
+// declare the dependencies for your production and test code
+dependencies {
+    // --------------------------------------------------------------
+    // Compile-time dependencies that should NOT be part of the
+    // shadow jar and are provided in the lib folder of Flink
+    // --------------------------------------------------------------
+    compile "org.apache.flink:flink-streaming-java:${flinkVersion}"
+    compile "org.apache.flink:flink-clients:${flinkVersion}"
+    // --------------------------------------------------------------
+    // Dependencies that should be part of the shadow jar, e.g.
+    // connectors. These must be in the flinkShadowJar configuration!
+    // --------------------------------------------------------------
+    //flinkShadowJar "org.apache.flink:flink-connector-kafka:${flinkVersion}"
+    compile "org.apache.logging.log4j:log4j-api:${log4jVersion}"
+    compile "org.apache.logging.log4j:log4j-core:${log4jVersion}"
+    compile "org.apache.logging.log4j:log4j-slf4j-impl:${log4jVersion}"
+    compile "org.slf4j:slf4j-log4j12:${slf4jVersion}"
+    // Add test dependencies here.
+    // testCompile "junit:junit:4.12"
+}
+// make compileOnly dependencies available for tests:
+sourceSets {
+    main.compileClasspath += configurations.flinkShadowJar
+    main.runtimeClasspath += configurations.flinkShadowJar
+    test.compileClasspath += configurations.flinkShadowJar
+    test.runtimeClasspath += configurations.flinkShadowJar
+    javadoc.classpath += configurations.flinkShadowJar
+}
+run.classpath = sourceSets.main.runtimeClasspath
+jar {
+    manifest {
+        attributes 'Built-By': System.getProperty('user.name'),
+                'Build-Jdk': System.getProperty('java.version')
+    }
+}
+shadowJar {
+    configurations = [project.configurations.flinkShadowJar]
+}
+```
+
+**settings.gradle**
+```gradle
+rootProject.name = 'quickstart'
+```
+### Quickstart script
+```bash
+bash -c "$(curl https://flink.apache.org/q/gradle-quickstart.sh)" -- {{< version >}} {{< scala_version >}}
+```
+{{< /tab >}}
+{{< tab "sbt" >}}
+You can scaffold a new Flink project with the following [giter8 template](https://github.com/tillrohrmann/flink-project.g8)
+and the `sbt new` command (which creates new build definitions from a template) or use the provided quickstart bash script.
+
+### sbt template
+```bash
+$ sbt new tillrohrmann/flink-project.g8
+```
+
+### Quickstart script
+```bash

Review comment:
       newline

##########
File path: docs/content/docs/dev/configuration/maven.md
##########
@@ -0,0 +1,164 @@
+---
+title: "Using Maven"
+weight: 2
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# How to use Maven to configure your project
+
+You will likely need a build tool to configure your Flink project. This guide will show you how to 
+do so with [Maven](https://maven.apache.org), an open-source build automation tool developed by the 
+Apache Software Foundation that enables you to build, publish, and deploy projects. You can use it to manage the 
+entire lifecycle of your software project.
+
+## Requirements
+
+- Maven 3.0.4 (or higher)
+- Java 8.x
+
+## Importing the project into your IDE
+
+Once the project folder and files have been created, we recommend that you import this project into
+your IDE for developing and testing.
+
+IntelliJ IDEA supports Maven projects out-of-the-box. Eclipse offers the [m2e plugin](http://www.eclipse.org/m2e/) 
+to [import Maven projects](http://books.sonatype.com/m2eclipse-book/reference/creating-sect-importing-projects.html#fig-creating-import).
+
+*Note*: The default JVM heap size for Java may be too small for Flink and you may have to increase it manually.
+In Eclipse, choose `Run Configurations -> Arguments` and enter `-Xmx800m` into the `VM Arguments` box.
+In IntelliJ IDEA, the recommended way to change JVM options is from the `Help | Edit Custom VM Options` menu.
+See [this article](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties) for details.
+
+## Building the project
+
+If you want to build/package your project, navigate to your project directory and run the
+`mvn clean package` command. You will find a JAR file that contains your application (plus connectors
+and libraries that you may have added as dependencies to the application) here: `target/<artifact-id>-<version>.jar`.
+
+__Note:__ If you used a different class than `DataStreamJob` as the application's main class / entry point,
+we recommend you change the `mainClass` setting in the `pom.xml` file accordingly so that Flink
+can run the application from the JAR file without additionally specifying the main class.
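+
+In a quickstart-style `pom.xml`, the main class is typically set through the Maven Shade Plugin's
+`ManifestResourceTransformer`; a minimal sketch (the class name is just an example) looks roughly like this:
+
+```xml
+<transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
+    <!-- replace with the fully qualified name of your own main class -->
+    <mainClass>org.myorg.quickstart.DataStreamJob</mainClass>
+</transformer>
+```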
+
+## Adding dependencies to the project
+
+Open the `pom.xml` file in your project directory and add the dependency inside
+the `<dependencies>` section.
+
+For example, you can add the Kafka connector as a dependency like this:
+
+```xml
+<dependencies>
+    
+    <dependency>
+        <groupId>org.apache.flink</groupId>
+        <artifactId>flink-connector-kafka</artifactId>
+        <version>{{< version >}}</version>
+    </dependency>
+    
+</dependencies>
+```
+
+Then execute `mvn install` on the command line. 
+
+Projects created from the `Java Project Template`, the `Scala Project Template`, or Gradle are configured
+to automatically include the application dependencies into the application JAR when you run `mvn clean package`.
+For projects that are not set up from those templates, we recommend adding the Maven Shade Plugin to
+build the application jar with all required dependencies.
+
+**Important:** Note that all these (core) dependencies should have their scope set to [*provided*](https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html#dependency-scope). This means that
+they are needed to compile against, but that they should not be packaged into the project's resulting
+application JAR file. If not set to *provided*, the best case scenario is that the resulting JAR
+becomes excessively large, because it also contains all Flink core dependencies. The worst case scenario
+is that the Flink core dependencies that are added to the application's JAR file clash with some of
+your own dependency versions (which is normally avoided through inverted classloading).
+
+To correctly package the dependencies into the application JAR, these application dependencies must 
+be set to the *compile* scope.
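+
+For example, a `dependencies` block that follows this rule could look roughly like the sketch below
+(the Kafka connector is only an illustrative application dependency):
+
+```xml
+<dependencies>
+    <!-- Flink core/API dependency: provided by the Flink distribution, so keep it out of the JAR -->
+    <dependency>
+        <groupId>org.apache.flink</groupId>
+        <artifactId>flink-streaming-java</artifactId>
+        <version>{{< version >}}</version>
+        <scope>provided</scope>
+    </dependency>
+    <!-- Application dependency (e.g. a connector): must be packaged into the application JAR -->
+    <dependency>
+        <groupId>org.apache.flink</groupId>
+        <artifactId>flink-connector-kafka</artifactId>
+        <version>{{< version >}}</version>
+        <scope>compile</scope>
+    </dependency>
+</dependencies>
+```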
+
+## Packaging the application
+
+Depending on your use case, you may need to package your Flink application in different ways before it
+gets deployed to a Flink environment. 
+
+If you want to create a JAR for a Flink Job and use only Flink dependencies without any third-party 
+dependencies (e.g. using the filesystem connector with the JSON format), you do not need to create an 
+uber/fat JAR or shade any dependencies.
+
+If you want to create a JAR for a Flink Job and use external dependencies not built into the Flink 
+distribution, you can either add them to the classpath of the distribution or shade them into your 
+uber/fat application JAR.
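+
+Shading is typically done with the [Maven Shade Plugin](https://maven.apache.org/plugins/maven-shade-plugin/);
+a minimal sketch of binding it to the `package` phase is shown below (the plugin version is only illustrative,
+and a complete template follows in the next section):
+
+```xml
+<build>
+    <plugins>
+        <plugin>
+            <groupId>org.apache.maven.plugins</groupId>
+            <artifactId>maven-shade-plugin</artifactId>
+            <version>3.2.4</version>
+            <executions>
+                <execution>
+                    <phase>package</phase>
+                    <goals>
+                        <goal>shade</goal>
+                    </goals>
+                </execution>
+            </executions>
+        </plugin>
+    </plugins>
+</build>
+```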
+
+With the generated uber/fat JAR, you can submit it to a local or remote cluster with:
+
+```sh
+bin/flink run -c org.example.MyJob myFatJar.jar
+```
+
+## Template for building a JAR with dependencies

Review comment:
       Rename this paragraph to "Creating a fat JAR" or something like that?

##########
File path: docs/content/docs/dev/configuration/maven.md
##########
@@ -0,0 +1,164 @@
+---
+title: "Using Maven"
+weight: 2
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# How to use Maven to configure your project
+
+You will likely need a build tool to configure your Flink project. This guide will show you how to 
+do so with [Maven](https://maven.apache.org), an open-source build automation tool developed by the 
+Apache Software Foundation that enables you to build, publish, and deploy projects. You can use it to manage the 
+entire lifecycle of your software project.
+
+## Requirements
+
+- Maven 3.0.4 (or higher)
+- Java 8.x
+
+## Importing the project into your IDE
+
+Once the project folder and files have been created, we recommend that you import this project into
+your IDE for developing and testing.
+
+IntelliJ IDEA supports Maven projects out-of-the-box. Eclipse offers the [m2e plugin](http://www.eclipse.org/m2e/) 
+to [import Maven projects](http://books.sonatype.com/m2eclipse-book/reference/creating-sect-importing-projects.html#fig-creating-import).
+
+*Note*: The default JVM heap size for Java may be too small for Flink and you may have to increase it manually.
+In Eclipse, choose `Run Configurations -> Arguments` and enter `-Xmx800m` into the `VM Arguments` box.
+In IntelliJ IDEA, the recommended way to change JVM options is from the `Help | Edit Custom VM Options` menu.
+See [this article](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties) for details.
+
+## Building the project
+
+If you want to build/package your project, navigate to your project directory and run the
+`mvn clean package` command. You will find a JAR file that contains your application (plus connectors
+and libraries that you may have added as dependencies to the application) here: `target/<artifact-id>-<version>.jar`.
+
+__Note:__ If you used a different class than `DataStreamJob` as the application's main class / entry point,
+we recommend you change the `mainClass` setting in the `pom.xml` file accordingly so that Flink
+can run the application from the JAR file without additionally specifying the main class.
+
+## Adding dependencies to the project
+
+Open the `pom.xml` file in your project directory and add the dependency inside
+the `<dependencies>` section.
+
+For example, you can add the Kafka connector as a dependency like this:
+
+```xml
+<dependencies>
+    
+    <dependency>
+        <groupId>org.apache.flink</groupId>
+        <artifactId>flink-connector-kafka</artifactId>
+        <version>{{< version >}}</version>
+    </dependency>
+    
+</dependencies>
+```
+
+Then execute `mvn install` on the command line. 
+
+Projects created from the `Java Project Template`, the `Scala Project Template`, or Gradle are configured
+to automatically include the application dependencies into the application JAR when you run `mvn clean package`.
+For projects that are not set up from those templates, we recommend adding the Maven Shade Plugin to
+build the application jar with all required dependencies.
+
+**Important:** Note that all these (core) dependencies should have their scope set to [*provided*](https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html#dependency-scope). This means that
+they are needed to compile against, but that they should not be packaged into the project's resulting
+application JAR file. If not set to *provided*, the best case scenario is that the resulting JAR
+becomes excessively large, because it also contains all Flink core dependencies. The worst case scenario
+is that the Flink core dependencies that are added to the application's JAR file clash with some of
+your own dependency versions (which is normally avoided through inverted classloading).
+
+To correctly package the dependencies into the application JAR, these application dependencies must 
+be set to the *compile* scope.
+
+## Packaging the application
+
+Depending on your use case, you may need to package your Flink application in different ways before it
+gets deployed to a Flink environment. 
+
+If you want to create a JAR for a Flink Job and use only Flink dependencies without any third-party 
+dependencies (e.g. using the filesystem connector with the JSON format), you do not need to create an 
+uber/fat JAR or shade any dependencies.
+
+If you want to create a JAR for a Flink Job and use external dependencies not built into the Flink 
+distribution, you can either add them to the classpath of the distribution or shade them into your 
+uber/fat application JAR.
+
+With the generated uber/fat JAR, you can submit it to a local or remote cluster with:
+
+```sh
+bin/flink run -c org.example.MyJob myFatJar.jar
+```
+

Review comment:
       I like the flow of this paragraph. I think at the end you should add a link that refers to the deployment guide, which goes through all the details: https://nightlies.apache.org/flink/flink-docs-master/docs/deployment/cli/. Actually, there is also the web UI which you can use to deploy jobs, but I can't find any docs page about it :shrug:  

##########
File path: docs/content/docs/dev/configuration/maven.md
##########
@@ -0,0 +1,164 @@
+---
+title: "Using Maven"
+weight: 2
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# How to use Maven to configure your project
+
+You will likely need a build tool to configure your Flink project. This guide will show you how to 
+do so with [Maven](https://maven.apache.org), an open-source build automation tool developed by the 
+Apache Software Foundation that enables you to build, publish, and deploy projects. You can use it to manage the 
+entire lifecycle of your software project.
+
+## Requirements
+
+- Maven 3.0.4 (or higher)
+- Java 8.x
+
+## Importing the project into your IDE
+
+Once the project folder and files have been created, we recommend that you import this project into
+your IDE for developing and testing.
+
+IntelliJ IDEA supports Maven projects out-of-the-box. Eclipse offers the [m2e plugin](http://www.eclipse.org/m2e/) 
+to [import Maven projects](http://books.sonatype.com/m2eclipse-book/reference/creating-sect-importing-projects.html#fig-creating-import).
+
+*Note*: The default JVM heap size for Java may be too small for Flink and you may have to increase it manually.
+In Eclipse, choose `Run Configurations -> Arguments` and enter `-Xmx800m` into the `VM Arguments` box.
+In IntelliJ IDEA, the recommended way to change JVM options is from the `Help | Edit Custom VM Options` menu.
+See [this article](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties) for details.
+
+## Building the project
+
+If you want to build/package your project, navigate to your project directory and run the
+`mvn clean package` command. You will find a JAR file that contains your application (plus connectors
+and libraries that you may have added as dependencies to the application) here: `target/<artifact-id>-<version>.jar`.
+
+__Note:__ If you used a different class than `DataStreamJob` as the application's main class / entry point,
+we recommend you change the `mainClass` setting in the `pom.xml` file accordingly so that Flink
+can run the application from the JAR file without additionally specifying the main class.
+
+## Adding dependencies to the project
+
+Open the `pom.xml` file in your project directory and add the dependency inside
+the `<dependencies>` section.
+
+For example, you can add the Kafka connector as a dependency like this:
+
+```xml
+<dependencies>
+    
+    <dependency>
+        <groupId>org.apache.flink</groupId>
+        <artifactId>flink-connector-kafka</artifactId>
+        <version>{{< version >}}</version>
+    </dependency>
+    
+</dependencies>
+```
+
+Then execute `mvn install` on the command line. 
+
+Projects created from the `Java Project Template`, the `Scala Project Template`, or Gradle are configured
+to automatically include the application dependencies into the application JAR when you run `mvn clean package`.
+For projects that are not set up from those templates, we recommend adding the Maven Shade Plugin to
+build the application jar with all required dependencies.
+
+**Important:** Note that all these (core) dependencies should have their scope set to [*provided*](https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html#dependency-scope). This means that
+they are needed to compile against, but that they should not be packaged into the project's resulting
+application JAR file. If not set to *provided*, the best case scenario is that the resulting JAR
+becomes excessively large, because it also contains all Flink core dependencies. The worst case scenario
+is that the Flink core dependencies that are added to the application's JAR file clash with some of
+your own dependency versions (which is normally avoided through inverted classloading).
+
+To correctly package the dependencies into the application JAR, these application dependencies must 
+be set to the *compile* scope.
+

Review comment:
       _these_ here refers to flink "api" deps right?

##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -0,0 +1,253 @@
+---
+title: "Overview"
+weight: 1
+type: docs
+aliases:
+- /dev/project-configuration.html
+- /start/dependencies.html
+- /getting-started/project-setup/dependencies.html
+- /quickstart/java_api_quickstart.html
+- /dev/projectsetup/java_api_quickstart.html
+- /dev/linking_with_flink.html
+- /dev/linking.html
+- /dev/projectsetup/dependencies.html
+- /dev/projectsetup/java_api_quickstart.html
+- /getting-started/project-setup/java_api_quickstart.html
+- /dev/getting-started/project-setup/scala_api_quickstart.html
+- /getting-started/project-setup/scala_api_quickstart.html
+- /quickstart/scala_api_quickstart.html
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Project Configuration
+
+Every Flink application depends on a set of Flink libraries. At a minimum, the application depends
+on the Flink APIs and, in addition, on certain connector libraries (i.e. Kafka, Cassandra).
+When running Flink applications (either in a distributed deployment or locally for testing),
+the [Flink runtime library](https://ossindex.sonatype.org/component/pkg:maven/org.apache.flink/flink-runtime@1.14.3) 
+must be available.
+
+The guides in this section will show you how to configure your projects via popular build tools
+([Maven]({{< ref "docs/dev/configuration/maven" >}}), [Gradle]({{< ref "docs/dev/configuration/gradle" >}}),
+[sbt]({{< ref "docs/dev/configuration/sbt" >}})), add the necessary dependencies 
+(i.e. [connectors and formats]({{< ref "docs/dev/configuration/connector" >}}), 
+[testing]({{< ref "docs/dev/configuration/testing" >}})), and cover some 
+[advanced]({{< ref "docs/dev/configuration/advanced" >}}) configuration topics. 
+
+## Getting started
+
+To get started working on your Flink application, use the following commands, scripts, and templates 
+to create a Flink project.  
+
+{{< tabs "creating project" >}}
+{{< tab "Maven" >}}
+
+You can create a project based on an [Archetype](https://maven.apache.org/guides/introduction/introduction-to-archetypes.html)
+with the Maven command below or use the provided quickstart bash script.
+
+### Maven command
+```bash
+$ mvn archetype:generate                \
+  -DarchetypeGroupId=org.apache.flink   \
+  -DarchetypeArtifactId=flink-quickstart-java \
+  -DarchetypeVersion={{< version >}}
+```
+This allows you to name your newly created project and will interactively ask you for the groupId,
+artifactId, and package name.
+
+### Quickstart script
+```bash
+$ curl https://flink.apache.org/q/quickstart.sh | bash -s {{< version >}}
+```
+
+{{< hint info >}}
+For Maven 3.0 or higher, it is no longer possible to specify the repository (-DarchetypeCatalog) via
+the command line. For details about this change, please refer to the <a href="http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html">official Maven documentation</a>. If you wish to use a snapshot repository, you need to add a
+repository entry to your `settings.xml` file. For example:
+
+```xml
+<settings>
+  <activeProfiles>
+    <activeProfile>apache</activeProfile>
+  </activeProfiles>
+  <profiles>
+    <profile>
+      <id>apache</id>
+      <repositories>
+        <repository>
+          <id>apache-snapshots</id>
+          <url>https://repository.apache.org/content/repositories/snapshots/</url>
+        </repository>
+      </repositories>
+    </profile>
+  </profiles>
+</settings>
+```
+{{< /hint >}}
+
+{{< /tab >}}
+{{< tab "Gradle" >}}
+You can create a project with a Gradle build script or use the provided quickstart bash script.
+
+### Gradle build script
+
+To execute these build configuration scripts, run the `gradle` command in the directory with these scripts.
+
+**build.gradle**
+
+```gradle
+buildscript {
+    repositories {
+        jcenter() // this applies only to the Gradle 'Shadow' plugin
+    }
+    dependencies {
+        classpath 'com.github.jengelman.gradle.plugins:shadow:2.0.4'
+    }
+}
+plugins {
+    id 'java'
+    id 'application'
+    // shadow plugin to produce fat JARs
+    id 'com.github.johnrengelman.shadow' version '2.0.4'
+}
+// artifact properties
+group = 'org.myorg.quickstart'
+version = '0.1-SNAPSHOT'
+mainClassName = 'org.myorg.quickstart.StreamingJob'
+description = """Flink Quickstart Job"""
+ext {
+    javaVersion = '1.8'
+    flinkVersion = '{{< version >}}'
+    scalaBinaryVersion = '{{< scala_version >}}'
+    slf4jVersion = '1.7.32'
+    log4jVersion = '2.17.1'
+}
+sourceCompatibility = javaVersion
+targetCompatibility = javaVersion
+tasks.withType(JavaCompile) {
+	options.encoding = 'UTF-8'
+}
+applicationDefaultJvmArgs = ["-Dlog4j.configurationFile=log4j2.properties"]
+task wrapper(type: Wrapper) {
+    gradleVersion = '3.1'
+}
+// declare where to find the dependencies of your project
+repositories {
+    mavenCentral()
+    maven { url "https://repository.apache.org/content/repositories/snapshots/" }
+}
+// NOTE: We cannot use "compileOnly" or "shadow" configurations since then we could not run code
+// in the IDE or with "gradle run". We also cannot exclude transitive dependencies from the
+// shadowJar yet (see https://github.com/johnrengelman/shadow/issues/159).
+// -> Explicitly define the libraries we want to be included in the "flinkShadowJar" configuration!
+configurations {
+    flinkShadowJar // dependencies which go into the shadowJar
+    // always exclude these (also from transitive dependencies) since they are provided by Flink
+    flinkShadowJar.exclude group: 'org.apache.flink', module: 'force-shading'
+    flinkShadowJar.exclude group: 'com.google.code.findbugs', module: 'jsr305'
+    flinkShadowJar.exclude group: 'org.slf4j'
+    flinkShadowJar.exclude group: 'org.apache.logging.log4j'
+}
+// declare the dependencies for your production and test code
+dependencies {
+    // --------------------------------------------------------------
+    // Compile-time dependencies that should NOT be part of the
+    // shadow jar and are provided in the lib folder of Flink
+    // --------------------------------------------------------------
+    compile "org.apache.flink:flink-streaming-java:${flinkVersion}"
+    compile "org.apache.flink:flink-clients:${flinkVersion}"
+    // --------------------------------------------------------------
+    // Dependencies that should be part of the shadow jar, e.g.
+    // connectors. These must be in the flinkShadowJar configuration!
+    // --------------------------------------------------------------
+    //flinkShadowJar "org.apache.flink:flink-connector-kafka:${flinkVersion}"
+    compile "org.apache.logging.log4j:log4j-api:${log4jVersion}"
+    compile "org.apache.logging.log4j:log4j-core:${log4jVersion}"
+    compile "org.apache.logging.log4j:log4j-slf4j-impl:${log4jVersion}"
+    compile "org.slf4j:slf4j-log4j12:${slf4jVersion}"
+    // Add test dependencies here.
+    // testCompile "junit:junit:4.12"
+}
+// make compileOnly dependencies available for tests:
+sourceSets {
+    main.compileClasspath += configurations.flinkShadowJar
+    main.runtimeClasspath += configurations.flinkShadowJar
+    test.compileClasspath += configurations.flinkShadowJar
+    test.runtimeClasspath += configurations.flinkShadowJar
+    javadoc.classpath += configurations.flinkShadowJar
+}
+run.classpath = sourceSets.main.runtimeClasspath
+jar {
+    manifest {
+        attributes 'Built-By': System.getProperty('user.name'),
+                'Build-Jdk': System.getProperty('java.version')
+    }
+}
+shadowJar {
+    configurations = [project.configurations.flinkShadowJar]
+}
+```
+
+**settings.gradle**
+```gradle
+rootProject.name = 'quickstart'
+```
+### Quickstart script
+```bash
+bash -c "$(curl https://flink.apache.org/q/gradle-quickstart.sh)" -- {{< version >}} {{< scala_version >}}
+```
+{{< /tab >}}
+{{< tab "sbt" >}}
+You can scaffold a new Flink project with the following [giter8 template](https://github.com/tillrohrmann/flink-project.g8)
+and the `sbt new` command (which creates new build definitions from a template) or use the provided quickstart bash script.
+
+### sbt template
+```bash
+$ sbt new tillrohrmann/flink-project.g8
+```
+
+### Quickstart script
+```bash
+$ bash <(curl https://flink.apache.org/q/sbt-quickstart.sh)
+```
+{{< /tab >}}
+{{< /tabs >}}
+
+## Which dependencies do you need?
+
+Depending on what you want to achieve, you are going to choose a combination of our available APIs, 
+which will require different dependencies. 
+
+Here is a table of artifact/dependency names:
+
+| APIs you want to use              | Dependency you need to add    |
+|-----------------------------------|-------------------------------|
+| DataStream                        | flink-streaming-java          |  
+| DataStream with Scala             | flink-streaming-scala{{< scala_version >}}         |   
+| Table API                         | flink-table-api-java          |   
+| Table API with Scala              | flink-table-api-scala{{< scala_version >}}         |
+| Table API + DataStream            | flink-table-api-java-bridge   |
+| Table API + DataStream with Scala | flink-table-api-scala-bridge{{< scala_version >}}  |
+
+
+## Next steps
+
+Check out the sections on how to add these dependencies with [Maven]({{< ref "docs/dev/configuration/maven" >}}), 
+[Gradle]({{< ref "docs/dev/configuration/gradle" >}}), or [sbt]({{< ref "docs/dev/configuration/sbt" >}}).

Review comment:
       I don't think at this point we need to "watch" users in every single next step, as chances are that most of the users reading this guide already know how to add dependencies. Perhaps just replace this with a dotted list of the next pages?  Or just repeat what you said in the second sentence of the first paragraph of this page <<The guides in this section[...]>>?

##########
File path: docs/content/docs/dev/configuration/connector.md
##########
@@ -0,0 +1,63 @@
+---
+title: "Dependencies: Connectors and Formats"
+weight: 5
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Dependencies: Connectors and Formats
+
+Flink can read from and write to various external systems via [connectors]({{< ref "docs/connectors/table/overview" >}})
+and define the [format]({{< ref "docs/connectors/table/formats/overview" >}}) in which to store the 
+data.  
+
+The way that information is serialized is represented in the external system and that system needs
+to know how to read this data in a format that can be read by Flink.  This is done through format 
+dependencies.

Review comment:
       I would remove this sentence, not sure it adds some useful info here

##########
File path: docs/content/docs/dev/configuration/connector.md
##########
@@ -0,0 +1,63 @@
+---
+title: "Dependencies: Connectors and Formats"
+weight: 5
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Dependencies: Connectors and Formats
+
+Flink can read from and write to various external systems via [connectors]({{< ref "docs/connectors/table/overview" >}})
+and define the [format]({{< ref "docs/connectors/table/formats/overview" >}}) in which to store the 
+data.  
+
+The way that information is serialized is represented in the external system and that system needs
+to know how to read this data in a format that can be read by Flink.  This is done through format 
+dependencies.
+
+Most applications need specific connectors to run. Flink provides a set of formats that can be used 
+with connectors (with the dependencies for both being fairly unified). These are not part of Flink's 
+core dependencies and must be added as dependencies to the application.
+
+## Adding Dependencies 
+
+For more information on how to add dependencies, refer to the build tools sections on [Maven]({{< ref "docs/dev/configuration/maven" >}}),
+[Gradle]({{< ref "docs/dev/configuration/gradle" >}}), or [sbt]({{< ref "docs/dev/configuration/sbt" >}})..

Review comment:
       typo two dots at the end of the sentence

##########
File path: docs/content/docs/dev/configuration/gradle.md
##########
@@ -0,0 +1,110 @@
+---
+title: "Using Gradle"
+weight: 3
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# How to use Gradle to configure your project

Review comment:
       @matriv can you check out this page?

##########
File path: docs/content/docs/dev/configuration/connector.md
##########
@@ -0,0 +1,63 @@
+---
+title: "Dependencies: Connectors and Formats"
+weight: 5
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Dependencies: Connectors and Formats
+
+Flink can read from and write to various external systems via [connectors]({{< ref "docs/connectors/table/overview" >}})
+and define the [format]({{< ref "docs/connectors/table/formats/overview" >}}) in which to store the 
+data.  
+
+The way that information is serialized is represented in the external system and that system needs
+to know how to read this data in a format that can be read by Flink.  This is done through format 
+dependencies.
+
+Most applications need specific connectors to run. Flink provides a set of formats that can be used 
+with connectors (with the dependencies for both being fairly unified). These are not part of Flink's 
+core dependencies and must be added as dependencies to the application.
+
+## Adding Dependencies 
+
+For more information on how to add dependencies, refer to the build tools sections on [Maven]({{< ref "docs/dev/configuration/maven" >}}),
+[Gradle]({{< ref "docs/dev/configuration/gradle" >}}), or [sbt]({{< ref "docs/dev/configuration/sbt" >}})..
+
+## Packaging Dependencies
+
+We recommend packaging the application code and all its required dependencies into one *JAR-with-dependencies*
+which we refer to as the *application JAR*. The application JAR can be submitted to an already running
+Flink cluster, or added to a Flink application container image.

Review comment:
       Uh why is it called application jar? just name it fat job jar, i think it's more explicative. Also don't use the term JAR with dependencies, but just fat jar.

##########
File path: docs/content/docs/dev/configuration/connector.md
##########
@@ -0,0 +1,63 @@
+---
+title: "Dependencies: Connectors and Formats"
+weight: 5
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Dependencies: Connectors and Formats
+
+Flink can read from and write to various external systems via [connectors]({{< ref "docs/connectors/table/overview" >}})
+and define the [format]({{< ref "docs/connectors/table/formats/overview" >}}) in which to store the 

Review comment:
       More than adding the links only to the table specific docs, can you add another sentence in order to link to both datastream and table?

##########
File path: docs/content/docs/dev/configuration/maven.md
##########
@@ -0,0 +1,164 @@
+---
+title: "Using Maven"
+weight: 2
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# How to use Maven to configure your project
+
+You will likely need a build tool to configure your Flink project. This guide will show you how to 
+do so with [Maven](https://maven.apache.org), an open-source build automation tool developed by the 
+Apache Software Foundation that enables you to build, publish, and deploy projects. You can use it to manage the 
+entire lifecycle of your software project.
+
+## Requirements
+
+- Maven 3.0.4 (or higher)
+- Java 8.x
+
+## Importing the project into your IDE
+
+Once the project folder and files have been created, we recommend that you import this project into
+your IDE for developing and testing.
+
+IntelliJ IDEA supports Maven projects out-of-the-box. Eclipse offers the [m2e plugin](http://www.eclipse.org/m2e/) 
+to [import Maven projects](http://books.sonatype.com/m2eclipse-book/reference/creating-sect-importing-projects.html#fig-creating-import).
+
+*Note*: The default JVM heap size for Java may be too small for Flink and you have to manually increase it.
+In Eclipse, choose `Run Configurations -> Arguments` and write into the `VM Arguments` box: `-Xmx800m`.
+In IntelliJ IDEA, the recommended way to change JVM options is from the `Help | Edit Custom VM Options` menu.
+See [this article](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties) for details.
+
+## Building the project
+
+If you want to build/package your project, navigate to your project directory and run the
+`mvn clean package` command. You will find a JAR file that contains your application (plus connectors
+and libraries that you may have added as dependencies to the application) here: `target/<artifact-id>-<version>.jar`.
+
+__Note:__ If you used a different class than `DataStreamJob` as the application's main class / entry point,
+we recommend you change the `mainClass` setting in the `pom.xml` file accordingly so that Flink
+can run the application from the JAR file without additionally specifying the main class.
+
+## Adding dependencies to the project
+
+Open the `pom.xml` file in your project directory and add the dependency inside
+the `dependencies` block.  
+
+For example, you can add the Kafka connector as a dependency like this:
+
+```xml
+<dependencies>
+    
+    <dependency>
+        <groupId>org.apache.flink</groupId>
+        <artifactId>flink-connector-kafka</artifactId>
+        <version>{{< version >}}</version>
+    </dependency>
+    
+</dependencies>
+```
+
+Then execute `mvn install` on the command line. 
+
+Projects created from the `Java Project Template`, the `Scala Project Template`, or Gradle are configured
+to automatically include the application dependencies into the application JAR when you run `mvn clean package`.
+For projects that are not set up from those templates, we recommend adding the Maven Shade Plugin to
+build the application jar with all required dependencies.
+
+**Important:** Note that all these (core) dependencies should have their scope set to [*provided*](https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html#dependency-scope). This means that
+they are needed to compile against, but that they should not be packaged into the project's resulting
+application JAR file. If not set to *provided*, the best case scenario is that the resulting JAR
+becomes excessively large, because it also contains all Flink core dependencies. The worst case scenario
+is that the Flink core dependencies that are added to the application's JAR file clash with some of
+your own dependency versions (which is normally avoided through inverted classloading).
+
+To correctly package the dependencies into the application JAR, these application dependencies must 
+be set to the *compile* scope.
+
+## Packaging the application
+
+Depending on your use case, you may need to package your Flink application in different ways before it
+gets deployed to a Flink environment. 
+
+If you want to create a JAR for a Flink Job and use only Flink dependencies without any third-party 
+dependencies (i.e. using the filesystem connector with JSON format), you do not need to create an 
+uber/fat JAR or shade any dependencies.
+
+If you want to create a JAR for a Flink Job and use external dependencies not built into the Flink 
+distribution, you can either add them to the classpath of the distribution or shade them into your 
+uber/fat application JAR.
+
+With the generated uber/fat JAR, you can submit it to a local or remote cluster with:
+
+```sh
+bin/flink run -c org.example.MyJob myFatJar.jar
+```
+
+## Template for building a JAR with dependencies
+
+To build an application JAR that contains all dependencies required for declared connectors and libraries,
+you can use the following shade plugin definition:
+
+```xml
+<build>
+    <plugins>
+        <plugin>
+            <groupId>org.apache.maven.plugins</groupId>
+            <artifactId>maven-shade-plugin</artifactId>
+            <version>3.1.1</version>
+            <executions>
+                <execution>
+                    <phase>package</phase>
+                    <goals>
+                        <goal>shade</goal>
+                    </goals>
+                    <configuration>
+                        <artifactSet>
+                            <excludes>
+                                <exclude>com.google.code.findbugs:jsr305</exclude>
+                                <exclude>org.slf4j:*</exclude>
+                                <exclude>log4j:*</exclude>
+                            </excludes>
+                        </artifactSet>
+                        <filters>
+                            <filter>
+                                <!-- Do not copy the signatures in the META-INF folder.
+                                Otherwise, this might cause SecurityExceptions when using the JAR. -->
+                                <artifact>*:*</artifact>
+                                <excludes>
+                                    <exclude>META-INF/*.SF</exclude>
+                                    <exclude>META-INF/*.DSA</exclude>
+                                    <exclude>META-INF/*.RSA</exclude>
+                                </excludes>
+                            </filter>
+                        </filters>
+                        <transformers>
+                            <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
+                                <mainClass>my.programs.main.clazz</mainClass>
+                            </transformer>
+                        </transformers>
+                    </configuration>
+                </execution>
+            </executions>
+        </plugin>
+    </plugins>
+</build>
+```

Review comment:
       Just add at the end that Maven shade plugin will include by default all the dependencies in scope "runtime" and "compile". Perhaps add a link to maven shade plugin https://maven.apache.org/plugins/maven-shade-plugin/index.html
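    
    As a sketch of that default behavior (the artifact is chosen only as an example): a connector dependency left at the default *compile* scope, like the one below, is picked up by the shade plugin and packaged into the fat JAR, whereas *provided* dependencies are not:
    
    ```xml
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-kafka</artifactId>
        <version>{{< version >}}</version>
        <!-- no <scope> element: defaults to compile, so the shade plugin includes it in the shaded JAR -->
    </dependency>
    ```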

##########
File path: docs/content/docs/dev/configuration/connector.md
##########
@@ -0,0 +1,63 @@
+---
+title: "Dependencies: Connectors and Formats"
+weight: 5
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Dependencies: Connectors and Formats
+
+Flink can read from and write to various external systems via [connectors]({{< ref "docs/connectors/table/overview" >}})
+and define the [format]({{< ref "docs/connectors/table/formats/overview" >}}) in which to store the 
+data.  
+
+The way that information is serialized is represented in the external system and that system needs
+to know how to read this data in a format that can be read by Flink.  This is done through format 
+dependencies.
+
+Most applications need specific connectors to run. Flink provides a set of formats that can be used 
+with connectors (with the dependencies for both being fairly unified). These are not part of Flink's 
+core dependencies and must be added as dependencies to the application.
+
+## Adding Dependencies 
+
+For more information on how to add dependencies, refer to the build tools sections on [Maven]({{< ref "docs/dev/configuration/maven" >}}),
+[Gradle]({{< ref "docs/dev/configuration/gradle" >}}), or [sbt]({{< ref "docs/dev/configuration/sbt" >}})..
+
+## Packaging Dependencies
+
+We recommend packaging the application code and all its required dependencies into one *JAR-with-dependencies*
+which we refer to as the *application JAR*. The application JAR can be submitted to an already running
+Flink cluster, or added to a Flink application container image.
+
+On [Maven Central](https://search.maven.org), we publish connectors named "flink-connector-<NAME>" and
+"flink-sql-connector-<NAME>". The former are thin JARs while the latter are uber JARs.
+
+In order to use the uber JARs, you can shade them (including and renaming dependencies to create a 
+private copy) in the uber JAR of your application, or you can add them to the `/lib` folder of the 

Review comment:
       use "job" rather than application when you refer to user code

##########
File path: docs/content/docs/dev/configuration/advanced.md
##########
@@ -0,0 +1,158 @@
+---
+title: "Advanced Configuration"
+weight: 10
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Advanced Configuration Topics
+
+## Dependencies: Flink Core and User Application
+
+There are two broad categories of dependencies and libraries in Flink, which are explained below.
+
+### Flink Core Dependencies
+
+Flink itself consists of a set of classes and dependencies that form the core of Flink's runtime
+and must be present when a Flink application is started. The classes and dependencies needed to run
+the system handle areas such as coordination, networking, checkpointing, failover, APIs,
+operators (such as windowing), resource management, etc.
+
+These core classes and dependencies are packaged in the `flink-dist` jar, are part of Flink's `lib`
+folder, and part of the basic Flink container images. You can think of these dependencies as similar
+to Java's core library (i.e. `rt.jar`, `charsets.jar`), which contains classes like `String` and `List`.
+
+In order to keep the core dependencies as small as possible and avoid dependency clashes, the
+Flink Core Dependencies do not contain any connectors or libraries (i.e. CEP, SQL, ML). This avoids
+having an excessive number of classes and dependencies in the classpath by default.
+
+### User Application Dependencies
+
+These dependencies include all connectors, formats, or libraries that a specific user application
+needs and explicitly do not include the Flink DataStream and Table APIs and runtime dependencies 
+since those are already part of the Flink core dependencies.
+
+The user application is typically packaged into an *application jar*, which contains the application
+code and the required connector and library dependencies.
+
+## IDE configuration
+
+The default JVM heap size for Java may be too small for Flink and you have to manually increase it.
+In Eclipse, choose `Run Configurations -> Arguments` and write `-Xmx800m` into the `VM Arguments` box.
+In IntelliJ IDEA, the recommended way to change JVM options is from the `Help | Edit Custom VM Options` menu.
+See [this article](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties) for details.
+
+**Note on IntelliJ:** To make the applications run within IntelliJ IDEA, it is necessary to tick the
+`Include dependencies with "Provided" scope` box in the run configuration. If this option is not available
+(possibly due to using an older IntelliJ IDEA version), then a workaround is to create a test that
+calls the application's `main()` method.
+
+## Scala Versions
+
+Different Scala versions are not binary compatible with one another. All Flink dependencies that 
+(transitively) depend on Scala are suffixed with the Scala version that they are built for 
+(i.e. `flink-streaming-scala_2.12`).
+
+If you are only using Flink's Java APIs, you can use any Scala version. If you are using Flink's Scala APIs, 
+you need to pick the Scala version that matches the application's Scala version.
+
+Please refer to the [build guide]({{< ref "docs/flinkDev/building" >}}#scala-versions) for details 
+on how to build Flink for a specific Scala version.
+
+Scala versions after 2.12.8 are not binary compatible with previous 2.12.x versions. This prevents
+the Flink project from upgrading its 2.12.x builds beyond 2.12.8. You can build Flink locally for
+later Scala versions by following the [build guide]({{< ref "docs/flinkDev/building" >}}#scala-versions).
+For this to work, you will need to add `-Djapicmp.skip` to skip binary compatibility checks when building.
+
+See the [Scala 2.12.8 release notes](https://github.com/scala/scala/releases/tag/v2.12.8) for more details.
+The relevant section states:
+
+> The second fix is not binary compatible: the 2.12.8 compiler omits certain
+> methods that are generated by earlier 2.12 compilers. However, we believe
+> that these methods are never used and existing compiled code will continue to
+> work.  See the [pull request
+> description](https://github.com/scala/scala/pull/7469) for more details.
+
+
+The Flink distribution contains by default the required JARs to execute Flink SQL Jobs (found in the `/lib` folder), 
+in particular:
+
+- `flink-table-api-java-uber-{{< version >}}.jar` --> contains all the Java APIs
+- `flink-table-runtime-{{< version >}}.jar` --> contains the runtime
+- `flink-table-planner-loader-{{< version >}}.jar` --> contains the query planner
+
+**Note:** Previously, these JARs were all packaged into `flink-table.jar`. This has now been split into 
+three JARs in order to allow users to swap the `flink-table-planner-loader-{{< version >}}.jar` with
+`flink-table-planner{{< scala_version >}}-{{< version >}}.jar`.
+
+When using formats and connectors with the Flink Scala API, you need to either download and manually 
+include these JARs in the `/lib` folder (recommended), or you need to shade them in the uber JAR of your 
+Flink SQL Jobs.
+
+For more details, check out how to [connect to external systems]({{< ref "docs/connectors/table/overview" >}}).

Review comment:
       This is very table related. Perhaps name this paragraph "Anatomy of table dependencies"? and then you make the next paragraph _Table Planner and Table Planner Loader_ a subparagraph of this one?

##########
File path: docs/content/docs/dev/configuration/advanced.md
##########
@@ -0,0 +1,158 @@
+---
+title: "Advanced Configuration"
+weight: 10
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Advanced Configuration Topics
+
+## Dependencies: Flink Core and User Application
+
+There are two broad categories of dependencies and libraries in Flink, which are explained below.
+
+### Flink Core Dependencies
+
+Flink itself consists of a set of classes and dependencies that form the core of Flink's runtime
+and must be present when a Flink application is started. The classes and dependencies needed to run
+the system handle areas such as coordination, networking, checkpointing, failover, APIs,
+operators (such as windowing), resource management, etc.
+
+These core classes and dependencies are packaged in the `flink-dist` jar, are part of Flink's `lib`
+folder, and part of the basic Flink container images. You can think of these dependencies as similar
+to Java's core library (i.e. `rt.jar`, `charsets.jar`), which contains classes like `String` and `List`.
+
+In order to keep the core dependencies as small as possible and avoid dependency clashes, the
+Flink Core Dependencies do not contain any connectors or libraries (i.e. CEP, SQL, ML). This avoids
+having an excessive number of classes and dependencies in the classpath by default.
+
+### User Application Dependencies
+
+These dependencies include all connectors, formats, or libraries that a specific user application
+needs and explicitly do not include the Flink DataStream and Table APIs and runtime dependencies 
+since those are already part of the Flink core dependencies.
+
+The user application is typically packaged into an *application jar*, which contains the application
+code and the required connector and library dependencies.
+
+## IDE configuration
+
+The default JVM heap size for Java may be too small for Flink and you have to manually increase it.
+In Eclipse, choose `Run Configurations -> Arguments` and write `-Xmx800m` into the `VM Arguments` box.
+In IntelliJ IDEA, the recommended way to change JVM options is from the `Help | Edit Custom VM Options` menu.
+See [this article](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties) for details.
+
+**Note on IntelliJ:** To make the applications run within IntelliJ IDEA, it is necessary to tick the
+`Include dependencies with "Provided" scope` box in the run configuration. If this option is not available
+(possibly due to using an older IntelliJ IDEA version), then a workaround is to create a test that
+calls the application's `main()` method.
+
+## Scala Versions
+
+Different Scala versions are not binary compatible with one another. All Flink dependencies that 
+(transitively) depend on Scala are suffixed with the Scala version that they are built for 
+(i.e. `flink-streaming-scala_2.12`).
+
+If you are only using Flink's Java APIs, you can use any Scala version. If you are using Flink's Scala APIs, 
+you need to pick the Scala version that matches the application's Scala version.
+
+Please refer to the [build guide]({{< ref "docs/flinkDev/building" >}}#scala-versions) for details 
+on how to build Flink for a specific Scala version.
+
+Scala versions after 2.12.8 are not binary compatible with previous 2.12.x versions. This prevents
+the Flink project from upgrading its 2.12.x builds beyond 2.12.8. You can build Flink locally for
+later Scala versions by following the [build guide]({{< ref "docs/flinkDev/building" >}}#scala-versions).
+For this to work, you will need to add `-Djapicmp.skip` to skip binary compatibility checks when building.
+
+See the [Scala 2.12.8 release notes](https://github.com/scala/scala/releases/tag/v2.12.8) for more details.
+The relevant section states:
+
+> The second fix is not binary compatible: the 2.12.8 compiler omits certain
+> methods that are generated by earlier 2.12 compilers. However, we believe
+> that these methods are never used and existing compiled code will continue to
+> work.  See the [pull request
+> description](https://github.com/scala/scala/pull/7469) for more details.
+
+
+The Flink distribution contains by default the required JARs to execute Flink SQL Jobs (found in the `/lib` folder), 
+in particular:
+
+- `flink-table-api-java-uber-{{< version >}}.jar` --> contains all the Java APIs
+- `flink-table-runtime-{{< version >}}.jar` --> contains the runtime
+- `flink-table-planner-loader-{{< version >}}.jar` --> contains the query planner
+
+**Note:** Previously, these JARs were all packaged into `flink-table.jar`. This has now been split into 

Review comment:
       Replace _This has now_ with _Since 1.15_

##########
File path: docs/content/docs/dev/configuration/advanced.md
##########
@@ -0,0 +1,158 @@
+---
+title: "Advanced Configuration"
+weight: 10
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Advanced Configuration Topics
+
+## Dependencies: Flink Core and User Application
+
+There are two broad categories of dependencies and libraries in Flink, which are explained below.
+
+### Flink Core Dependencies
+
+Flink itself consists of a set of classes and dependencies that form the core of Flink's runtime
+and must be present when a Flink application is started. The classes and dependencies needed to run
+the system handle areas such as coordination, networking, checkpointing, failover, APIs,
+operators (such as windowing), resource management, etc.
+
+These core classes and dependencies are packaged in the `flink-dist` jar, are part of Flink's `lib`
+folder, and part of the basic Flink container images. You can think of these dependencies as similar
+to Java's core library (i.e. `rt.jar`, `charsets.jar`), which contains classes like `String` and `List`.
+
+In order to keep the core dependencies as small as possible and avoid dependency clashes, the
+Flink Core Dependencies do not contain any connectors or libraries (i.e. CEP, SQL, ML). This avoids
+having an excessive number of classes and dependencies in the classpath by default.
+
+### User Application Dependencies
+
+These dependencies include all connectors, formats, or libraries that a specific user application
+needs and explicitly do not include the Flink DataStream and Table APIs and runtime dependencies 
+since those are already part of the Flink core dependencies.
+
+The user application is typically packaged into an *application jar*, which contains the application
+code and the required connector and library dependencies.
+
+## IDE configuration
+
+The default JVM heap size for Java may be too small for Flink and you have to manually increase it.
+In Eclipse, choose `Run Configurations -> Arguments` and write `-Xmx800m` into the `VM Arguments` box.
+In IntelliJ IDEA, the recommended way to change JVM options is from the `Help | Edit Custom VM Options` menu.
+See [this article](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties) for details.
+
+**Note on IntelliJ:** To make the applications run within IntelliJ IDEA, it is necessary to tick the
+`Include dependencies with "Provided" scope` box in the run configuration. If this option is not available
+(possibly due to using an older IntelliJ IDEA version), then a workaround is to create a test that
+calls the application's `main()` method.
+
+## Scala Versions
+
+Different Scala versions are not binary compatible with one another. All Flink dependencies that 
+(transitively) depend on Scala are suffixed with the Scala version that they are built for 
+(i.e. `flink-streaming-scala_2.12`).
+
+If you are only using Flink's Java APIs, you can use any Scala version. If you are using Flink's Scala APIs, 
+you need to pick the Scala version that matches the application's Scala version.
+
+Please refer to the [build guide]({{< ref "docs/flinkDev/building" >}}#scala-versions) for details 
+on how to build Flink for a specific Scala version.
+
+Scala versions after 2.12.8 are not binary compatible with previous 2.12.x versions. This prevents
+the Flink project from upgrading its 2.12.x builds beyond 2.12.8. You can build Flink locally for
+later Scala versions by following the [build guide]({{< ref "docs/flinkDev/building" >}}#scala-versions).
+For this to work, you will need to add `-Djapicmp.skip` to skip binary compatibility checks when building.
+
+See the [Scala 2.12.8 release notes](https://github.com/scala/scala/releases/tag/v2.12.8) for more details.
+The relevant section states:
+
+> The second fix is not binary compatible: the 2.12.8 compiler omits certain
+> methods that are generated by earlier 2.12 compilers. However, we believe
+> that these methods are never used and existing compiled code will continue to
+> work.  See the [pull request
+> description](https://github.com/scala/scala/pull/7469) for more details.
+
+
+The Flink distribution contains by default the required JARs to execute Flink SQL Jobs (found in the `/lib` folder), 
+in particular:
+
+- `flink-table-api-java-uber-{{< version >}}.jar` --> contains all the Java APIs
+- `flink-table-runtime-{{< version >}}.jar` --> contains the runtime
+- `flink-table-planner-loader-{{< version >}}.jar` --> contains the query planner
+
+**Note:** Previously, these JARs were all packaged into `flink-table.jar`. This has now been split into 
+three JARs in order to allow users to swap the `flink-table-planner-loader-{{< version >}}.jar` with
+`flink-table-planner{{< scala_version >}}-{{< version >}}.jar`.
+
+When using formats and connectors with the Flink Scala API, you need to either download and manually 
+include these JARs in the `/lib` folder (recommended), or you need to shade them in the uber JAR of your 
+Flink SQL Jobs.

Review comment:
       This sentence is incorrect, I think I badly worded it initially. What I was trying to explain is that while for java apis, everything is already in the distribution so the dependency can be "provided" (as explained in the overview and maven page), for scala apis you don't have the jar built-in in the distribution, so you need to package the jar somehow, effectively like a connector dependency.
   
   Perhaps reword this to something like:
   
   > While Table Java APIs artifacts are built-in in the distribution, Table Scala API artifacts are not included by default. You need to either include them in the distribution `/lib` folder manually (recommended), or package them as dependencies in your job uber jar. 
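    
    For the second option, a rough sketch of what that could look like in a Maven build (using the Scala Table API artifact from the overview table as an example):
    
    ```xml
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-table-api-scala{{< scala_version >}}</artifactId>
        <version>{{< version >}}</version>
        <!-- left at the default compile scope so it ends up in the job's uber JAR -->
    </dependency>
    ```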
   

##########
File path: docs/content/docs/connectors/table/upsert-kafka.md
##########
@@ -49,6 +49,9 @@ Dependencies
 
 {{< sql_download_table "upsert-kafka" >}}
 
+The Upsert Kafka connector is not currently part of the binary distribution.

Review comment:
       remove _currently_, as it implies that at some point in the future it will be part of the distribution. Same for other changes to connectors pages

##########
File path: docs/content/docs/dev/table/overview.md
##########
@@ -40,8 +40,8 @@ and later use the DataStream API to build alerting based on the matched patterns
 
 ## Table Program Dependencies
 
-Depending on the target programming language, you need to add the Java or Scala API to a project
-in order to use the Table API & SQL for defining pipelines.
+Depending on the target programming language, you need to add the Table API to a project in order to
+use the Table API & SQL for defining data pipelines.

Review comment:
       I would rather go and remove these tabs all together. Perhaps just link to the project conf page for Java and Scala and the python overview page for Python? @MartijnVisser any opinions here?

##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -0,0 +1,253 @@
+---
+title: "Overview"
+weight: 1
+type: docs
+aliases:
+- /dev/project-configuration.html
+- /start/dependencies.html
+- /getting-started/project-setup/dependencies.html
+- /quickstart/java_api_quickstart.html
+- /dev/projectsetup/java_api_quickstart.html
+- /dev/linking_with_flink.html
+- /dev/linking.html
+- /dev/projectsetup/dependencies.html
+- /dev/projectsetup/java_api_quickstart.html
+- /getting-started/project-setup/java_api_quickstart.html
+- /dev/getting-started/project-setup/scala_api_quickstart.html
+- /getting-started/project-setup/scala_api_quickstart.html
+- /quickstart/scala_api_quickstart.html
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Project Configuration
+
+Every Flink application depends on a set of Flink libraries. At a minimum, the application depends
+on the Flink APIs and, in addition, on certain connector libraries (i.e. Kafka, Cassandra).
+When running Flink applications (either in a distributed deployment or locally for testing),
+the [Flink runtime library](https://ossindex.sonatype.org/component/pkg:maven/org.apache.flink/flink-runtime@1.14.3) 
+must be available.
+
+The guides in this section will show you how to configure your projects via popular build tools
+([Maven]({{< ref "docs/dev/configuration/maven" >}}), [Gradle]({{< ref "docs/dev/configuration/gradle" >}}),
+[sbt]({{< ref "docs/dev/configuration/sbt" >}})), add the necessary dependencies 
+(i.e. [connectors and formats]({{< ref "docs/dev/configuration/connector" >}}), 
+[testing]({{< ref "docs/dev/configuration/testing" >}})), and cover some 
+[advanced]({{< ref "docs/dev/configuration/advanced" >}}) configuration topics. 
+
+## Getting started
+
+To get started working on your Flink application, use the following commands, scripts, and templates 
+to create a Flink project.  
+
+{{< tabs "creating project" >}}
+{{< tab "Maven" >}}
+
+You can create a project based on an [Archetype](https://maven.apache.org/guides/introduction/introduction-to-archetypes.html)
+with the Maven command below or use the provided quickstart bash script.
+
+### Maven command
+```bash
+$ mvn archetype:generate                \
+  -DarchetypeGroupId=org.apache.flink   \
+  -DarchetypeArtifactId=flink-quickstart-java \
+  -DarchetypeVersion={{< version >}}
+```
+This allows you to name your newly created project and will interactively ask you for the groupId,
+artifactId, and package name.
+
+### Quickstart script
+```bash
+$ curl https://flink.apache.org/q/quickstart.sh | bash -s {{< version >}}
+```
+
+{{< hint info >}}
+For Maven 3.0 or higher, it is no longer possible to specify the repository (-DarchetypeCatalog) via
+the command line. For details about this change, please refer to the <a href="http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html">official Maven documentation</a>. If you wish to use a snapshot repository, you need to add a
+repository entry to your `settings.xml` file. For example:
+
+```xml
+<settings>
+  <activeProfiles>
+    <activeProfile>apache</activeProfile>
+  </activeProfiles>
+  <profiles>
+    <profile>
+      <id>apache</id>
+      <repositories>
+        <repository>
+          <id>apache-snapshots</id>
+          <url>https://repository.apache.org/content/repositories/snapshots/</url>
+        </repository>
+      </repositories>
+    </profile>
+  </profiles>
+</settings>
+```
+{{< /hint >}}
+
+{{< /tab >}}
+{{< tab "Gradle" >}}
+You can create a project with a Gradle build script or use the provided quickstart bash script.
+
+### Gradle build script
+
+To execute these build configuration scripts, run the `gradle` command in the directory with these scripts.
+
+**build.gradle**

Review comment:
       @matriv can you check that this build script makes sense and it's updated?

##########
File path: docs/content/docs/dev/configuration/sbt.md
##########
@@ -0,0 +1,115 @@
+---
+title: "Using sbt"
+weight: 4
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# How to use sbt to configure your project

Review comment:
       @MartijnVisser do you know someone that can validate this page?

##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -0,0 +1,253 @@
+---
+title: "Overview"
+weight: 1
+type: docs
+aliases:
+- /dev/project-configuration.html
+- /start/dependencies.html
+- /getting-started/project-setup/dependencies.html
+- /quickstart/java_api_quickstart.html
+- /dev/projectsetup/java_api_quickstart.html
+- /dev/linking_with_flink.html
+- /dev/linking.html
+- /dev/projectsetup/dependencies.html
+- /dev/projectsetup/java_api_quickstart.html
+- /getting-started/project-setup/java_api_quickstart.html
+- /dev/getting-started/project-setup/scala_api_quickstart.html
+- /getting-started/project-setup/scala_api_quickstart.html
+- /quickstart/scala_api_quickstart.html
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Project Configuration
+
+Every Flink application depends on a set of Flink libraries. At a minimum, the application depends
+on the Flink APIs and, in addition, on certain connector libraries (i.e. Kafka, Cassandra).
+When running Flink applications (either in a distributed deployment or locally for testing),
+the [Flink runtime library](https://ossindex.sonatype.org/component/pkg:maven/org.apache.flink/flink-runtime@1.14.3) 
+must be available.
+
+The guides in this section will show you how to configure your projects via popular build tools
+([Maven]({{< ref "docs/dev/configuration/maven" >}}), [Gradle]({{< ref "docs/dev/configuration/gradle" >}}),
+[sbt]({{< ref "docs/dev/configuration/sbt" >}})), add the necessary dependencies 
+(i.e. [connectors and formats]({{< ref "docs/dev/configuration/connector" >}}), 
+[testing]({{< ref "docs/dev/configuration/testing" >}})), and cover some 
+[advanced]({{< ref "docs/dev/configuration/advanced" >}}) configuration topics. 
+
+## Getting started
+
+To get started working on your Flink application, use the following commands, scripts, and templates 
+to create a Flink project.  
+
+{{< tabs "creating project" >}}
+{{< tab "Maven" >}}
+
+You can create a project based on an [Archetype](https://maven.apache.org/guides/introduction/introduction-to-archetypes.html)
+with the Maven command below or use the provided quickstart bash script.
+
+### Maven command
+```bash
+$ mvn archetype:generate                \
+  -DarchetypeGroupId=org.apache.flink   \
+  -DarchetypeArtifactId=flink-quickstart-java \
+  -DarchetypeVersion={{< version >}}
+```
+This allows you to name your newly created project and will interactively ask you for the groupId,
+artifactId, and package name.
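+
+If you prefer to skip the interactive prompts, you can also pass the project coordinates directly on
+the command line (a sketch; the coordinate values below are placeholders you should adapt):
+
+```bash
+$ mvn archetype:generate                      \
+  -DarchetypeGroupId=org.apache.flink         \
+  -DarchetypeArtifactId=flink-quickstart-java \
+  -DarchetypeVersion={{< version >}}          \
+  -DgroupId=org.myorg.quickstart              \
+  -DartifactId=my-flink-job                   \
+  -Dversion=0.1-SNAPSHOT                      \
+  -Dpackage=org.myorg.quickstart              \
+  -DinteractiveMode=false
+```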
+
+### Quickstart script
+```bash
+$ curl https://flink.apache.org/q/quickstart.sh | bash -s {{< version >}}
+```
+
+{{< hint info >}}
+For Maven 3.0 or higher, it is no longer possible to specify the repository (-DarchetypeCatalog) via
+the command line. For details about this change, please refer to the <a href="http://maven.apache.org/archetype/maven-archetype-plugin/archetype-repository.html">official Maven documentation</a>. If you wish to use a snapshot repository, you need to add a
+repository entry to your `settings.xml` file. For example:
+
+```xml
+<settings>
+  <activeProfiles>
+    <activeProfile>apache</activeProfile>
+  </activeProfiles>
+  <profiles>
+    <profile>
+      <id>apache</id>
+      <repositories>
+        <repository>
+          <id>apache-snapshots</id>
+          <url>https://repository.apache.org/content/repositories/snapshots/</url>
+        </repository>
+      </repositories>
+    </profile>
+  </profiles>
+</settings>
+```
+{{< /hint >}}
+
+{{< /tab >}}
+{{< tab "Gradle" >}}
+You can create a project with a Gradle build script or use the provided quickstart bash script.
+
+### Gradle build script
+
+To execute these build configuration scripts, run the `gradle` command in the directory with these scripts.
+
+**build.gradle**
+
+```gradle
+buildscript {
+    repositories {
+        jcenter() // this applies only to the Gradle 'Shadow' plugin
+    }
+    dependencies {
+        classpath 'com.github.jengelman.gradle.plugins:shadow:2.0.4'
+    }
+}
+plugins {
+    id 'java'
+    id 'application'
+    // shadow plugin to produce fat JARs
+    id 'com.github.johnrengelman.shadow' version '2.0.4'
+}
+// artifact properties
+group = 'org.myorg.quickstart'
+version = '0.1-SNAPSHOT'
+mainClassName = 'org.myorg.quickstart.StreamingJob'
+description = """Flink Quickstart Job"""
+ext {
+    javaVersion = '1.8'
+    flinkVersion = '{{< version >}}'
+    scalaBinaryVersion = '{{< scala_version >}}'
+    slf4jVersion = '1.7.32'
+    log4jVersion = '2.17.1'
+}
+sourceCompatibility = javaVersion
+targetCompatibility = javaVersion
+tasks.withType(JavaCompile) {
+	options.encoding = 'UTF-8'
+}
+applicationDefaultJvmArgs = ["-Dlog4j.configurationFile=log4j2.properties"]
+task wrapper(type: Wrapper) {
+    gradleVersion = '3.1'
+}
+// declare where to find the dependencies of your project
+repositories {
+    mavenCentral()
+    maven { url "https://repository.apache.org/content/repositories/snapshots/" }
+}
+// NOTE: We cannot use "compileOnly" or "shadow" configurations since then we could not run code
+// in the IDE or with "gradle run". We also cannot exclude transitive dependencies from the
+// shadowJar yet (see https://github.com/johnrengelman/shadow/issues/159).
+// -> Explicitly define the libraries we want to be included in the "flinkShadowJar" configuration!
+configurations {
+    flinkShadowJar // dependencies which go into the shadowJar
+    // always exclude these (also from transitive dependencies) since they are provided by Flink
+    flinkShadowJar.exclude group: 'org.apache.flink', module: 'force-shading'
+    flinkShadowJar.exclude group: 'com.google.code.findbugs', module: 'jsr305'
+    flinkShadowJar.exclude group: 'org.slf4j'
+    flinkShadowJar.exclude group: 'org.apache.logging.log4j'
+}
+// declare the dependencies for your production and test code
+dependencies {
+    // --------------------------------------------------------------
+    // Compile-time dependencies that should NOT be part of the
+    // shadow jar and are provided in the lib folder of Flink
+    // --------------------------------------------------------------
+    compile "org.apache.flink:flink-streaming-java:${flinkVersion}"
+    compile "org.apache.flink:flink-clients:${flinkVersion}"
+    // --------------------------------------------------------------
+    // Dependencies that should be part of the shadow jar, e.g.
+    // connectors. These must be in the flinkShadowJar configuration!
+    // --------------------------------------------------------------
+    //flinkShadowJar "org.apache.flink:flink-connector-kafka:${flinkVersion}"
+    compile "org.apache.logging.log4j:log4j-api:${log4jVersion}"
+    compile "org.apache.logging.log4j:log4j-core:${log4jVersion}"
+    compile "org.apache.logging.log4j:log4j-slf4j-impl:${log4jVersion}"
+    compile "org.slf4j:slf4j-log4j12:${slf4jVersion}"
+    // Add test dependencies here.
+    // testCompile "junit:junit:4.12"
+}
+// make compileOnly dependencies available for tests:
+sourceSets {
+    main.compileClasspath += configurations.flinkShadowJar
+    main.runtimeClasspath += configurations.flinkShadowJar
+    test.compileClasspath += configurations.flinkShadowJar
+    test.runtimeClasspath += configurations.flinkShadowJar
+    javadoc.classpath += configurations.flinkShadowJar
+}
+run.classpath = sourceSets.main.runtimeClasspath
+jar {
+    manifest {
+        attributes 'Built-By': System.getProperty('user.name'),
+                'Build-Jdk': System.getProperty('java.version')
+    }
+}
+shadowJar {
+    configurations = [project.configurations.flinkShadowJar]
+}
+```
+
+**settings.gradle**
+```gradle
+rootProject.name = 'quickstart'
+```
+### Quickstart script
+```bash
+bash -c "$(curl https://flink.apache.org/q/gradle-quickstart.sh)" -- {{< version >}} {{< scala_version >}}
+```
+{{< /tab >}}
+{{< tab "sbt" >}}
+You can scaffold a new Flink project with the following [giter8 template](https://github.com/tillrohrmann/flink-project.g8)
+and the `sbt new` command (which creates new build definitions from a template) or use the provided quickstart bash script.
+
+### sbt template
+```bash
+$ sbt new tillrohrmann/flink-project.g8
+```
+
+### Quickstart script
+```bash
+$ bash <(curl https://flink.apache.org/q/sbt-quickstart.sh)
+```
+{{< /tab >}}
+{{< /tabs >}}
+
+## Which dependencies do you need?
+
+Depending on what you want to achieve, you will choose a combination of the available APIs, each of
+which requires different dependencies.
+
+Here is a table of the artifact/dependency names (an example declaration follows the table):
+
+| APIs you want to use              | Dependency you need to add    |
+|-----------------------------------|-------------------------------|
+| DataStream                        | flink-streaming-java          |  
+| DataStream with Scala             | flink-streaming-scala{{< scala_version >}}         |   
+| Table API                         | flink-table-api-java          |   
+| Table API with Scala              | flink-table-api-scala{{< scala_version >}}         |
+| Table API + DataStream            | flink-table-api-java-bridge   |
+| Table API + DataStream with Scala | flink-table-api-scala-bridge{{< scala_version >}}  |
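+
+For example (a sketch, not a complete build file), using the Table API together with the DataStream
+API in Java would require the following Maven dependency; adapt the coordinates to your build tool.
+The scope is shown as *provided*, since the APIs are part of the Flink distribution:
+
+```xml
+<dependency>
+    <groupId>org.apache.flink</groupId>
+    <artifactId>flink-table-api-java-bridge</artifactId>
+    <version>{{< version >}}</version>
+    <scope>provided</scope>
+</dependency>
+```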

Review comment:
       Link to this page https://nightlies.apache.org/flink/flink-docs-master/docs/dev/table/data_stream_api/

##########
File path: docs/content/docs/dev/configuration/gradle.md
##########
@@ -0,0 +1,110 @@
+---
+title: "Using Gradle"
+weight: 3
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# How to use Gradle to configure your project
+
+You will likely need a build tool to configure your Flink project. This guide will show you how to
+do so with [Gradle](https://gradle.org), an open-source general-purpose build tool that can be used 
+to automate tasks in the development process.
+
+## Requirements
+
+- Gradle 3.x (or higher)
+- Java 8.x
+
+## Importing the project into your IDE
+
+Once the project folder and files have been created, we recommend that you import this project into
+your IDE for developing and testing.
+
+
+IntelliJ IDEA supports Gradle projects via the `Gradle` plugin.
+
+Eclipse does so via the [Eclipse Buildship](https://projects.eclipse.org/projects/tools.buildship)
+plugin (make sure to specify a Gradle version >= 3.0 in the last step of the import wizard; the `shadow`
+plugin requires it). You may also use [Gradle's IDE integration](https://docs.gradle.org/current/userguide/userguide.html#ide-integration)
+to create project files with Gradle.
+
+*Note*: The default JVM heap size for Java may be too small for Flink and you have to manually increase it.
+In Eclipse, choose `Run Configurations -> Arguments` and write into the `VM Arguments` box: `-Xmx800m`.
+In IntelliJ IDEA, the recommended way to change JVM options is from the `Help | Edit Custom VM Options` menu.
+See [this article](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties) for details.
+
+## Building the project
+
+If you want to __build/package your project__, go to your project directory and
+run the '`gradle clean shadowJar`' command.
+You will __find a JAR file__ that contains your application, plus connectors and libraries
+that you may have added as dependencies to the application: `build/libs/<project-name>-<version>-all.jar`.
+
+__Note:__ If you use a different class than *StreamingJob* as the application's main class / entry point,
+we recommend you change the `mainClassName` setting in the `build.gradle` file accordingly. That way, Flink
+can run the application from the JAR file without additionally specifying the main class.
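+
+For example, a minimal sketch of such a change in `build.gradle`, assuming a hypothetical entry point
+class `org.myorg.quickstart.MyJob`:
+
+```gradle
+// point the application plugin at your own main class
+mainClassName = 'org.myorg.quickstart.MyJob'
+```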
+
+## Adding dependencies to the project
+
+Specify a dependency configuration in the dependencies block of your `build.gradle` file.
+
+For example, you can add the Kafka connector as a dependency like this:
+
+**build.gradle**
+
+```gradle
+...
+dependencies {
+    ...  
+    flinkShadowJar "org.apache.flink:flink-connector-kafka:${flinkVersion}"
+    ...
+}
+...
+```
+
+**Important:** Note that all these (core) dependencies should have their scope set to [*provided*](https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html#dependency-scope). This means that
+they are needed to compile against, but that they should not be packaged into the project's resulting
+application JAR file. If not set to *provided*, the best case scenario is that the resulting JAR
+becomes excessively large, because it also contains all Flink core dependencies. The worst case scenario
+is that the Flink core dependencies that are added to the application's JAR file clash with some of
+your own dependency versions (which is normally avoided through inverted classloading).
+
+To correctly package the dependencies into the application JAR, these application dependencies must
+be set to the *compile* scope.
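+
+As a sketch of how this maps to the quickstart `build.gradle` shown on the overview page: core API
+dependencies stay in the `compile` configuration, while dependencies that must be bundled go into the
+`flinkShadowJar` configuration:
+
+```gradle
+dependencies {
+    // core Flink APIs: needed to compile and to run in the IDE, but provided by the cluster at runtime
+    compile "org.apache.flink:flink-streaming-java:${flinkVersion}"
+    // application dependencies (e.g. connectors) that must end up in the shaded application JAR
+    flinkShadowJar "org.apache.flink:flink-connector-kafka:${flinkVersion}"
+}
+```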
+
+## Packaging the application
+
+Depending on your use case, you may need to package your Flink application in different ways before it
+gets deployed to a Flink environment.
+
+If you want to create a JAR for a Flink Job and use only Flink dependencies without any third-party
+dependencies (e.g. using the filesystem connector with the JSON format), you do not need to create an
+uber/fat JAR or shade any dependencies.
+
+If you want to create a JAR for a Flink Job and use external dependencies not built into the Flink
+distribution, you can either add them to the classpath of the distribution or shade them into your
+uber/fat application JAR.
+
+With the generated uber/fat JAR, you can submit it to a local or remote cluster with:
+
+```sh
+bin/flink run -c org.example.MyJob myFatJar.jar
+```

Review comment:
       @matriv this needs the snippet that explains how to create a fat jar, like the maven page

##########
File path: docs/content/docs/dev/configuration/maven.md
##########
@@ -0,0 +1,164 @@
+---
+title: "Using Maven"
+weight: 2
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# How to use Maven to configure your project
+
+You will likely need a build tool to configure your Flink project. This guide will show you how to 
+do so with [Maven](https://maven.apache.org), an open-source build automation tool developed by the 
+Apache Group that enables you to build, publish, and deploy projects. You can use it to manage the 
+entire lifecycle of your software project.
+
+## Requirements
+
+- Maven 3.0.4 (or higher)
+- Java 8.x
+
+## Importing the project into your IDE
+
+Once the project folder and files have been created, we recommend that you import this project into
+your IDE for developing and testing.
+
+IntelliJ IDEA supports Maven projects out-of-the-box. Eclipse offers the [m2e plugin](http://www.eclipse.org/m2e/) 
+to [import Maven projects](http://books.sonatype.com/m2eclipse-book/reference/creating-sect-importing-projects.html#fig-creating-import).
+
+*Note*: The default JVM heap size for Java may be too small for Flink and you have to manually increase it.
+In Eclipse, choose `Run Configurations -> Arguments` and write into the `VM Arguments` box: `-Xmx800m`.
+In IntelliJ IDEA, the recommended way to change JVM options is from the `Help | Edit Custom VM Options` menu.
+See [this article](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties) for details.
+
+## Building the project
+
+If you want to build/package your project, navigate to your project directory and run the
+'`mvn clean package`' command. You will find a JAR file that contains your application (plus connectors
+and libraries that you may have added as dependencies to the application) here: `target/<artifact-id>-<version>.jar`.
+
+__Note:__ If you used a different class than `DataStreamJob` as the application's main class / entry point,
+we recommend you change the `mainClass` setting in the `pom.xml` file accordingly so that Flink
+can run the application from the JAR file without additionally specifying the main class.
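+
+For illustration, in a POM that uses the Maven Shade Plugin (see the template at the end of this page),
+the entry point is declared via the `ManifestResourceTransformer`; the class name below is a placeholder:
+
+```xml
+<transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
+    <mainClass>org.myorg.quickstart.DataStreamJob</mainClass>
+</transformer>
+```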
+
+## Adding dependencies to the project
+
+Open the `pom.xml` file in your project directory and add the dependency inside the
+`<dependencies>` tag.
+
+For example, you can add the Kafka connector as a dependency like this:
+
+```xml
+<dependencies>
+    
+    <dependency>
+        <groupId>org.apache.flink</groupId>
+        <artifactId>flink-connector-kafka</artifactId>
+        <version>{{< version >}}</version>
+    </dependency>
+    
+</dependencies>
+```
+
+Then execute `mvn install` on the command line. 
+
+Projects created from the `Java Project Template`, the `Scala Project Template`, or Gradle are configured
+to automatically include the application dependencies into the application JAR when you run `mvn clean package`.
+For projects that are not set up from those templates, we recommend adding the Maven Shade Plugin to
+build the application jar with all required dependencies.
+
+**Important:** Note that all these (core) dependencies should have their scope set to [*provided*](https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html#dependency-scope). This means that
+they are needed to compile against, but that they should not be packaged into the project's resulting
+application JAR file. If not set to *provided*, the best case scenario is that the resulting JAR
+becomes excessively large, because it also contains all Flink core dependencies. The worst case scenario
+is that the Flink core dependencies that are added to the application's JAR file clash with some of
+your own dependency versions (which is normally avoided through inverted classloading).
+
+To correctly package the dependencies into the application JAR, these application dependencies must 
+be set to the *compile* scope.
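+
+A minimal sketch of an API dependency marked as *provided* in the `pom.xml`, using the DataStream API
+as an example:
+
+```xml
+<dependency>
+    <groupId>org.apache.flink</groupId>
+    <artifactId>flink-streaming-java</artifactId>
+    <version>{{< version >}}</version>
+    <scope>provided</scope>
+</dependency>
+```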
+
+## Packaging the application
+
+Depending on your use case, you may need to package your Flink application in different ways before it
+gets deployed to a Flink environment. 
+
+If you want to create a JAR for a Flink Job and use only Flink dependencies without any third-party 
+dependencies (e.g. using the filesystem connector with the JSON format), you do not need to create an 
+uber/fat JAR or shade any dependencies.
+
+If you want to create a JAR for a Flink Job and use external dependencies not built into the Flink 
+distribution, you can either add them to the classpath of the distribution or shade them into your 
+uber/fat application JAR.
+
+With the generated uber/fat JAR, you can submit it to a local or remote cluster with:
+
+```sh
+bin/flink run -c org.example.MyJob myFatJar.jar
+```
+
+## Template for building a JAR with dependencies
+
+To build an application JAR that contains all dependencies required for declared connectors and libraries,
+you can use the following shade plugin definition:
+
+```xml
+<build>
+    <plugins>
+        <plugin>
+            <groupId>org.apache.maven.plugins</groupId>
+            <artifactId>maven-shade-plugin</artifactId>
+            <version>3.1.1</version>

Review comment:
       Bump this to 3.2.4

##########
File path: docs/content/docs/dev/configuration/connector.md
##########
@@ -0,0 +1,63 @@
+---
+title: "Dependencies: Connectors and Formats"
+weight: 5
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Dependencies: Connectors and Formats

Review comment:
       Rename to just _Connectors and Formats_

##########
File path: docs/content/docs/dev/configuration/maven.md
##########
@@ -0,0 +1,164 @@
+---
+title: "Using Maven"
+weight: 2
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# How to use Maven to configure your project
+
+You will likely need a build tool to configure your Flink project. This guide will show you how to 
+do so with [Maven](https://maven.apache.org), an open-source build automation tool developed by the 
+Apache Group that enables you to build, publish, and deploy projects. You can use it to manage the 
+entire lifecycle of your software project.
+
+## Requirements
+
+- Maven 3.0.4 (or higher)
+- Java 8.x
+
+## Importing the project into your IDE
+
+Once the project folder and files have been created, we recommend that you import this project into
+your IDE for developing and testing.
+
+IntelliJ IDEA supports Maven projects out-of-the-box. Eclipse offers the [m2e plugin](http://www.eclipse.org/m2e/) 
+to [import Maven projects](http://books.sonatype.com/m2eclipse-book/reference/creating-sect-importing-projects.html#fig-creating-import).
+
+*Note*: The default JVM heap size for Java may be too small for Flink and you have to manually increase it.
+In Eclipse, choose `Run Configurations -> Arguments` and write into the `VM Arguments` box: `-Xmx800m`.
+In IntelliJ IDEA, the recommended way to change JVM options is from the `Help | Edit Custom VM Options` menu.
+See [this article](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties) for details.
+
+## Building the project
+
+If you want to build/package your project, navigate to your project directory and run the
+'`mvn clean package`' command. You will find a JAR file that contains your application (plus connectors
+and libraries that you may have added as dependencies to the application) here: `target/<artifact-id>-<version>.jar`.
+
+__Note:__ If you used a different class than `DataStreamJob` as the application's main class / entry point,
+we recommend you change the `mainClass` setting in the `pom.xml` file accordingly so that Flink
+can run the application from the JAR file without additionally specifying the main class.
+
+## Adding dependencies to the project
+
+Open the `pom.xml` file in your project directory and add the dependency inside the
+`<dependencies>` tag.
+
+For example, you can add the Kafka connector as a dependency like this:
+
+```xml
+<dependencies>
+    
+    <dependency>
+        <groupId>org.apache.flink</groupId>
+        <artifactId>flink-connector-kafka</artifactId>
+        <version>{{< version >}}</version>
+    </dependency>
+    
+</dependencies>
+```
+
+Then execute `mvn install` on the command line. 
+
+Projects created from the `Java Project Template`, the `Scala Project Template`, or Gradle are configured
+to automatically include the application dependencies into the application JAR when you run `mvn clean package`.
+For projects that are not set up from those templates, we recommend adding the Maven Shade Plugin to
+build the application jar with all required dependencies.
+
+**Important:** Note that all these (core) dependencies should have their scope set to [*provided*](https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html#dependency-scope). This means that

Review comment:
       _core_ here refers to the api deps right?

##########
File path: docs/content/docs/dev/configuration/gradle.md
##########
@@ -0,0 +1,110 @@
+---
+title: "Using Gradle"
+weight: 3
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# How to use Gradle to configure your project
+
+You will likely need a build tool to configure your Flink project. This guide will show you how to
+do so with [Gradle](https://gradle.org), an open-source general-purpose build tool that can be used 
+to automate tasks in the development process.
+
+## Requirements
+
+- Gradle 3.x (or higher)
+- Java 8.x
+
+## Importing the project into your IDE
+
+Once the project folder and files have been created, we recommend that you import this project into
+your IDE for developing and testing.
+
+
+IntelliJ IDEA supports Gradle projects via the `Gradle` plugin.
+
+Eclipse does so via the [Eclipse Buildship](https://projects.eclipse.org/projects/tools.buildship)
+plugin (make sure to specify a Gradle version >= 3.0 in the last step of the import wizard; the `shadow`
+plugin requires it). You may also use [Gradle's IDE integration](https://docs.gradle.org/current/userguide/userguide.html#ide-integration)
+to create project files with Gradle.
+
+*Note*: The default JVM heap size for Java may be too small for Flink and you have to manually increase it.
+In Eclipse, choose `Run Configurations -> Arguments` and write into the `VM Arguments` box: `-Xmx800m`.
+In IntelliJ IDEA, the recommended way to change JVM options is from the `Help | Edit Custom VM Options` menu.
+See [this article](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties) for details.
+
+## Building the project
+
+If you want to __build/package your project__, go to your project directory and
+run the '`gradle clean shadowJar`' command.
+You will __find a JAR file__ that contains your application, plus connectors and libraries
+that you may have added as dependencies to the application: `build/libs/<project-name>-<version>-all.jar`.
+
+__Note:__ If you use a different class than *StreamingJob* as the application's main class / entry point,
+we recommend you change the `mainClassName` setting in the `build.gradle` file accordingly. That way, Flink
+can run the application from the JAR file without additionally specifying the main class.
+
+## Adding dependencies to the project
+
+Specify a dependency configuration in the dependencies block of your `build.gradle` file.
+
+For example, you can add the Kafka connector as a dependency like this:
+
+**build.gradle**
+
+```gradle
+...
+dependencies {
+    ...  
+    flinkShadowJar "org.apache.flink:flink-connector-kafka:${flinkVersion}"
+    ...
+}
+...
+```
+
+**Important:** Note that all these (core) dependencies should have their scope set to [*provided*](https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html#dependency-scope). This means that
+they are needed to compile against, but that they should not be packaged into the project's resulting
+application JAR file. If not set to *provided*, the best case scenario is that the resulting JAR
+becomes excessively large, because it also contains all Flink core dependencies. The worst case scenario
+is that the Flink core dependencies that are added to the application's JAR file clash with some of
+your own dependency versions (which is normally avoided through inverted classloading).
+
+To correctly package the dependencies into the application JAR, these application dependencies must
+be set to the *compile* scope.
+
+## Packaging the application
+
+Depending on your use case, you may need to package your Flink application in different ways before it
+gets deployed to a Flink environment.
+
+If you want to create a JAR for a Flink Job and use only Flink dependencies without any third-party
+dependencies (e.g. using the filesystem connector with the JSON format), you do not need to create an
+uber/fat JAR or shade any dependencies.
+
+If you want to create a JAR for a Flink Job and use external dependencies not built into the Flink
+distribution, you can either add them to the classpath of the distribution or shade them into your
+uber/fat application JAR.
+
+With the generated uber/fat JAR, you can submit it to a local or remote cluster with:
+
+```sh
+bin/flink run -c org.example.MyJob myFatJar.jar
+```

Review comment:
       Also @infoverload same comment as the maven page, end this paragraph with a link to the deployment guide

##########
File path: docs/content/docs/dev/configuration/maven.md
##########
@@ -0,0 +1,164 @@
+---
+title: "Using Maven"
+weight: 2
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# How to use Maven to configure your project
+
+You will likely need a build tool to configure your Flink project. This guide will show you how to 
+do so with [Maven](https://maven.apache.org), an open-source build automation tool developed by the 
+Apache Group that enables you to build, publish, and deploy projects. You can use it to manage the 
+entire lifecycle of your software project.
+
+## Requirements
+
+- Maven 3.0.4 (or higher)
+- Java 8.x
+
+## Importing the project into your IDE
+
+Once the project folder and files have been created, we recommend that you import this project into
+your IDE for developing and testing.
+
+IntelliJ IDEA supports Maven projects out-of-the-box. Eclipse offers the [m2e plugin](http://www.eclipse.org/m2e/) 
+to [import Maven projects](http://books.sonatype.com/m2eclipse-book/reference/creating-sect-importing-projects.html#fig-creating-import).
+
+*Note*: The default JVM heap size for Java may be too small for Flink and you have to manually increase it.
+In Eclipse, choose `Run Configurations -> Arguments` and write into the `VM Arguments` box: `-Xmx800m`.
+In IntelliJ IDEA, the recommended way to change JVM options is from the `Help | Edit Custom VM Options` menu.
+See [this article](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties) for details.
+
+## Building the project
+
+If you want to build/package your project, navigate to your project directory and run the
+'`mvn clean package`' command. You will find a JAR file that contains your application (plus connectors
+and libraries that you may have added as dependencies to the application) here: `target/<artifact-id>-<version>.jar`.
+
+__Note:__ If you used a different class than `DataStreamJob` as the application's main class / entry point,
+we recommend you change the `mainClass` setting in the `pom.xml` file accordingly so that Flink
+can run the application from the JAR file without additionally specifying the main class.
+
+## Adding dependencies to the project
+
+Open the `pom.xml` file in your project directory and add the dependency inside the
+`<dependencies>` tag.
+
+For example, you can add the Kafka connector as a dependency like this:
+
+```xml
+<dependencies>
+    
+    <dependency>
+        <groupId>org.apache.flink</groupId>
+        <artifactId>flink-connector-kafka</artifactId>
+        <version>{{< version >}}</version>
+    </dependency>
+    
+</dependencies>
+```
+
+Then execute `mvn install` on the command line. 
+
+Projects created from the `Java Project Template`, the `Scala Project Template`, or Gradle are configured
+to automatically include the application dependencies into the application JAR when you run `mvn clean package`.
+For projects that are not set up from those templates, we recommend adding the Maven Shade Plugin to
+build the application jar with all required dependencies.
+
+**Important:** Note that all these (core) dependencies should have their scope set to [*provided*](https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html#dependency-scope). This means that
+they are needed to compile against, but that they should not be packaged into the project's resulting
+application JAR file. If not set to *provided*, the best case scenario is that the resulting JAR
+becomes excessively large, because it also contains all Flink core dependencies. The worst case scenario
+is that the Flink core dependencies that are added to the application's JAR file clash with some of
+your own dependency versions (which is normally avoided through inverted classloading).
+
+To correctly package the dependencies into the application JAR, these application dependencies must 
+be set to the *compile* scope.
+
+## Packaging the application
+
+Depending on your use case, you may need to package your Flink application in different ways before it
+gets deployed to a Flink environment. 
+
+If you want to create a JAR for a Flink Job and use only Flink dependencies without any third-party 
+dependencies (i.e. using the filesystem connector with JSON format), you do not need to create an 
+uber/fat JAR or shade any dependencies.
+
+If you want to create a JAR for a Flink Job and use external dependencies not built into the Flink 
+distribution, you can either add them to the classpath of the distribution or shade them into your 
+uber/fat application JAR.
+
+With the generated uber/fat JAR, you can submit it to a local or remote cluster with:
+
+```sh
+bin/flink run -c org.example.MyJob myFatJar.jar
+```
+
+## Template for building a JAR with dependencies
+
+To build an application JAR that contains all dependencies required for declared connectors and libraries,
+you can use the following shade plugin definition:
+
+```xml
+<build>
+    <plugins>
+        <plugin>
+            <groupId>org.apache.maven.plugins</groupId>
+            <artifactId>maven-shade-plugin</artifactId>
+            <version>3.1.1</version>
+            <executions>
+                <execution>
+                    <phase>package</phase>
+                    <goals>
+                        <goal>shade</goal>
+                    </goals>
+                    <configuration>
+                        <artifactSet>
+                            <excludes>
+                                <exclude>com.google.code.findbugs:jsr305</exclude>
+                                <exclude>org.slf4j:*</exclude>
+                                <exclude>log4j:*</exclude>
+                            </excludes>
+                        </artifactSet>
+                        <filters>
+                            <filter>
+                                <!-- Do not copy the signatures in the META-INF folder.
+                                Otherwise, this might cause SecurityExceptions when using the JAR. -->
+                                <artifact>*:*</artifact>
+                                <excludes>
+                                    <exclude>META-INF/*.SF</exclude>
+                                    <exclude>META-INF/*.DSA</exclude>
+                                    <exclude>META-INF/*.RSA</exclude>
+                                </excludes>
+                            </filter>
+                        </filters>
+                        <transformers>
+                            <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
+                                <mainClass>my.programs.main.clazz</mainClass>

Review comment:
       Add:
   
   ```
   <!-- Replace this with the main class of your job -->
   ```

##########
File path: docs/content/docs/dev/configuration/connector.md
##########
@@ -0,0 +1,63 @@
+---
+title: "Dependencies: Connectors and Formats"
+weight: 5
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Dependencies: Connectors and Formats
+
+Flink can read from and write to various external systems via [connectors]({{< ref "docs/connectors/table/overview" >}})
+and define the [format]({{< ref "docs/connectors/table/formats/overview" >}}) in which to store the 
+data.  
+
+Data in external systems is stored in a particular serialized representation, and Flink needs to
+know how to read and write this data. This is done through format dependencies.
+
+Most applications need specific connectors to run. Flink provides a set of formats that can be used 
+with connectors (with the dependencies for both being fairly unified). These are not part of Flink's 
+core dependencies and must be added as dependencies to the application.
+
+## Adding Dependencies 
+
+For more information on how to add dependencies, refer to the build tools sections on [Maven]({{< ref "docs/dev/configuration/maven" >}}),
+[Gradle]({{< ref "docs/dev/configuration/gradle" >}}), or [sbt]({{< ref "docs/dev/configuration/sbt" >}}).
+
+## Packaging Dependencies
+
+We recommend packaging the application code and all its required dependencies into one *JAR-with-dependencies*
+which we refer to as the *application JAR*. The application JAR can be submitted to an already running
+Flink cluster, or added to a Flink application container image.
+
+On [Maven Central](https://search.maven.org), we publish connectors named `flink-connector-<NAME>` and
+`flink-sql-connector-<NAME>`. The former are thin JARs, while the latter are uber JARs.
+
+To use the uber JARs, you can either shade them into the uber JAR of your application (including and
+renaming their dependencies to create a private copy), or add them to the `/lib` folder of the
+distribution (i.e. SQL client).

Review comment:
       I would remove i.e. SQL client, as it's a different thing. 

##########
File path: docs/content/docs/dev/configuration/advanced.md
##########
@@ -0,0 +1,158 @@
+---
+title: "Advanced Configuration"
+weight: 10
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Advanced Configuration Topics

Review comment:
       Reading this page, I start thinking it sounds more like an FAQ page, WDYT?

##########
File path: docs/content/docs/dev/configuration/connector.md
##########
@@ -0,0 +1,63 @@
+---
+title: "Dependencies: Connectors and Formats"
+weight: 5
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Dependencies: Connectors and Formats
+
+Flink can read from and write to various external systems via [connectors]({{< ref "docs/connectors/table/overview" >}})
+and define the [format]({{< ref "docs/connectors/table/formats/overview" >}}) in which to store the 
+data.  
+
+Data in external systems is stored in a particular serialized representation, and Flink needs to
+know how to read and write this data. This is done through format dependencies.
+
+Most applications need specific connectors to run. Flink provides a set of formats that can be used 
+with connectors (with the dependencies for both being fairly unified). These are not part of Flink's 
+core dependencies and must be added as dependencies to the application.
+
+## Adding Dependencies 
+
+For more information on how to add dependencies, refer to the build tools sections on [Maven]({{< ref "docs/dev/configuration/maven" >}}),
+[Gradle]({{< ref "docs/dev/configuration/gradle" >}}), or [sbt]({{< ref "docs/dev/configuration/sbt" >}}).
+
+## Packaging Dependencies
+
+We recommend packaging the application code and all its required dependencies into one *JAR-with-dependencies*
+which we refer to as the *application JAR*. The application JAR can be submitted to an already running
+Flink cluster, or added to a Flink application container image.
+
+On [Maven Central](https://search.maven.org), we publish connectors named `flink-connector-<NAME>` and
+`flink-sql-connector-<NAME>`. The former are thin JARs, while the latter are uber JARs.
+
+To use the uber JARs, you can either shade them into the uber JAR of your application (including and
+renaming their dependencies to create a private copy), or add them to the `/lib` folder of the
+distribution (i.e. SQL client).
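+
+As an illustration (using Kafka as an example), the two artifact flavors would be declared with Maven
+coordinates roughly as follows:
+
+```xml
+<!-- thin JAR: bundle it (and its transitive dependencies) into your application JAR -->
+<dependency>
+    <groupId>org.apache.flink</groupId>
+    <artifactId>flink-connector-kafka</artifactId>
+    <version>{{< version >}}</version>
+</dependency>
+
+<!-- uber JAR: self-contained, e.g. for dropping into the /lib folder of the distribution -->
+<dependency>
+    <groupId>org.apache.flink</groupId>
+    <artifactId>flink-sql-connector-kafka</artifactId>
+    <version>{{< version >}}</version>
+</dependency>
+```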
+
+If you shade a dependency JAR, you will have more control over the dependency version in the job JAR. 

Review comment:
       _If you shade a dependency, ..._ without JAR, as it's implied that we're talking about JARs here.

##########
File path: docs/content/docs/dev/configuration/testing.md
##########
@@ -0,0 +1,44 @@
+---
+title: "Test Dependencies"
+weight: 6
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Dependencies for Testing 
+
+## DataStream API Test Dependencies
+
+You need to add the following dependencies if you want to develop tests for a job built with the 
+DataStream API:
+
+{{< artifact flink-test-utils withTestScope >}}
+{{< artifact flink-runtime withTestScope >}}
+
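+For reference, the shortcodes above correspond to Maven dependencies roughly like the following
+(shown for `flink-test-utils`; the same pattern applies to `flink-runtime`):
+
+```xml
+<dependency>
+    <groupId>org.apache.flink</groupId>
+    <artifactId>flink-test-utils</artifactId>
+    <version>{{< version >}}</version>
+    <scope>test</scope>
+</dependency>
+```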

Review comment:
       Can we link the page where it's explained how to use these utilities? https://nightlies.apache.org/flink/flink-docs-master/docs/dev/datastream/testing/

##########
File path: docs/content/docs/dev/configuration/connector.md
##########
@@ -0,0 +1,63 @@
+---
+title: "Dependencies: Connectors and Formats"
+weight: 5
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Dependencies: Connectors and Formats

Review comment:
       @fapaul can you do a review of this page?

##########
File path: docs/content/docs/dev/configuration/testing.md
##########
@@ -0,0 +1,44 @@
+---
+title: "Test Dependencies"
+weight: 6
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Dependencies for Testing 
+

Review comment:
       Add some "filler" sentence here? Like "Flink provides utilities for testing your job"...

##########
File path: docs/content/docs/dev/configuration/advanced.md
##########
@@ -0,0 +1,158 @@
+---
+title: "Advanced Configuration"
+weight: 10
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Advanced Configuration Topics
+
+## Dependencies: Flink Core and User Application
+
+There are two broad categories of dependencies and libraries in Flink, which are explained below.
+
+### Flink Core Dependencies
+
+Flink itself consists of a set of classes and dependencies that form the core of Flink's runtime
+and must be present when a Flink application is started. The classes and dependencies needed to run
+the system handle areas such as coordination, networking, checkpointing, failover, APIs,
+operators (such as windowing), resource management, etc.
+
+These core classes and dependencies are packaged in the `flink-dist` jar and are part of Flink's `lib`
+folder and of the basic Flink container images. You can think of these dependencies as similar to
+Java's core library (e.g. `rt.jar`, `charsets.jar`), which contains classes like `String` and `List`.
+
+To keep the core dependencies as small as possible and avoid dependency clashes, the Flink core
+dependencies do not contain any connectors or libraries (e.g. CEP, SQL, ML), so that the default
+classpath does not carry an excessive number of classes and dependencies.
+
+### User Application Dependencies
+
+These dependencies include all connectors, formats, or libraries that a specific user application
+needs. They explicitly do not include the Flink DataStream and Table APIs or the runtime dependencies,
+since those are already part of the Flink core dependencies.
+
+The user application is typically packaged into an *application jar*, which contains the application
+code and the required connector and library dependencies.

Review comment:
       Not sure if this text is needed anymore, as it's roughly repeating what's already said in the previous pages... Perhaps it needs some rewording? @MartijnVisser WDYT about it?

##########
File path: docs/content/docs/dev/configuration/advanced.md
##########
@@ -0,0 +1,158 @@
+---
+title: "Advanced Configuration"
+weight: 10
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Advanced Configuration Topics
+
+## Dependencies: Flink Core and User Application
+
+There are two broad categories of dependencies and libraries in Flink, which are explained below.
+
+### Flink Core Dependencies
+
+Flink itself consists of a set of classes and dependencies that form the core of Flink's runtime
+and must be present when a Flink application is started. The classes and dependencies needed to run
+the system handle areas such as coordination, networking, checkpointing, failover, APIs,
+operators (such as windowing), resource management, etc.
+
+These core classes and dependencies are packaged in the `flink-dist` jar and are part of Flink's `lib`
+folder and of the basic Flink container images. You can think of these dependencies as similar to
+Java's core library (e.g. `rt.jar`, `charsets.jar`), which contains classes like `String` and `List`.
+
+To keep the core dependencies as small as possible and avoid dependency clashes, the Flink core
+dependencies do not contain any connectors or libraries (e.g. CEP, SQL, ML), so that the default
+classpath does not carry an excessive number of classes and dependencies.
+
+### User Application Dependencies
+
+These dependencies include all connectors, formats, or libraries that a specific user application
+needs. They explicitly do not include the Flink DataStream and Table APIs or the runtime dependencies,
+since those are already part of the Flink core dependencies.
+
+The user application is typically packaged into an *application jar*, which contains the application
+code and the required connector and library dependencies.
+
+## IDE configuration
+
+The default JVM heap size for Java may be too small for Flink and you have to manually increase it.
+In Eclipse, choose `Run Configurations -> Arguments` and write `-Xmx800m` into the `VM Arguments` box.
+In IntelliJ IDEA, the recommended way to change JVM options is from the `Help | Edit Custom VM Options` menu.
+See [this article](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties) for details.
+
+**Note on IntelliJ:** To make the applications run within IntelliJ IDEA, it is necessary to tick the
+`Include dependencies with "Provided" scope` box in the run configuration. If this option is not available
+(possibly due to using an older IntelliJ IDEA version), then a workaround is to create a test that
+calls the application's `main()` method.

Review comment:
       This one is already in the other project conf pages?

##########
File path: docs/content/docs/dev/configuration/maven.md
##########
@@ -0,0 +1,164 @@
+---
+title: "Using Maven"
+weight: 2
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# How to use Maven to configure your project
+
+You will likely need a build tool to configure your Flink project. This guide will show you how to 
+do so with [Maven](https://maven.apache.org), an open-source build automation tool developed by the 
+Apache Group that enables you to build, publish, and deploy projects. You can use it to manage the 
+entire lifecycle of your software project.
+
+## Requirements
+
+- Maven 3.0.4 (or higher)
+- Java 8.x
+
+## Importing the project into your IDE
+
+Once the project folder and files have been created, we recommend that you import this project into
+your IDE for developing and testing.
+
+IntelliJ IDEA supports Maven projects out-of-the-box. Eclipse offers the [m2e plugin](http://www.eclipse.org/m2e/) 
+to [import Maven projects](http://books.sonatype.com/m2eclipse-book/reference/creating-sect-importing-projects.html#fig-creating-import).
+
+*Note*: The default JVM heap size for Java may be too small for Flink and you have to manually increase it.
+In Eclipse, choose `Run Configurations -> Arguments` and write into the `VM Arguments` box: `-Xmx800m`.
+In IntelliJ IDEA, the recommended way to change JVM options is from the `Help | Edit Custom VM Options` menu.
+See [this article](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties) for details.
+
+## Building the project
+
+If you want to build/package your project, navigate to your project directory and run the
+'`mvn clean package`' command. You will find a JAR file that contains your application (plus connectors
+and libraries that you may have added as dependencies to the application) here: `target/<artifact-id>-<version>.jar`.
+
+__Note:__ If you used a different class than `DataStreamJob` as the application's main class / entry point,
+we recommend you change the `mainClass` setting in the `pom.xml` file accordingly so that Flink
+can run the application from the JAR file without additionally specifying the main class.
+
+## Adding dependencies to the project
+
+Open the `pom.xml` file in your project directory and add the dependency inside the
+`<dependencies>` tag.
+
+For example, you can add the Kafka connector as a dependency like this:
+
+```xml
+<dependencies>
+    
+    <dependency>
+        <groupId>org.apache.flink</groupId>
+        <artifactId>flink-connector-kafka</artifactId>
+        <version>{{< version >}}</version>
+    </dependency>
+    
+</dependencies>
+```
+
+Then execute `mvn install` on the command line. 
+
+Projects created from the `Java Project Template`, the `Scala Project Template`, or Gradle are configured
+to automatically include the application dependencies into the application JAR when you run `mvn clean package`.
+For projects that are not set up from those templates, we recommend adding the Maven Shade Plugin to
+build the application jar with all required dependencies.
+
+**Important:** Note that all these (core) dependencies should have their scope set to [*provided*](https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html#dependency-scope). This means that
+they are needed to compile against, but that they should not be packaged into the project's resulting
+application JAR file. If not set to *provided*, the best case scenario is that the resulting JAR
+becomes excessively large, because it also contains all Flink core dependencies. The worst case scenario
+is that the Flink core dependencies that are added to the application's JAR file clash with some of
+your own dependency versions (which is normally avoided through inverted classloading).
+
+Application dependencies (such as connectors and formats), on the other hand, must be set to the
+*compile* scope so that they are correctly packaged into the application JAR.
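+
+As a sketch of how the two scopes fit together (assuming a DataStream application that also uses the
+Kafka connector; the exact artifacts your project needs may differ), Flink API dependencies are declared
+as *provided* while connectors keep the default *compile* scope:
+
+```xml
+<dependencies>
+    <!-- Flink API: needed to compile against, provided by the Flink runtime when the job runs -->
+    <dependency>
+        <groupId>org.apache.flink</groupId>
+        <artifactId>flink-streaming-java</artifactId>
+        <version>{{< version >}}</version>
+        <scope>provided</scope>
+    </dependency>
+
+    <!-- Application dependency: packaged into the application JAR (default "compile" scope) -->
+    <dependency>
+        <groupId>org.apache.flink</groupId>
+        <artifactId>flink-connector-kafka</artifactId>
+        <version>{{< version >}}</version>
+    </dependency>
+</dependencies>
+```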
+
+## Packaging the application
+
+Depending on your use case, you may need to package your Flink application in different ways before it
+gets deployed to a Flink environment. 
+
+If you want to create a JAR for a Flink Job that uses only Flink dependencies without any third-party 
+dependencies (e.g. using the filesystem connector with the JSON format), you do not need to create an 
+uber/fat JAR or shade any dependencies.
+
+If you want to create a JAR for a Flink Job that uses external dependencies not built into the Flink 
+distribution, you can either add them to the classpath of the distribution (for example, by placing them
+in the distribution's `lib/` folder) or shade them into your uber/fat application JAR.
+
+You can then submit the generated uber/fat JAR to a local or remote cluster with:
+
+```sh
+bin/flink run -c org.example.MyJob myFatJar.jar
+```
+
+## Template for building a JAR with dependencies
+
+To build an application JAR that contains all dependencies required for declared connectors and libraries,
+you can use the following shade plugin definition:
+
+```xml
+<build>
+    <plugins>
+        <plugin>
+            <groupId>org.apache.maven.plugins</groupId>
+            <artifactId>maven-shade-plugin</artifactId>
+            <version>3.1.1</version>
+            <executions>
+                <execution>
+                    <phase>package</phase>
+                    <goals>
+                        <goal>shade</goal>
+                    </goals>
+                    <configuration>
+                        <artifactSet>
+                            <excludes>
+                                <exclude>com.google.code.findbugs:jsr305</exclude>
+                                <exclude>org.slf4j:*</exclude>
+                                <exclude>log4j:*</exclude>
+                            </excludes>
+                        </artifactSet>
+                        <filters>
+                            <filter>
+                                <!-- Do not copy the signatures in the META-INF folder.
+                                Otherwise, this might cause SecurityExceptions when using the JAR. -->
+                                <artifact>*:*</artifact>
+                                <excludes>
+                                    <exclude>META-INF/*.SF</exclude>
+                                    <exclude>META-INF/*.DSA</exclude>
+                                    <exclude>META-INF/*.RSA</exclude>
+                                </excludes>
+                            </filter>
+                        </filters>
+                        <transformers>
+                            <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
+                                <mainClass>my.programs.main.clazz</mainClass>
+                            </transformer>
+                        </transformers>
+                    </configuration>
+                </execution>
+            </executions>
+        </plugin>
+    </plugins>
+</build>
+```

Review comment:
       It's also explained here https://stackoverflow.com/questions/24872859/how-does-the-maven-shade-plugin-decide-which-dependencies-to-put-into-the-final (but don't like the SO answer in the guide)




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495",
       "triggerID" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516",
       "triggerID" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "triggerType" : "PUSH"
     }, {
       "hash" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528",
       "triggerID" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 50e221c6373f2774eeb8ec134df27caef83b2d40 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495) 
   * 8e5ae4098896a176d49d4b7319f69b6529412549 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516) 
   * 532af8b3b761f7b589500e5ead9b888f2370675a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495",
       "triggerID" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516",
       "triggerID" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 50e221c6373f2774eeb8ec134df27caef83b2d40 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495) 
   * 8e5ae4098896a176d49d4b7319f69b6529412549 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] zentol commented on a change in pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
zentol commented on a change in pull request #18353:
URL: https://github.com/apache/flink/pull/18353#discussion_r798446772



##########
File path: docs/content/docs/dev/configuration/gradle.md
##########
@@ -0,0 +1,122 @@
+---
+title: "Using Gradle"
+weight: 3
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# How to use Gradle to configure your project
+
+You will likely need a build tool to configure your Flink project. This guide will show you how to
+do so with [Gradle](https://gradle.org), an open-source general-purpose build tool that can be used 
+to automate tasks in the development process.
+
+## Requirements
+
+- Gradle 7.x 
+- Java 8.x
+
+## Importing the project into your IDE
+
+Once the project folder and files have been created, we recommend that you import this project into
+your IDE for developing and testing.
+
+IntelliJ IDEA supports Gradle projects via the `Gradle` plugin.
+
+Eclipse does so via the [Eclipse Buildship](https://projects.eclipse.org/projects/tools.buildship)
+plugin (make sure to specify a Gradle version >= 3.0 in the last step of the import wizard; the `shadow`
+plugin requires it). You may also use [Gradle's IDE integration](https://docs.gradle.org/current/userguide/userguide.html#ide-integration)
+to create project files with Gradle.
+
+**Note**: The default JVM heap size for Java may be too small for Flink, and you may have to increase it manually.
+In Eclipse, choose `Run Configurations -> Arguments` and enter `-Xmx800m` in the `VM Arguments` box.
+In IntelliJ IDEA, the recommended way to change JVM options is via the `Help | Edit Custom VM Options` menu.
+See [this article](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties) for details.
+
+**Note on IntelliJ:** To make the applications run within IntelliJ IDEA, it is necessary to tick the
+`Include dependencies with "Provided" scope` box in the run configuration. If this option is not available
+(possibly due to using an older IntelliJ IDEA version), then a workaround is to create a test that
+calls the application's `main()` method.
+
+## Building the project
+
+If you want to __build/package your project__, go to your project directory and
+run the '`gradle clean shadowJar`' command.
+You will __find a JAR file__ that contains your application, plus connectors and libraries
+that you may have added as dependencies to the application: `build/libs/<project-name>-<version>-all.jar`.
+
+__Note:__ If you use a different class than *StreamingJob* as the application's main class / entry point,
+we recommend you change the `mainClassName` setting in the `build.gradle` file accordingly. That way, Flink
+can run the application from the JAR file without additionally specifying the main class.
+
+## Adding dependencies to the project
+
+Specify a dependency configuration in the dependencies block of your `build.gradle` file.
+
+For example, you can add the Kafka connector as a dependency like this:
+
+**build.gradle**
+
+```gradle
+...
+dependencies {
+    ...  
+    flinkShadowJar "org.apache.flink:flink-connector-kafka:${flinkVersion}"

Review comment:
       Is flinkShadowJar specific to the Flink gradle quickstart? It isn't mentioned anywhere on this page.

##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -0,0 +1,219 @@
+---
+title: "Overview"
+weight: 1
+type: docs
+aliases:
+- /dev/project-configuration.html
+- /start/dependencies.html
+- /getting-started/project-setup/dependencies.html
+- /quickstart/java_api_quickstart.html
+- /dev/projectsetup/java_api_quickstart.html
+- /dev/linking_with_flink.html
+- /dev/linking.html
+- /dev/projectsetup/dependencies.html
+- /dev/projectsetup/java_api_quickstart.html
+- /getting-started/project-setup/java_api_quickstart.html
+- /dev/getting-started/project-setup/scala_api_quickstart.html
+- /getting-started/project-setup/scala_api_quickstart.html
+- /quickstart/scala_api_quickstart.html
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Project Configuration
+
+Every Flink application depends on a set of Flink libraries. At a minimum, the application depends
+on the Flink APIs and, in addition, on certain connector libraries (i.e. Kafka, Cassandra).
+When running Flink applications (either in a distributed deployment or locally for testing),
+the [Flink runtime library](https://ossindex.sonatype.org/component/pkg:maven/org.apache.flink/flink-runtime@1.14.3) 
+must be available.
+
+The guides in this section will show you how to configure your projects via popular build tools
+([Maven]({{< ref "docs/dev/configuration/maven" >}}), [Gradle]({{< ref "docs/dev/configuration/gradle" >}}),
+add the necessary dependencies (i.e. [connectors and formats]({{< ref "docs/dev/configuration/connector" >}}), 
+[testing]({{< ref "docs/dev/configuration/testing" >}})), and cover some 
+[advanced]({{< ref "docs/dev/configuration/advanced" >}}) configuration topics. 
+
+## Getting started
+
+To get started working on your Flink application, use the following commands, scripts, and templates 
+to create a Flink project.  
+
+{{< tabs "creating project" >}}
+{{< tab "Maven" >}}
+
+You can create a project based on an [Archetype](https://maven.apache.org/guides/introduction/introduction-to-archetypes.html)
+with the Maven command below or use the provided quickstart bash script.
+
+### Maven command
+```bash
+$ mvn archetype:generate                \
+  -DarchetypeGroupId=org.apache.flink   \
+  -DarchetypeArtifactId=flink-quickstart-java \
+  -DarchetypeVersion={{< version >}}
+```
+This allows you to name your newly created project and will interactively ask you for the groupId,
+artifactId, and package name.
+
+### Quickstart script
+```bash
+$ curl https://flink.apache.org/q/quickstart.sh | bash -s {{< version >}}
+```
+
+{{< /tab >}}
+{{< tab "Gradle" >}}
+You can create a project with a Gradle build script or use the provided quickstart bash script.
+
+### Gradle build script
+
+To execute these build configuration scripts, run the `gradle` command in the directory with these scripts.
+
+**build.gradle**
+
+```gradle
+plugins {
+    id 'java'
+    id 'application'
+    // shadow plugin to produce fat JARs
+    id 'com.github.johnrengelman.shadow' version '7.1.2'
+}
+// artifact properties
+group = 'org.myorg.quickstart'
+version = '0.1-SNAPSHOT'
+mainClassName = 'org.myorg.quickstart.StreamingJob'
+description = """Flink Quickstart Job"""
+ext {
+    javaVersion = '1.8'
+    flinkVersion = '{{< version >}}'
+    scalaBinaryVersion = '{{< scala_version >}}'
+    slf4jVersion = '1.7.32'
+    log4jVersion = '2.17.1'
+}
+sourceCompatibility = javaVersion
+targetCompatibility = javaVersion
+tasks.withType(JavaCompile) {
+	options.encoding = 'UTF-8'
+}
+applicationDefaultJvmArgs = ["-Dlog4j.configurationFile=log4j2.properties"]
+
+// declare where to find the dependencies of your project
+repositories {
+    mavenCentral()
+    maven { url "https://repository.apache.org/content/repositories/snapshots/" }
+}
+// NOTE: We cannot use "compileOnly" or "shadow" configurations since then we could not run code
+// in the IDE or with "gradle run". We also cannot exclude transitive dependencies from the
+// shadowJar yet (see https://github.com/johnrengelman/shadow/issues/159).
+// -> Explicitly define the libraries we want to be included in the "flinkShadowJar" configuration!
+configurations {
+    flinkShadowJar // dependencies which go into the shadowJar
+    // always exclude these (also from transitive dependencies) since they are provided by Flink
+    flinkShadowJar.exclude group: 'org.apache.flink', module: 'force-shading'
+    flinkShadowJar.exclude group: 'com.google.code.findbugs', module: 'jsr305'
+    flinkShadowJar.exclude group: 'org.slf4j'
+    flinkShadowJar.exclude group: 'org.apache.logging.log4j'
+}
+// declare the dependencies for your production and test code
+dependencies {
+    // --------------------------------------------------------------
+    // Compile-time dependencies that should NOT be part of the
+    // shadow jar and are provided in the lib folder of Flink
+    // --------------------------------------------------------------
+    implementation "org.apache.flink:flink-streaming-java:${flinkVersion}"
+    implementation "org.apache.flink:flink-clients:${flinkVersion}"
+    // --------------------------------------------------------------
+    // Dependencies that should be part of the shadow jar, e.g.
+    // connectors. These must be in the flinkShadowJar configuration!
+    // --------------------------------------------------------------
+    //flinkShadowJar "org.apache.flink:flink-connector-kafka:${flinkVersion}"
+    runtimeOnly "org.apache.logging.log4j:log4j-api:${log4jVersion}"
+    runtimeOnly "org.apache.logging.log4j:log4j-core:${log4jVersion}"
+    runtimeOnly "org.apache.logging.log4j:log4j-slf4j-impl:${log4jVersion}"
+    runtimeOnly "org.slf4j:slf4j-log4j12:${slf4jVersion}"
+    // Add test dependencies here.
+    // testCompile "junit:junit:4.12"
+}
+// make compileOnly dependencies available for tests:
+sourceSets {
+    main.compileClasspath += configurations.flinkShadowJar
+    main.runtimeClasspath += configurations.flinkShadowJar
+    test.compileClasspath += configurations.flinkShadowJar
+    test.runtimeClasspath += configurations.flinkShadowJar
+    javadoc.classpath += configurations.flinkShadowJar
+}
+run.classpath = sourceSets.main.runtimeClasspath
+jar {
+    manifest {
+        attributes 'Built-By': System.getProperty('user.name'),
+                'Build-Jdk': System.getProperty('java.version')

Review comment:
       Is this actually necessary?

##########
File path: docs/content/docs/dev/configuration/gradle.md
##########
@@ -0,0 +1,122 @@
+---
+title: "Using Gradle"
+weight: 3
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# How to use Gradle to configure your project
+
+You will likely need a build tool to configure your Flink project. This guide will show you how to
+do so with [Gradle](https://gradle.org), an open-source general-purpose build tool that can be used 
+to automate tasks in the development process.
+
+## Requirements
+
+- Gradle 7.x 
+- Java 8.x
+
+## Importing the project into your IDE
+
+Once the project folder and files have been created, we recommend that you import this project into
+your IDE for developing and testing.

Review comment:
       Which project folder and files? Which project should be imported?
   Are we missing a reference to the gradle quickstart?

##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -0,0 +1,219 @@
+---
+title: "Overview"
+weight: 1
+type: docs
+aliases:
+- /dev/project-configuration.html
+- /start/dependencies.html
+- /getting-started/project-setup/dependencies.html
+- /quickstart/java_api_quickstart.html
+- /dev/projectsetup/java_api_quickstart.html
+- /dev/linking_with_flink.html
+- /dev/linking.html
+- /dev/projectsetup/dependencies.html
+- /dev/projectsetup/java_api_quickstart.html
+- /getting-started/project-setup/java_api_quickstart.html
+- /dev/getting-started/project-setup/scala_api_quickstart.html
+- /getting-started/project-setup/scala_api_quickstart.html
+- /quickstart/scala_api_quickstart.html
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Project Configuration
+
+Every Flink application depends on a set of Flink libraries. At a minimum, the application depends
+on the Flink APIs and, in addition, on certain connector libraries (i.e. Kafka, Cassandra).
+When running Flink applications (either in a distributed deployment or locally for testing),
+the [Flink runtime library](https://ossindex.sonatype.org/component/pkg:maven/org.apache.flink/flink-runtime@1.14.3) 
+must be available.
+
+The guides in this section will show you how to configure your projects via popular build tools
+([Maven]({{< ref "docs/dev/configuration/maven" >}}), [Gradle]({{< ref "docs/dev/configuration/gradle" >}}),
+add the necessary dependencies (i.e. [connectors and formats]({{< ref "docs/dev/configuration/connector" >}}), 
+[testing]({{< ref "docs/dev/configuration/testing" >}})), and cover some 
+[advanced]({{< ref "docs/dev/configuration/advanced" >}}) configuration topics. 
+
+## Getting started
+
+To get started working on your Flink application, use the following commands, scripts, and templates 
+to create a Flink project.  
+
+{{< tabs "creating project" >}}
+{{< tab "Maven" >}}
+
+You can create a project based on an [Archetype](https://maven.apache.org/guides/introduction/introduction-to-archetypes.html)
+with the Maven command below or use the provided quickstart bash script.
+
+### Maven command
+```bash
+$ mvn archetype:generate                \
+  -DarchetypeGroupId=org.apache.flink   \
+  -DarchetypeArtifactId=flink-quickstart-java \
+  -DarchetypeVersion={{< version >}}
+```
+This allows you to name your newly created project and will interactively ask you for the groupId,
+artifactId, and package name.
+
+### Quickstart script
+```bash
+$ curl https://flink.apache.org/q/quickstart.sh | bash -s {{< version >}}
+```
+
+{{< /tab >}}
+{{< tab "Gradle" >}}
+You can create a project with a Gradle build script or use the provided quickstart bash script.
+
+### Gradle build script
+
+To execute these build configuration scripts, run the `gradle` command in the directory with these scripts.
+
+**build.gradle**
+
+```gradle
+plugins {
+    id 'java'
+    id 'application'
+    // shadow plugin to produce fat JARs
+    id 'com.github.johnrengelman.shadow' version '7.1.2'
+}
+// artifact properties
+group = 'org.myorg.quickstart'
+version = '0.1-SNAPSHOT'
+mainClassName = 'org.myorg.quickstart.StreamingJob'
+description = """Flink Quickstart Job"""
+ext {
+    javaVersion = '1.8'
+    flinkVersion = '{{< version >}}'
+    scalaBinaryVersion = '{{< scala_version >}}'
+    slf4jVersion = '1.7.32'
+    log4jVersion = '2.17.1'
+}
+sourceCompatibility = javaVersion
+targetCompatibility = javaVersion
+tasks.withType(JavaCompile) {
+	options.encoding = 'UTF-8'
+}
+applicationDefaultJvmArgs = ["-Dlog4j.configurationFile=log4j2.properties"]
+
+// declare where to find the dependencies of your project
+repositories {
+    mavenCentral()
+    maven { url "https://repository.apache.org/content/repositories/snapshots/" }

Review comment:
       We shouldn't point users to snapshot dependencies.

##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -0,0 +1,219 @@
+---
+title: "Overview"
+weight: 1
+type: docs
+aliases:
+- /dev/project-configuration.html
+- /start/dependencies.html
+- /getting-started/project-setup/dependencies.html
+- /quickstart/java_api_quickstart.html
+- /dev/projectsetup/java_api_quickstart.html
+- /dev/linking_with_flink.html
+- /dev/linking.html
+- /dev/projectsetup/dependencies.html
+- /dev/projectsetup/java_api_quickstart.html
+- /getting-started/project-setup/java_api_quickstart.html
+- /dev/getting-started/project-setup/scala_api_quickstart.html
+- /getting-started/project-setup/scala_api_quickstart.html
+- /quickstart/scala_api_quickstart.html
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Project Configuration
+
+Every Flink application depends on a set of Flink libraries. At a minimum, the application depends
+on the Flink APIs and, in addition, on certain connector libraries (i.e. Kafka, Cassandra).
+When running Flink applications (either in a distributed deployment or locally for testing),
+the [Flink runtime library](https://ossindex.sonatype.org/component/pkg:maven/org.apache.flink/flink-runtime@1.14.3) 
+must be available.
+
+The guides in this section will show you how to configure your projects via popular build tools
+([Maven]({{< ref "docs/dev/configuration/maven" >}}), [Gradle]({{< ref "docs/dev/configuration/gradle" >}}),
+add the necessary dependencies (i.e. [connectors and formats]({{< ref "docs/dev/configuration/connector" >}}), 
+[testing]({{< ref "docs/dev/configuration/testing" >}})), and cover some 
+[advanced]({{< ref "docs/dev/configuration/advanced" >}}) configuration topics. 
+
+## Getting started
+
+To get started working on your Flink application, use the following commands, scripts, and templates 
+to create a Flink project.  
+
+{{< tabs "creating project" >}}
+{{< tab "Maven" >}}
+
+You can create a project based on an [Archetype](https://maven.apache.org/guides/introduction/introduction-to-archetypes.html)
+with the Maven command below or use the provided quickstart bash script.
+
+### Maven command
+```bash
+$ mvn archetype:generate                \
+  -DarchetypeGroupId=org.apache.flink   \
+  -DarchetypeArtifactId=flink-quickstart-java \
+  -DarchetypeVersion={{< version >}}
+```
+This allows you to name your newly created project and will interactively ask you for the groupId,
+artifactId, and package name.
+
+### Quickstart script
+```bash
+$ curl https://flink.apache.org/q/quickstart.sh | bash -s {{< version >}}
+```
+
+{{< /tab >}}
+{{< tab "Gradle" >}}
+You can create a project with a Gradle build script or use the provided quickstart bash script.
+
+### Gradle build script
+
+To execute these build configuration scripts, run the `gradle` command in the directory with these scripts.
+
+**build.gradle**
+
+```gradle
+plugins {
+    id 'java'
+    id 'application'
+    // shadow plugin to produce fat JARs
+    id 'com.github.johnrengelman.shadow' version '7.1.2'
+}
+// artifact properties
+group = 'org.myorg.quickstart'
+version = '0.1-SNAPSHOT'
+mainClassName = 'org.myorg.quickstart.StreamingJob'
+description = """Flink Quickstart Job"""
+ext {
+    javaVersion = '1.8'
+    flinkVersion = '{{< version >}}'
+    scalaBinaryVersion = '{{< scala_version >}}'
+    slf4jVersion = '1.7.32'
+    log4jVersion = '2.17.1'
+}
+sourceCompatibility = javaVersion
+targetCompatibility = javaVersion
+tasks.withType(JavaCompile) {
+	options.encoding = 'UTF-8'
+}
+applicationDefaultJvmArgs = ["-Dlog4j.configurationFile=log4j2.properties"]
+
+// declare where to find the dependencies of your project
+repositories {
+    mavenCentral()
+    maven { url "https://repository.apache.org/content/repositories/snapshots/" }
+}
+// NOTE: We cannot use "compileOnly" or "shadow" configurations since then we could not run code
+// in the IDE or with "gradle run". We also cannot exclude transitive dependencies from the
+// shadowJar yet (see https://github.com/johnrengelman/shadow/issues/159).
+// -> Explicitly define the libraries we want to be included in the "flinkShadowJar" configuration!
+configurations {
+    flinkShadowJar // dependencies which go into the shadowJar
+    // always exclude these (also from transitive dependencies) since they are provided by Flink
+    flinkShadowJar.exclude group: 'org.apache.flink', module: 'force-shading'
+    flinkShadowJar.exclude group: 'com.google.code.findbugs', module: 'jsr305'
+    flinkShadowJar.exclude group: 'org.slf4j'
+    flinkShadowJar.exclude group: 'org.apache.logging.log4j'
+}
+// declare the dependencies for your production and test code
+dependencies {
+    // --------------------------------------------------------------
+    // Compile-time dependencies that should NOT be part of the
+    // shadow jar and are provided in the lib folder of Flink
+    // --------------------------------------------------------------
+    implementation "org.apache.flink:flink-streaming-java:${flinkVersion}"
+    implementation "org.apache.flink:flink-clients:${flinkVersion}"

Review comment:
       @matriv This is different than what you propose in https://github.com/apache/flink-web/pull/504/files.

##########
File path: docs/content/docs/dev/configuration/maven.md
##########
@@ -0,0 +1,175 @@
+---
+title: "Using Maven"
+weight: 2
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# How to use Maven to configure your project
+
+You will likely need a build tool to configure your Flink project. This guide will show you how to 
+do so with [Maven](https://maven.apache.org), an open-source build automation tool developed by the 
+Apache Group that enables you to build, publish, and deploy projects. You can use it to manage the 
+entire lifecycle of your software project.
+
+## Requirements
+
+- Maven 3.0.4 (or higher)
+- Java 8.x
+
+## Importing the project into your IDE
+
+Once the project folder and files have been created, we recommend that you import this project into
+your IDE for developing and testing.
+
+IntelliJ IDEA supports Maven projects out-of-the-box. Eclipse offers the [m2e plugin](http://www.eclipse.org/m2e/) 
+to [import Maven projects](http://books.sonatype.com/m2eclipse-book/reference/creating-sect-importing-projects.html#fig-creating-import).
+
+**Note**: The default JVM heap size for Java may be too small for Flink, and you may have to increase it manually.
+In Eclipse, choose `Run Configurations -> Arguments` and enter `-Xmx800m` in the `VM Arguments` box.
+In IntelliJ IDEA, the recommended way to change JVM options is via the `Help | Edit Custom VM Options` menu.
+See [this article](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties) for details.
+
+**Note on IntelliJ:** To make the applications run within IntelliJ IDEA, it is necessary to tick the
+`Include dependencies with "Provided" scope` box in the run configuration. If this option is not available
+(possibly due to using an older IntelliJ IDEA version), then a workaround is to create a test that
+calls the application's `main()` method.
+
+## Building the project
+
+If you want to build/package your project, navigate to your project directory and run the
+'`mvn clean package`' command. You will find a JAR file that contains your application (plus connectors
+and libraries that you may have added as dependencies to the application) here:`target/<artifact-id>-<version>.jar`.
+
+__Note:__ If you used a different class than `DataStreamJob` as the application's main class / entry point,
+we recommend you change the `mainClass` setting in the `pom.xml` file accordingly so that Flink
+can run the application from the JAR file without additionally specifying the main class.
+
+## Adding dependencies to the project
+
+Open the `pom.xml` file in your project directory and add the dependency inside the
+`dependencies` block.
+
+For example, you can add the Kafka connector as a dependency like this:
+
+```xml
+<dependencies>
+    
+    <dependency>
+        <groupId>org.apache.flink</groupId>
+        <artifactId>flink-connector-kafka</artifactId>
+        <version>{{< version >}}</version>
+    </dependency>
+    
+</dependencies>
+```
+
+Then execute `mvn install` on the command line. 
+
+Projects created from the `Java Project Template`, the `Scala Project Template`, or Gradle are configured
+to automatically include the application dependencies into the application JAR when you run `mvn clean package`.
+For projects that are not set up from those templates, we recommend adding the Maven Shade Plugin to
+build the application jar with all required dependencies.
+
+**Important:** Note that all these core API dependencies should have their scope set to [*provided*](https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html#dependency-scope). This means that
+they are needed to compile against, but that they should not be packaged into the project's resulting
+application JAR file. If not set to *provided*, the best case scenario is that the resulting JAR
+becomes excessively large, because it also contains all Flink core dependencies. The worst case scenario
+is that the Flink core dependencies that are added to the application's JAR file clash with some of
+your own dependency versions (which is normally avoided through inverted classloading).
+
+To correctly package the dependencies into the application JAR, the Flink API dependencies must 
+be set to the *compile* scope.
+
+## Packaging the application
+
+Depending on your use case, you may need to package your Flink application in different ways before it
+gets deployed to a Flink environment. 
+
+If you want to create a JAR for a Flink Job and use only Flink dependencies without any third-party 
+dependencies (i.e. using the filesystem connector with JSON format), you do not need to create an 
+uber/fat JAR or shade any dependencies.
+
+If you want to create a JAR for a Flink Job and use external dependencies not built into the Flink 
+distribution, you can either add them to the classpath of the distribution or shade them into your 
+uber/fat application JAR.
+
+With the generated uber/fat JAR, you can submit it to a local or remote cluster with:
+
+```sh
+bin/flink run -c org.example.MyJob myFatJar.jar
+```
+
+To learn more about how to deploy Flink jobs, check out the [deployment guide]({{< ref "docs/deployment/cli" >}}).
+
+## Template for creating an uber/fat JAR with dependencies
+
+To build an application JAR that contains all dependencies required for declared connectors and libraries,
+you can use the following shade plugin definition:
+
+```xml
+<build>
+    <plugins>
+        <plugin>
+            <groupId>org.apache.maven.plugins</groupId>
+            <artifactId>maven-shade-plugin</artifactId>
+            <version>3.2.4</version>
+            <executions>
+                <execution>
+                    <phase>package</phase>
+                    <goals>
+                        <goal>shade</goal>
+                    </goals>
+                    <configuration>
+                        <artifactSet>
+                            <excludes>
+                                <exclude>com.google.code.findbugs:jsr305</exclude>
+                                <exclude>org.slf4j:*</exclude>
+                                <exclude>log4j:*</exclude>

Review comment:
       This should be unnecessary because all Flink dependencies have a provided dependency on slf4j and log4j.
   Also, the log4j groupId is outdated (`org.apache.logging.log4j` would be the correct choice).

##########
File path: docs/content/docs/dev/configuration/maven.md
##########
@@ -0,0 +1,175 @@
+---
+title: "Using Maven"
+weight: 2
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# How to use Maven to configure your project
+
+You will likely need a build tool to configure your Flink project. This guide will show you how to 
+do so with [Maven](https://maven.apache.org), an open-source build automation tool developed by the 
+Apache Group that enables you to build, publish, and deploy projects. You can use it to manage the 
+entire lifecycle of your software project.
+
+## Requirements
+
+- Maven 3.0.4 (or higher)
+- Java 8.x
+
+## Importing the project into your IDE
+
+Once the project folder and files have been created, we recommend that you import this project into
+your IDE for developing and testing.
+
+IntelliJ IDEA supports Maven projects out-of-the-box. Eclipse offers the [m2e plugin](http://www.eclipse.org/m2e/) 
+to [import Maven projects](http://books.sonatype.com/m2eclipse-book/reference/creating-sect-importing-projects.html#fig-creating-import).
+
+**Note**: The default JVM heap size for Java may be too small for Flink, and you may have to increase it manually.
+In Eclipse, choose `Run Configurations -> Arguments` and enter `-Xmx800m` in the `VM Arguments` box.
+In IntelliJ IDEA, the recommended way to change JVM options is via the `Help | Edit Custom VM Options` menu.
+See [this article](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties) for details.
+
+**Note on IntelliJ:** To make the applications run within IntelliJ IDEA, it is necessary to tick the
+`Include dependencies with "Provided" scope` box in the run configuration. If this option is not available
+(possibly due to using an older IntelliJ IDEA version), then a workaround is to create a test that
+calls the application's `main()` method.
+
+## Building the project
+
+If you want to build/package your project, navigate to your project directory and run the
+'`mvn clean package`' command. You will find a JAR file that contains your application (plus connectors
+and libraries that you may have added as dependencies to the application) here:`target/<artifact-id>-<version>.jar`.
+
+__Note:__ If you used a different class than `DataStreamJob` as the application's main class / entry point,
+we recommend you change the `mainClass` setting in the `pom.xml` file accordingly so that Flink
+can run the application from the JAR file without additionally specifying the main class.
+
+## Adding dependencies to the project
+
+Open the `pom.xml` file in your project directory and add the dependency inside the
+`dependencies` block.
+
+For example, you can add the Kafka connector as a dependency like this:
+
+```xml
+<dependencies>
+    
+    <dependency>
+        <groupId>org.apache.flink</groupId>
+        <artifactId>flink-connector-kafka</artifactId>
+        <version>{{< version >}}</version>
+    </dependency>
+    
+</dependencies>
+```
+
+Then execute `mvn install` on the command line. 
+
+Projects created from the `Java Project Template`, the `Scala Project Template`, or Gradle are configured
+to automatically include the application dependencies into the application JAR when you run `mvn clean package`.
+For projects that are not set up from those templates, we recommend adding the Maven Shade Plugin to
+build the application jar with all required dependencies.
+
+**Important:** Note that all these core API dependencies should have their scope set to [*provided*](https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html#dependency-scope). This means that
+they are needed to compile against, but that they should not be packaged into the project's resulting
+application JAR file. If not set to *provided*, the best case scenario is that the resulting JAR
+becomes excessively large, because it also contains all Flink core dependencies. The worst case scenario
+is that the Flink core dependencies that are added to the application's JAR file clash with some of
+your own dependency versions (which is normally avoided through inverted classloading).
+
+To correctly package the dependencies into the application JAR, the Flink API dependencies must 
+be set to the *compile* scope.
+
+## Packaging the application
+
+Depending on your use case, you may need to package your Flink application in different ways before it
+gets deployed to a Flink environment. 
+
+If you want to create a JAR for a Flink Job and use only Flink dependencies without any third-party 
+dependencies (i.e. using the filesystem connector with JSON format), you do not need to create an 
+uber/fat JAR or shade any dependencies.
+
+If you want to create a JAR for a Flink Job and use external dependencies not built into the Flink 
+distribution, you can either add them to the classpath of the distribution or shade them into your 
+uber/fat application JAR.
+
+With the generated uber/fat JAR, you can submit it to a local or remote cluster with:
+
+```sh
+bin/flink run -c org.example.MyJob myFatJar.jar
+```
+
+To learn more about how to deploy Flink jobs, check out the [deployment guide]({{< ref "docs/deployment/cli" >}}).
+
+## Template for creating an uber/fat JAR with dependencies
+
+To build an application JAR that contains all dependencies required for declared connectors and libraries,
+you can use the following shade plugin definition:
+
+```xml

Review comment:
       Overall I find it weird that we don't just link to the quickstarts. They are ultimately the source of truth of how a recommended project looks like, and are also tested.

##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -0,0 +1,219 @@
+---
+title: "Overview"
+weight: 1
+type: docs
+aliases:
+- /dev/project-configuration.html
+- /start/dependencies.html
+- /getting-started/project-setup/dependencies.html
+- /quickstart/java_api_quickstart.html
+- /dev/projectsetup/java_api_quickstart.html
+- /dev/linking_with_flink.html
+- /dev/linking.html
+- /dev/projectsetup/dependencies.html
+- /dev/projectsetup/java_api_quickstart.html
+- /getting-started/project-setup/java_api_quickstart.html
+- /dev/getting-started/project-setup/scala_api_quickstart.html
+- /getting-started/project-setup/scala_api_quickstart.html
+- /quickstart/scala_api_quickstart.html
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Project Configuration
+
+Every Flink application depends on a set of Flink libraries. At a minimum, the application depends
+on the Flink APIs and, in addition, on certain connector libraries (i.e. Kafka, Cassandra).
+When running Flink applications (either in a distributed deployment or locally for testing),
+the [Flink runtime library](https://ossindex.sonatype.org/component/pkg:maven/org.apache.flink/flink-runtime@1.14.3) 

Review comment:
       This is oddly specific, technically incorrect (you need way more than flink-runtime), it shouldn't contain a hard-coded version, and the link is quite weird (why not maven central?)

##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -0,0 +1,219 @@
+---
+title: "Overview"
+weight: 1
+type: docs
+aliases:
+- /dev/project-configuration.html
+- /start/dependencies.html
+- /getting-started/project-setup/dependencies.html
+- /quickstart/java_api_quickstart.html
+- /dev/projectsetup/java_api_quickstart.html
+- /dev/linking_with_flink.html
+- /dev/linking.html
+- /dev/projectsetup/dependencies.html
+- /dev/projectsetup/java_api_quickstart.html
+- /getting-started/project-setup/java_api_quickstart.html
+- /dev/getting-started/project-setup/scala_api_quickstart.html
+- /getting-started/project-setup/scala_api_quickstart.html
+- /quickstart/scala_api_quickstart.html
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Project Configuration
+
+Every Flink application depends on a set of Flink libraries. At a minimum, the application depends
+on the Flink APIs and, in addition, on certain connector libraries (i.e. Kafka, Cassandra).
+When running Flink applications (either in a distributed deployment or locally for testing),
+the [Flink runtime library](https://ossindex.sonatype.org/component/pkg:maven/org.apache.flink/flink-runtime@1.14.3) 
+must be available.
+
+The guides in this section will show you how to configure your projects via popular build tools
+([Maven]({{< ref "docs/dev/configuration/maven" >}}), [Gradle]({{< ref "docs/dev/configuration/gradle" >}}),
+add the necessary dependencies (i.e. [connectors and formats]({{< ref "docs/dev/configuration/connector" >}}), 
+[testing]({{< ref "docs/dev/configuration/testing" >}})), and cover some 
+[advanced]({{< ref "docs/dev/configuration/advanced" >}}) configuration topics. 
+
+## Getting started
+
+To get started working on your Flink application, use the following commands, scripts, and templates 
+to create a Flink project.  
+
+{{< tabs "creating project" >}}
+{{< tab "Maven" >}}
+
+You can create a project based on an [Archetype](https://maven.apache.org/guides/introduction/introduction-to-archetypes.html)
+with the Maven command below or use the provided quickstart bash script.
+
+### Maven command
+```bash
+$ mvn archetype:generate                \
+  -DarchetypeGroupId=org.apache.flink   \
+  -DarchetypeArtifactId=flink-quickstart-java \
+  -DarchetypeVersion={{< version >}}
+```
+This allows you to name your newly created project and will interactively ask you for the groupId,
+artifactId, and package name.
+
+### Quickstart script
+```bash
+$ curl https://flink.apache.org/q/quickstart.sh | bash -s {{< version >}}
+```
+
+{{< /tab >}}
+{{< tab "Gradle" >}}
+You can create a project with a Gradle build script or use the provided quickstart bash script.
+
+### Gradle build script
+
+To execute these build configuration scripts, run the `gradle` command in the directory with these scripts.
+
+**build.gradle**
+
+```gradle
+plugins {
+    id 'java'
+    id 'application'
+    // shadow plugin to produce fat JARs
+    id 'com.github.johnrengelman.shadow' version '7.1.2'
+}
+// artifact properties
+group = 'org.myorg.quickstart'
+version = '0.1-SNAPSHOT'
+mainClassName = 'org.myorg.quickstart.StreamingJob'
+description = """Flink Quickstart Job"""
+ext {
+    javaVersion = '1.8'
+    flinkVersion = '{{< version >}}'
+    scalaBinaryVersion = '{{< scala_version >}}'

Review comment:
       As is, this property is unused, and none of the instructions on this page reference it.
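
       For reference, a minimal sketch of how the property would typically be consumed if it were kept
       (the Scala artifact below is only an illustration and assumes the `flinkShadowJar` configuration
       from the quickstart build script):

       ```gradle
       dependencies {
           // hypothetical usage: interpolate scalaBinaryVersion into a Scala-suffixed artifact name
           flinkShadowJar "org.apache.flink:flink-streaming-scala_${scalaBinaryVersion}:${flinkVersion}"
       }
       ```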

##########
File path: docs/content/docs/dev/configuration/advanced.md
##########
@@ -0,0 +1,148 @@
+---
+title: "Advanced Configuration"
+weight: 10
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Advanced Configuration Topics
+
+## Dependencies: Flink Core and User Application
+
+There are two broad categories of dependencies and libraries in Flink, which are explained below.
+
+### Flink Core Dependencies
+
+Flink itself consists of a set of classes and dependencies that form the core of Flink's runtime
+and must be present when a Flink application is started. The classes and dependencies needed to run
+the system handle areas such as coordination, networking, checkpointing, failover, APIs,
+operators (such as windowing), resource management, etc.
+
+These core classes and dependencies are packaged in the `flink-dist` jar, are part of Flink's `lib`
+folder, and are part of the basic Flink container images. You can think of these dependencies as similar
+to Java's core library (e.g. `rt.jar`, `charsets.jar`), which contains classes like `String` and `List`.

Review comment:
       rt.jar doesn't exist in JDK 9+. I would just remove the references to any jars.

##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -0,0 +1,219 @@
+---
+title: "Overview"
+weight: 1
+type: docs
+aliases:
+- /dev/project-configuration.html
+- /start/dependencies.html
+- /getting-started/project-setup/dependencies.html
+- /quickstart/java_api_quickstart.html
+- /dev/projectsetup/java_api_quickstart.html
+- /dev/linking_with_flink.html
+- /dev/linking.html
+- /dev/projectsetup/dependencies.html
+- /dev/projectsetup/java_api_quickstart.html
+- /getting-started/project-setup/java_api_quickstart.html
+- /dev/getting-started/project-setup/scala_api_quickstart.html
+- /getting-started/project-setup/scala_api_quickstart.html
+- /quickstart/scala_api_quickstart.html
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Project Configuration
+
+Every Flink application depends on a set of Flink libraries. At a minimum, the application depends
+on the Flink APIs and, in addition, on certain connector libraries (e.g. Kafka or Cassandra).
+When running Flink applications (either in a distributed deployment or locally for testing),
+the [Flink runtime library](https://ossindex.sonatype.org/component/pkg:maven/org.apache.flink/flink-runtime@1.14.3) 
+must be available.
+
+The guides in this section will show you how to configure your projects via popular build tools
+([Maven]({{< ref "docs/dev/configuration/maven" >}}), [Gradle]({{< ref "docs/dev/configuration/gradle" >}})),
+add the necessary dependencies (e.g. [connectors and formats]({{< ref "docs/dev/configuration/connector" >}}),
+[testing]({{< ref "docs/dev/configuration/testing" >}})), and cover some 
+[advanced]({{< ref "docs/dev/configuration/advanced" >}}) configuration topics. 
+
+## Getting started
+
+To get started working on your Flink application, use the following commands, scripts, and templates 
+to create a Flink project.  
+
+{{< tabs "creating project" >}}
+{{< tab "Maven" >}}
+
+You can create a project based on an [Archetype](https://maven.apache.org/guides/introduction/introduction-to-archetypes.html)
+with the Maven command below or use the provided quickstart bash script.
+
+### Maven command
+```bash
+$ mvn archetype:generate                \
+  -DarchetypeGroupId=org.apache.flink   \
+  -DarchetypeArtifactId=flink-quickstart-java \
+  -DarchetypeVersion={{< version >}}
+```
+This allows you to name your newly created project and will interactively ask you for the groupId,
+artifactId, and package name.
+
+### Quickstart script
+```bash
+$ curl https://flink.apache.org/q/quickstart.sh | bash -s {{< version >}}
+```
+
+{{< /tab >}}
+{{< tab "Gradle" >}}
+You can create a project with a Gradle build script or use the provided quickstart bash script.
+
+### Gradle build script
+
+To execute these build configuration scripts, run the `gradle` command in the directory with these scripts.
+
+**build.gradle**
+
+```gradle
+plugins {
+    id 'java'
+    id 'application'
+    // shadow plugin to produce fat JARs
+    id 'com.github.johnrengelman.shadow' version '7.1.2'
+}
+// artifact properties
+group = 'org.myorg.quickstart'
+version = '0.1-SNAPSHOT'
+mainClassName = 'org.myorg.quickstart.StreamingJob'

Review comment:
       `myorg.org` points to an actively used domain. I would suggest just removing it.

##########
File path: docs/content/docs/dev/configuration/maven.md
##########
@@ -0,0 +1,175 @@
+---
+title: "Using Maven"
+weight: 2
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# How to use Maven to configure your project
+
+You will likely need a build tool to configure your Flink project. This guide will show you how to 
+do so with [Maven](https://maven.apache.org), an open-source build automation tool developed by the 
+Apache Software Foundation that enables you to build, publish, and deploy projects. You can use it to manage the 
+entire lifecycle of your software project.
+
+## Requirements
+
+- Maven 3.0.4 (or higher)
+- Java 8.x
+
+## Importing the project into your IDE
+
+Once the project folder and files have been created, we recommend that you import this project into
+your IDE for developing and testing.
+
+IntelliJ IDEA supports Maven projects out-of-the-box. Eclipse offers the [m2e plugin](http://www.eclipse.org/m2e/) 
+to [import Maven projects](http://books.sonatype.com/m2eclipse-book/reference/creating-sect-importing-projects.html#fig-creating-import).
+
+**Note**: The default JVM heap size for Java may be too small for Flink, and you may have to increase it manually.
+In Eclipse, choose `Run Configurations -> Arguments` and enter `-Xmx800m` in the `VM Arguments` box.
+In IntelliJ IDEA, the recommended way to change JVM options is via the `Help | Edit Custom VM Options` menu.
+See [this article](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties) for details.
+
+**Note on IntelliJ:** To make the applications run within IntelliJ IDEA, it is necessary to tick the
+`Include dependencies with "Provided" scope` box in the run configuration. If this option is not available
+(possibly due to using an older IntelliJ IDEA version), then a workaround is to create a test that
+calls the application's `main()` method.
+
+## Building the project
+
+If you want to build/package your project, navigate to your project directory and run the
+`mvn clean package` command. You will find a JAR file that contains your application (plus connectors
+and libraries that you may have added as dependencies to the application) here: `target/<artifact-id>-<version>.jar`.
+
+__Note:__ If you used a different class than `DataStreamJob` as the application's main class / entry point,
+we recommend you change the `mainClass` setting in the `pom.xml` file accordingly so that Flink
+can run the application from the JAR file without additionally specifying the main class.
+
+## Adding dependencies to the project
+
+Open the `pom.xml` file in your project directory and add the dependency inside
+the `dependencies` section.
+
+For example, you can add the Kafka connector as a dependency like this:
+
+```xml
+<dependencies>
+    
+    <dependency>
+        <groupId>org.apache.flink</groupId>
+        <artifactId>flink-connector-kafka</artifactId>
+        <version>{{< version >}}</version>
+    </dependency>
+    
+</dependencies>
+```
+
+Then execute `mvn install` on the command line. 
+
+Projects created from the `Java Project Template`, the `Scala Project Template`, or Gradle are configured
+to automatically include the application dependencies into the application JAR when you run `mvn clean package`.
+For projects that are not set up from those templates, we recommend adding the Maven Shade Plugin to
+build the application jar with all required dependencies.
+
+**Important:** Note that all these core API dependencies should have their scope set to [*provided*](https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html#dependency-scope). This means that
+they are needed to compile against, but that they should not be packaged into the project's resulting
+application JAR file. If not set to *provided*, the best case scenario is that the resulting JAR
+becomes excessively large, because it also contains all Flink core dependencies. The worst case scenario
+is that the Flink core dependencies that are added to the application's JAR file clash with some of
+your own dependency versions (which is normally avoided through inverted classloading).
+
+To correctly package the dependencies into the application JAR, the application dependencies (such as
+connectors and formats) must be set to the *compile* scope.
+
+## Packaging the application
+
+Depending on your use case, you may need to package your Flink application in different ways before it
+gets deployed to a Flink environment. 
+
+If you want to create a JAR for a Flink Job and use only Flink dependencies without any third-party 
+dependencies (e.g. using the filesystem connector with JSON format), you do not need to create an 
+uber/fat JAR or shade any dependencies.
+
+If you want to create a JAR for a Flink Job and use external dependencies not built into the Flink 
+distribution, you can either add them to the classpath of the distribution or shade them into your 
+uber/fat application JAR.
+
+You can submit the generated uber/fat JAR to a local or remote cluster with:
+
+```sh
+bin/flink run -c org.example.MyJob myFatJar.jar
+```
+
+To learn more about how to deploy Flink jobs, check out the [deployment guide]({{< ref "docs/deployment/cli" >}}).
+
+## Template for creating an uber/fat JAR with dependencies
+
+To build an application JAR that contains all dependencies required for declared connectors and libraries,
+you can use the following shade plugin definition:
+
+```xml
+<build>
+    <plugins>
+        <plugin>
+            <groupId>org.apache.maven.plugins</groupId>
+            <artifactId>maven-shade-plugin</artifactId>
+            <version>3.2.4</version>
+            <executions>
+                <execution>
+                    <phase>package</phase>
+                    <goals>
+                        <goal>shade</goal>
+                    </goals>
+                    <configuration>
+                        <artifactSet>
+                            <excludes>
+                                <exclude>com.google.code.findbugs:jsr305</exclude>
+                                <exclude>org.slf4j:*</exclude>
+                                <exclude>log4j:*</exclude>
+                            </excludes>
+                        </artifactSet>
+                        <filters>
+                            <filter>
+                                <!-- Do not copy the signatures in the META-INF folder.
+                                Otherwise, this might cause SecurityExceptions when using the JAR. -->
+                                <artifact>*:*</artifact>
+                                <excludes>
+                                    <exclude>META-INF/*.SF</exclude>
+                                    <exclude>META-INF/*.DSA</exclude>
+                                    <exclude>META-INF/*.RSA</exclude>
+                                </excludes>
+                            </filter>
+                        </filters>
+                        <transformers>
+                            <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">

Review comment:
       This is missing the `ServicesResourceTransformer`.
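
       For reference, a sketch of how the `transformers` block could look with it added (the `mainClass`
       value is just a placeholder):

       ```xml
       <transformers>
           <!-- merges META-INF/services entries so that ServiceLoader-based factories still resolve in the shaded JAR -->
           <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
           <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
               <mainClass>org.example.MyJob</mainClass>
           </transformer>
       </transformers>
       ```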

##########
File path: docs/content/docs/dev/configuration/maven.md
##########
@@ -0,0 +1,175 @@
+---
+title: "Using Maven"
+weight: 2
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# How to use Maven to configure your project
+
+You will likely need a build tool to configure your Flink project. This guide will show you how to 
+do so with [Maven](https://maven.apache.org), an open-source build automation tool developed by the 
+Apache Software Foundation that enables you to build, publish, and deploy projects. You can use it to manage the 
+entire lifecycle of your software project.
+
+## Requirements
+
+- Maven 3.0.4 (or higher)
+- Java 8.x
+
+## Importing the project into your IDE
+
+Once the project folder and files have been created, we recommend that you import this project into
+your IDE for developing and testing.
+
+IntelliJ IDEA supports Maven projects out-of-the-box. Eclipse offers the [m2e plugin](http://www.eclipse.org/m2e/) 
+to [import Maven projects](http://books.sonatype.com/m2eclipse-book/reference/creating-sect-importing-projects.html#fig-creating-import).
+
+**Note**: The default JVM heap size for Java may be too small for Flink, and you may have to increase it manually.
+In Eclipse, choose `Run Configurations -> Arguments` and enter `-Xmx800m` in the `VM Arguments` box.
+In IntelliJ IDEA, the recommended way to change JVM options is via the `Help | Edit Custom VM Options` menu.
+See [this article](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties) for details.
+
+**Note on IntelliJ:** To make the applications run within IntelliJ IDEA, it is necessary to tick the
+`Include dependencies with "Provided" scope` box in the run configuration. If this option is not available
+(possibly due to using an older IntelliJ IDEA version), then a workaround is to create a test that
+calls the application's `main()` method.
+
+## Building the project
+
+If you want to build/package your project, navigate to your project directory and run the
+`mvn clean package` command. You will find a JAR file that contains your application (plus connectors
+and libraries that you may have added as dependencies to the application) here: `target/<artifact-id>-<version>.jar`.
+
+__Note:__ If you used a different class than `DataStreamJob` as the application's main class / entry point,
+we recommend you change the `mainClass` setting in the `pom.xml` file accordingly so that Flink
+can run the application from the JAR file without additionally specifying the main class.
+
+## Adding dependencies to the project
+
+Open the `pom.xml` file in your project directory and add the dependency inside
+the `dependencies` section.
+
+For example, you can add the Kafka connector as a dependency like this:
+
+```xml
+<dependencies>
+    
+    <dependency>
+        <groupId>org.apache.flink</groupId>
+        <artifactId>flink-connector-kafka</artifactId>
+        <version>{{< version >}}</version>
+    </dependency>
+    
+</dependencies>
+```
+
+Then execute `mvn install` on the command line. 
+
+Projects created from the `Java Project Template`, the `Scala Project Template`, or Gradle are configured
+to automatically include the application dependencies into the application JAR when you run `mvn clean package`.
+For projects that are not set up from those templates, we recommend adding the Maven Shade Plugin to
+build the application jar with all required dependencies.
+
+**Important:** Note that all these core API dependencies should have their scope set to [*provided*](https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html#dependency-scope). This means that
+they are needed to compile against, but that they should not be packaged into the project's resulting
+application JAR file. If not set to *provided*, the best case scenario is that the resulting JAR
+becomes excessively large, because it also contains all Flink core dependencies. The worst case scenario
+is that the Flink core dependencies that are added to the application's JAR file clash with some of
+your own dependency versions (which is normally avoided through inverted classloading).
+
+To correctly package the dependencies into the application JAR, the application dependencies (such as
+connectors and formats) must be set to the *compile* scope.
+
+## Packaging the application
+
+Depending on your use case, you may need to package your Flink application in different ways before it
+gets deployed to a Flink environment. 
+
+If you want to create a JAR for a Flink Job and use only Flink dependencies without any third-party 
+dependencies (e.g. using the filesystem connector with JSON format), you do not need to create an 
+uber/fat JAR or shade any dependencies.
+
+If you want to create a JAR for a Flink Job and use external dependencies not built into the Flink 
+distribution, you can either add them to the classpath of the distribution or shade them into your 
+uber/fat application JAR.
+
+You can submit the generated uber/fat JAR to a local or remote cluster with:
+
+```sh
+bin/flink run -c org.example.MyJob myFatJar.jar
+```
+
+To learn more about how to deploy Flink jobs, check out the [deployment guide]({{< ref "docs/deployment/cli" >}}).
+
+## Template for creating an uber/fat JAR with dependencies
+
+To build an application JAR that contains all dependencies required for declared connectors and libraries,
+you can use the following shade plugin definition:
+
+```xml
+<build>
+    <plugins>
+        <plugin>
+            <groupId>org.apache.maven.plugins</groupId>
+            <artifactId>maven-shade-plugin</artifactId>
+            <version>3.2.4</version>

Review comment:
       this is different than the version in our quickstarts.

##########
File path: docs/content/docs/dev/configuration/maven.md
##########
@@ -0,0 +1,175 @@
+---
+title: "Using Maven"
+weight: 2
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# How to use Maven to configure your project
+
+You will likely need a build tool to configure your Flink project. This guide will show you how to 
+do so with [Maven](https://maven.apache.org), an open-source build automation tool developed by the 
+Apache Software Foundation that enables you to build, publish, and deploy projects. You can use it to manage the 
+entire lifecycle of your software project.
+
+## Requirements
+
+- Maven 3.0.4 (or higher)
+- Java 8.x
+
+## Importing the project into your IDE
+
+Once the project folder and files have been created, we recommend that you import this project into
+your IDE for developing and testing.

Review comment:
       same issue as with the gradle guide.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 1bc80c1579db9e7f2d57112fdb31f0190f71db67 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277) 
   * 274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] zentol merged pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
zentol merged pull request #18353:
URL: https://github.com/apache/flink/pull/18353


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] matriv commented on a change in pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
matriv commented on a change in pull request #18353:
URL: https://github.com/apache/flink/pull/18353#discussion_r798477397



##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -0,0 +1,219 @@
+---
+title: "Overview"
+weight: 1
+type: docs
+aliases:
+- /dev/project-configuration.html
+- /start/dependencies.html
+- /getting-started/project-setup/dependencies.html
+- /quickstart/java_api_quickstart.html
+- /dev/projectsetup/java_api_quickstart.html
+- /dev/linking_with_flink.html
+- /dev/linking.html
+- /dev/projectsetup/dependencies.html
+- /dev/projectsetup/java_api_quickstart.html
+- /getting-started/project-setup/java_api_quickstart.html
+- /dev/getting-started/project-setup/scala_api_quickstart.html
+- /getting-started/project-setup/scala_api_quickstart.html
+- /quickstart/scala_api_quickstart.html
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Project Configuration
+
+Every Flink application depends on a set of Flink libraries. At a minimum, the application depends
+on the Flink APIs and, in addition, on certain connector libraries (e.g. Kafka or Cassandra).
+When running Flink applications (either in a distributed deployment or locally for testing),
+the [Flink runtime library](https://ossindex.sonatype.org/component/pkg:maven/org.apache.flink/flink-runtime@1.14.3) 
+must be available.
+
+The guides in this section will show you how to configure your projects via popular build tools
+([Maven]({{< ref "docs/dev/configuration/maven" >}}), [Gradle]({{< ref "docs/dev/configuration/gradle" >}})),
+add the necessary dependencies (e.g. [connectors and formats]({{< ref "docs/dev/configuration/connector" >}}),
+[testing]({{< ref "docs/dev/configuration/testing" >}})), and cover some 
+[advanced]({{< ref "docs/dev/configuration/advanced" >}}) configuration topics. 
+
+## Getting started
+
+To get started working on your Flink application, use the following commands, scripts, and templates 
+to create a Flink project.  
+
+{{< tabs "creating project" >}}
+{{< tab "Maven" >}}
+
+You can create a project based on an [Archetype](https://maven.apache.org/guides/introduction/introduction-to-archetypes.html)
+with the Maven command below or use the provided quickstart bash script.
+
+### Maven command
+```bash
+$ mvn archetype:generate                \
+  -DarchetypeGroupId=org.apache.flink   \
+  -DarchetypeArtifactId=flink-quickstart-java \
+  -DarchetypeVersion={{< version >}}
+```
+This allows you to name your newly created project and will interactively ask you for the groupId,
+artifactId, and package name.
+
+### Quickstart script
+```bash
+$ curl https://flink.apache.org/q/quickstart.sh | bash -s {{< version >}}
+```
+
+{{< /tab >}}
+{{< tab "Gradle" >}}
+You can create a project with a Gradle build script or use the provided quickstart bash script.
+
+### Gradle build script
+
+To execute these build configuration scripts, run the `gradle` command in the directory with these scripts.
+
+**build.gradle**
+
+```gradle
+plugins {
+    id 'java'
+    id 'application'
+    // shadow plugin to produce fat JARs
+    id 'com.github.johnrengelman.shadow' version '7.1.2'
+}
+// artifact properties
+group = 'org.myorg.quickstart'
+version = '0.1-SNAPSHOT'
+mainClassName = 'org.myorg.quickstart.StreamingJob'
+description = """Flink Quickstart Job"""
+ext {
+    javaVersion = '1.8'
+    flinkVersion = '{{< version >}}'
+    scalaBinaryVersion = '{{< scala_version >}}'
+    slf4jVersion = '1.7.32'
+    log4jVersion = '2.17.1'
+}
+sourceCompatibility = javaVersion
+targetCompatibility = javaVersion
+tasks.withType(JavaCompile) {
+	options.encoding = 'UTF-8'
+}
+applicationDefaultJvmArgs = ["-Dlog4j.configurationFile=log4j2.properties"]
+
+// declare where to find the dependencies of your project
+repositories {
+    mavenCentral()
+    maven { url "https://repository.apache.org/content/repositories/snapshots/" }

Review comment:
       But what if they want `1.15-SNAPSHOT`?
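
       A possible sketch (reusing the `flinkVersion` property from the quickstart build script) would be
       to add the snapshot repository only when a snapshot version is requested:

       ```gradle
       repositories {
           mavenCentral()
           // only needed when building against a -SNAPSHOT Flink version
           if (flinkVersion.endsWith('-SNAPSHOT')) {
               maven { url "https://repository.apache.org/content/repositories/snapshots/" }
           }
       }
       ```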




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] matriv commented on a change in pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
matriv commented on a change in pull request #18353:
URL: https://github.com/apache/flink/pull/18353#discussion_r798344074



##########
File path: docs/content/docs/dev/configuration/gradle.md
##########
@@ -0,0 +1,120 @@
+---
+title: "Using Gradle"
+weight: 3
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# How to use Gradle to configure your project
+
+You will likely need a build tool to configure your Flink project. This guide will show you how to
+do so with [Gradle](https://gradle.org), an open-source general-purpose build tool that can be used 
+to automate tasks in the development process.
+
+## Requirements
+
+- Gradle 7.x 
+- Java 8.x
+
+## Importing the project into your IDE
+
+Once the project folder and files have been created, we recommend that you import this project into
+your IDE for developing and testing.
+
+IntelliJ IDEA supports Gradle projects via the `Gradle` plugin.
+
+Eclipse does so via the [Eclipse Buildship](https://projects.eclipse.org/projects/tools.buildship)
+plugin (make sure to specify a Gradle version >= 3.0 in the last step of the import wizard; the `shadow`
+plugin requires it). You may also use [Gradle's IDE integration](https://docs.gradle.org/current/userguide/userguide.html#ide-integration)
+to create project files with Gradle.
+
+**Note**: The default JVM heap size for Java may be too small for Flink, and you may have to increase it manually.
+In Eclipse, choose `Run Configurations -> Arguments` and enter `-Xmx800m` in the `VM Arguments` box.
+In IntelliJ IDEA, the recommended way to change JVM options is via the `Help | Edit Custom VM Options` menu.
+See [this article](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties) for details.
+
+**Note on IntelliJ:** To make the applications run within IntelliJ IDEA, it is necessary to tick the
+`Include dependencies with "Provided" scope` box in the run configuration. If this option is not available
+(possibly due to using an older IntelliJ IDEA version), then a workaround is to create a test that
+calls the application's `main()` method.
+
+## Building the project
+
+If you want to __build/package your project__, go to your project directory and
+run the `gradle clean shadowJar` command.
+You will __find a JAR file__ that contains your application, plus connectors and libraries
+that you may have added as dependencies to the application: `build/libs/<project-name>-<version>-all.jar`.
+
+__Note:__ If you use a different class than *StreamingJob* as the application's main class / entry point,
+we recommend you change the `mainClassName` setting in the `build.gradle` file accordingly. That way, Flink
+can run the application from the JAR file without additionally specifying the main class.
+
+## Adding dependencies to the project
+
+Specify a dependency configuration in the dependencies block of your `build.gradle` file.
+
+For example, you can add the Kafka connector as a dependency like this:
+
+**build.gradle**
+
+```gradle
+...
+dependencies {
+    ...  
+    flinkShadowJar "org.apache.flink:flink-connector-kafka:${flinkVersion}"
+    ...
+}
+...
+```
+
+**Important:** Note that all these (core) dependencies should have their scope set to [*provided*](https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html#dependency-scope). This means that
+they are needed to compile against, but that they should not be packaged into the project's resulting
+application JAR file. If not set to *provided*, the best case scenario is that the resulting JAR
+becomes excessively large, because it also contains all Flink core dependencies. The worst case scenario
+is that the Flink core dependencies that are added to the application's JAR file clash with some of
+your own dependency versions (which is normally avoided through inverted classloading).
+
+To correctly package the dependencies into the application JAR, these application dependencies must
+be set to the *compile* scope.
+
+## Packaging the application
+
+Depending on your use case, you may need to package your Flink application in different ways before 
+it gets deployed to a Flink environment.
+
+If you want to create a JAR for a Flink Job and use only Flink dependencies without any third-party
+dependencies (e.g. using the filesystem connector with JSON format), you do not need to create an
+uber/fat JAR or shade any dependencies.

Review comment:
       @infoverload Please mention the `installDist` option here.
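
       For example, a short sketch of that route (the `installDist` task comes from the `application`
       plugin that the quickstart build script already applies):

       ```sh
       # build an exploded, runnable distribution instead of a fat JAR
       gradle clean installDist
       # the output lands under build/install/<project-name>/ with start scripts in bin/ and the runtime classpath in lib/
       ```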




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495",
       "triggerID" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516",
       "triggerID" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "triggerType" : "PUSH"
     }, {
       "hash" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528",
       "triggerID" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30537",
       "triggerID" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1845471184d68e8edd89fd19a591030290695cf3",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30593",
       "triggerID" : "1845471184d68e8edd89fd19a591030290695cf3",
       "triggerType" : "PUSH"
     }, {
       "hash" : "e33ad31f47e113b480de1c4f6ac6efc40992084b",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30651",
       "triggerID" : "e33ad31f47e113b480de1c4f6ac6efc40992084b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "78c5075dd300c8e74705afbb13b10377da61865a",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30675",
       "triggerID" : "78c5075dd300c8e74705afbb13b10377da61865a",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * e33ad31f47e113b480de1c4f6ac6efc40992084b Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30651) 
   * 78c5075dd300c8e74705afbb13b10377da61865a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30675) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] MartijnVisser commented on a change in pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
MartijnVisser commented on a change in pull request #18353:
URL: https://github.com/apache/flink/pull/18353#discussion_r796397547



##########
File path: docs/content/docs/dev/table/overview.md
##########
@@ -40,8 +40,8 @@ and later use the DataStream API to build alerting based on the matched patterns
 
 ## Table Program Dependencies
 
-Depending on the target programming language, you need to add the Java or Scala API to a project
-in order to use the Table API & SQL for defining pipelines.
+Depending on the target programming language, you need to add the Table API to a project in order to
+use the Table API & SQL for defining data pipelines.

Review comment:
       I would +1 removal of these tabs here. The overview page for Table should be comparable to the current DataStream Overview, so an overview of the Table API's capabilities, etc. We've moved the setup to the project configuration page, which fits better. Python is kind of a weird one with its own requirements, so for now it is better to link to the Python docs.
   
   So instead of having a section Table Program Dependencies, perhaps just a link to the project configuration page with a text like "Check out the project configuration for setting up your environment so you can start building a Table API program"? (But then in nicer phrasing 😅 )




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495",
       "triggerID" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516",
       "triggerID" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "triggerType" : "PUSH"
     }, {
       "hash" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528",
       "triggerID" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 50e221c6373f2774eeb8ec134df27caef83b2d40 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495) 
   * 8e5ae4098896a176d49d4b7319f69b6529412549 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516) 
   * 532af8b3b761f7b589500e5ead9b888f2370675a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528) 
   * 3de39d65f624fb6cf075e9e5ddfe005c197ee3dc UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 1bc80c1579db9e7f2d57112fdb31f0190f71db67 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277) 
   * 274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312) 
   * 26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316) 
   * 6f44bc92aca6fc133b10cefd438f99810ce65293 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 1bc80c1579db9e7f2d57112fdb31f0190f71db67 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277) 
   * 274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312) 
   * 26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316) 
   * 6f44bc92aca6fc133b10cefd438f99810ce65293 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321) 
   * 1a637d30a2332b6c7059be2118a3cb8d40718eb9 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495",
       "triggerID" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516",
       "triggerID" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "triggerType" : "PUSH"
     }, {
       "hash" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528",
       "triggerID" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30537",
       "triggerID" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1845471184d68e8edd89fd19a591030290695cf3",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30593",
       "triggerID" : "1845471184d68e8edd89fd19a591030290695cf3",
       "triggerType" : "PUSH"
     }, {
       "hash" : "e33ad31f47e113b480de1c4f6ac6efc40992084b",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30651",
       "triggerID" : "e33ad31f47e113b480de1c4f6ac6efc40992084b",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 1845471184d68e8edd89fd19a591030290695cf3 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30593) 
   * e33ad31f47e113b480de1c4f6ac6efc40992084b Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30651) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] zentol commented on a change in pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
zentol commented on a change in pull request #18353:
URL: https://github.com/apache/flink/pull/18353#discussion_r799237764



##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -181,22 +171,6 @@ rootProject.name = 'quickstart'
 bash -c "$(curl https://flink.apache.org/q/gradle-quickstart.sh)" -- {{< version >}} {{< scala_version >}}
 ```
 {{< /tab >}}
-{{< tab "sbt" >}}
-You can scaffold a new Flink project with the following [giter8 template](https://github.com/tillrohrmann/flink-project.g8)
-and the `sbt new` command (which creates new build definitions from a template) or use the provided quickstart bash script.
-
-### sbt template
-
-```bash
-$ sbt new tillrohrmann/flink-project.g8

Review comment:
       Why was this removed?




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] matriv commented on a change in pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
matriv commented on a change in pull request #18353:
URL: https://github.com/apache/flink/pull/18353#discussion_r798478483



##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -0,0 +1,219 @@
+---
+title: "Overview"
+weight: 1
+type: docs
+aliases:
+- /dev/project-configuration.html
+- /start/dependencies.html
+- /getting-started/project-setup/dependencies.html
+- /quickstart/java_api_quickstart.html
+- /dev/projectsetup/java_api_quickstart.html
+- /dev/linking_with_flink.html
+- /dev/linking.html
+- /dev/projectsetup/dependencies.html
+- /dev/projectsetup/java_api_quickstart.html
+- /getting-started/project-setup/java_api_quickstart.html
+- /dev/getting-started/project-setup/scala_api_quickstart.html
+- /getting-started/project-setup/scala_api_quickstart.html
+- /quickstart/scala_api_quickstart.html
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Project Configuration
+
+Every Flink application depends on a set of Flink libraries. At a minimum, the application depends
+on the Flink APIs and, in addition, on certain connector libraries (i.e. Kafka, Cassandra).
+When running Flink applications (either in a distributed deployment or locally for testing),
+the [Flink runtime library](https://ossindex.sonatype.org/component/pkg:maven/org.apache.flink/flink-runtime@1.14.3) 
+must be available.
+
+The guides in this section will show you how to configure your projects via popular build tools
+([Maven]({{< ref "docs/dev/configuration/maven" >}}), [Gradle]({{< ref "docs/dev/configuration/gradle" >}}),
+add the necessary dependencies (i.e. [connectors and formats]({{< ref "docs/dev/configuration/connector" >}}), 
+[testing]({{< ref "docs/dev/configuration/testing" >}})), and cover some 
+[advanced]({{< ref "docs/dev/configuration/advanced" >}}) configuration topics. 
+
+## Getting started
+
+To get started working on your Flink application, use the following commands, scripts, and templates 
+to create a Flink project.  
+
+{{< tabs "creating project" >}}
+{{< tab "Maven" >}}
+
+You can create a project based on an [Archetype](https://maven.apache.org/guides/introduction/introduction-to-archetypes.html)
+with the Maven command below or use the provided quickstart bash script.
+
+### Maven command
+```bash
+$ mvn archetype:generate                \
+  -DarchetypeGroupId=org.apache.flink   \
+  -DarchetypeArtifactId=flink-quickstart-java \
+  -DarchetypeVersion={{< version >}}
+```
+This allows you to name your newly created project and will interactively ask you for the groupId,
+artifactId, and package name.
+
+### Quickstart script
+```bash
+$ curl https://flink.apache.org/q/quickstart.sh | bash -s {{< version >}}
+```
+
+{{< /tab >}}
+{{< tab "Gradle" >}}
+You can create a project with a Gradle build script or use the provided quickstart bash script.
+
+### Gradle build script
+
+To execute these build configuration scripts, run the `gradle` command in the directory with these scripts.
+
+**build.gradle**
+
+```gradle
+plugins {
+    id 'java'
+    id 'application'
+    // shadow plugin to produce fat JARs
+    id 'com.github.johnrengelman.shadow' version '7.1.2'
+}
+// artifact properties
+group = 'org.myorg.quickstart'
+version = '0.1-SNAPSHOT'
+mainClassName = 'org.myorg.quickstart.StreamingJob'
+description = """Flink Quickstart Job"""
+ext {
+    javaVersion = '1.8'
+    flinkVersion = '{{< version >}}'
+    scalaBinaryVersion = '{{< scala_version >}}'
+    slf4jVersion = '1.7.32'
+    log4jVersion = '2.17.1'
+}
+sourceCompatibility = javaVersion
+targetCompatibility = javaVersion
+tasks.withType(JavaCompile) {
+	options.encoding = 'UTF-8'
+}
+applicationDefaultJvmArgs = ["-Dlog4j.configurationFile=log4j2.properties"]
+
+// declare where to find the dependencies of your project
+repositories {
+    mavenCentral()
+    maven { url "https://repository.apache.org/content/repositories/snapshots/" }
+}
+// NOTE: We cannot use "compileOnly" or "shadow" configurations since then we could not run code
+// in the IDE or with "gradle run". We also cannot exclude transitive dependencies from the
+// shadowJar yet (see https://github.com/johnrengelman/shadow/issues/159).
+// -> Explicitly define the // libraries we want to be included in the "flinkShadowJar" configuration!
+configurations {
+    flinkShadowJar // dependencies which go into the shadowJar
+    // always exclude these (also from transitive dependencies) since they are provided by Flink
+    flinkShadowJar.exclude group: 'org.apache.flink', module: 'force-shading'
+    flinkShadowJar.exclude group: 'com.google.code.findbugs', module: 'jsr305'
+    flinkShadowJar.exclude group: 'org.slf4j'
+    flinkShadowJar.exclude group: 'org.apache.logging.log4j'
+}
+// declare the dependencies for your production and test code
+dependencies {
+    // --------------------------------------------------------------
+    // Compile-time dependencies that should NOT be part of the
+    // shadow jar and are provided in the lib folder of Flink
+    // --------------------------------------------------------------
+    implementation "org.apache.flink:flink-streaming-java:${flinkVersion}"
+    implementation "org.apache.flink:flink-clients:${flinkVersion}"

Review comment:
       It's a mistake I made there, I'll fix it. Thanks!




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] zentol commented on a change in pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
zentol commented on a change in pull request #18353:
URL: https://github.com/apache/flink/pull/18353#discussion_r798502467



##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -0,0 +1,219 @@
+---
+title: "Overview"
+weight: 1
+type: docs
+aliases:
+- /dev/project-configuration.html
+- /start/dependencies.html
+- /getting-started/project-setup/dependencies.html
+- /quickstart/java_api_quickstart.html
+- /dev/projectsetup/java_api_quickstart.html
+- /dev/linking_with_flink.html
+- /dev/linking.html
+- /dev/projectsetup/dependencies.html
+- /dev/projectsetup/java_api_quickstart.html
+- /getting-started/project-setup/java_api_quickstart.html
+- /dev/getting-started/project-setup/scala_api_quickstart.html
+- /getting-started/project-setup/scala_api_quickstart.html
+- /quickstart/scala_api_quickstart.html
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Project Configuration
+
+Every Flink application depends on a set of Flink libraries. At a minimum, the application depends
+on the Flink APIs and, in addition, on certain connector libraries (i.e. Kafka, Cassandra).
+When running Flink applications (either in a distributed deployment or locally for testing),
+the [Flink runtime library](https://ossindex.sonatype.org/component/pkg:maven/org.apache.flink/flink-runtime@1.14.3) 
+must be available.
+
+The guides in this section will show you how to configure your projects via popular build tools
+([Maven]({{< ref "docs/dev/configuration/maven" >}}), [Gradle]({{< ref "docs/dev/configuration/gradle" >}}),
+add the necessary dependencies (i.e. [connectors and formats]({{< ref "docs/dev/configuration/connector" >}}), 
+[testing]({{< ref "docs/dev/configuration/testing" >}})), and cover some 
+[advanced]({{< ref "docs/dev/configuration/advanced" >}}) configuration topics. 
+
+## Getting started
+
+To get started working on your Flink application, use the following commands, scripts, and templates 
+to create a Flink project.  
+
+{{< tabs "creating project" >}}
+{{< tab "Maven" >}}
+
+You can create a project based on an [Archetype](https://maven.apache.org/guides/introduction/introduction-to-archetypes.html)
+with the Maven command below or use the provided quickstart bash script.
+
+### Maven command
+```bash
+$ mvn archetype:generate                \
+  -DarchetypeGroupId=org.apache.flink   \
+  -DarchetypeArtifactId=flink-quickstart-java \
+  -DarchetypeVersion={{< version >}}
+```
+This allows you to name your newly created project and will interactively ask you for the groupId,
+artifactId, and package name.
+
+### Quickstart script
+```bash
+$ curl https://flink.apache.org/q/quickstart.sh | bash -s {{< version >}}
+```
+
+{{< /tab >}}
+{{< tab "Gradle" >}}
+You can create a project with a Gradle build script or use the provided quickstart bash script.
+
+### Gradle build script
+
+To execute these build configuration scripts, run the `gradle` command in the directory with these scripts.
+
+**build.gradle**
+
+```gradle
+plugins {
+    id 'java'
+    id 'application'
+    // shadow plugin to produce fat JARs
+    id 'com.github.johnrengelman.shadow' version '7.1.2'
+}
+// artifact properties
+group = 'org.myorg.quickstart'
+version = '0.1-SNAPSHOT'
+mainClassName = 'org.myorg.quickstart.StreamingJob'
+description = """Flink Quickstart Job"""
+ext {
+    javaVersion = '1.8'
+    flinkVersion = '{{< version >}}'
+    scalaBinaryVersion = '{{< scala_version >}}'
+    slf4jVersion = '1.7.32'
+    log4jVersion = '2.17.1'
+}
+sourceCompatibility = javaVersion
+targetCompatibility = javaVersion
+tasks.withType(JavaCompile) {
+	options.encoding = 'UTF-8'
+}
+applicationDefaultJvmArgs = ["-Dlog4j.configurationFile=log4j2.properties"]
+
+// declare where to find the dependencies of your project
+repositories {
+    mavenCentral()
+    maven { url "https://repository.apache.org/content/repositories/snapshots/" }

Review comment:
       Then they need to figure out how to use it. Users are not the target audience for snapshot artifacts; in fact we are not allowed to advertise such artifacts.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495",
       "triggerID" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516",
       "triggerID" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "triggerType" : "PUSH"
     }, {
       "hash" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528",
       "triggerID" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30537",
       "triggerID" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1845471184d68e8edd89fd19a591030290695cf3",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30593",
       "triggerID" : "1845471184d68e8edd89fd19a591030290695cf3",
       "triggerType" : "PUSH"
     }, {
       "hash" : "e33ad31f47e113b480de1c4f6ac6efc40992084b",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30651",
       "triggerID" : "e33ad31f47e113b480de1c4f6ac6efc40992084b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "78c5075dd300c8e74705afbb13b10377da61865a",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "78c5075dd300c8e74705afbb13b10377da61865a",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * e33ad31f47e113b480de1c4f6ac6efc40992084b Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30651) 
   * 78c5075dd300c8e74705afbb13b10377da61865a UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] matriv commented on a change in pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
matriv commented on a change in pull request #18353:
URL: https://github.com/apache/flink/pull/18353#discussion_r795703782



##########
File path: docs/content/docs/dev/configuration/gradle.md
##########
@@ -0,0 +1,110 @@
+---
+title: "Using Gradle"
+weight: 3
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# How to use Gradle to configure your project

Review comment:
       I think we can link to https://nightlies.apache.org/flink/flink-docs-release-1.14/docs/dev/datastream/project-configuration/#gradle and create an issue to update that page to the latest Gradle 7.x version, since the version currently documented there (`3.1`) is very old.

##########
File path: docs/content/docs/dev/configuration/gradle.md
##########
@@ -0,0 +1,110 @@
+---
+title: "Using Gradle"
+weight: 3
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# How to use Gradle to configure your project
+
+You will likely need a build tool to configure your Flink project. This guide will show you how to
+do so with [Gradle](https://gradle.org), an open-source general-purpose build tool that can be used 
+to automate tasks in the development process.
+
+## Requirements
+
+- Gradle 3.x (or higher)
+- Java 8.x
+
+## Importing the project into your IDE
+
+Once the project folder and files have been created, we recommend that you import this project into
+your IDE for developing and testing.
+
+
+IntelliJ IDEA supports Gradle projects via the `Gradle` plugin.
+
+Eclipse does so via the [Eclipse Buildship](https://projects.eclipse.org/projects/tools.buildship)
+plugin (make sure to specify a Gradle version >= 3.0 in the last step of the import wizard; the `shadow`
+plugin requires it). You may also use [Gradle's IDE integration](https://docs.gradle.org/current/userguide/userguide.html#ide-integration)
+to create project files with Gradle.
+
+*Note*: The default JVM heap size for Java may be too small for Flink and you have to manually increase it.
+In Eclipse, choose `Run Configurations -> Arguments` and write into the `VM Arguments` box: `-Xmx800m`.
+In IntelliJ IDEA recommended way to change JVM options is from the `Help | Edit Custom VM Options` menu.
+See [this article](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties) for details.
+
+## Building the project
+
+If you want to __build/package your project__, go to your project directory and
+run the '`gradle clean shadowJar`' command.
+You will __find a JAR file__ that contains your application, plus connectors and libraries
+that you may have added as dependencies to the application: `build/libs/<project-name>-<version>-all.jar`.
+
+__Note:__ If you use a different class than *StreamingJob* as the application's main class / entry point,
+we recommend you change the `mainClassName` setting in the `build.gradle` file accordingly. That way, Flink
+can run the application from the JAR file without additionally specifying the main class.
+
+## Adding dependencies to the project
+
+Specify a dependency configuration in the dependencies block of your `build.gradle` file.
+
+For example, you can add the Kafka connector as a dependency like this:
+
+**build.gradle**
+
+```gradle
+...
+dependencies {
+    ...  
+    flinkShadowJar "org.apache.flink:flink-connector-kafka:${flinkVersion}"
+    ...
+}
+...
+```
+
+**Important:** Note that all these (core) dependencies should have their scope set to [*provided*](https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html#dependency-scope). This means that
+they are needed to compile against, but that they should not be packaged into the project's resulting
+application JAR file. If not set to *provided*, the best case scenario is that the resulting JAR
+becomes excessively large, because it also contains all Flink core dependencies. The worst case scenario
+is that the Flink core dependencies that are added to the application's JAR file clash with some of
+your own dependency versions (which is normally avoided through inverted classloading).
+
+To correctly package the dependencies into the application JAR, these application dependencies must
+be set to the *compile* scope.
+
+## Packaging the application
+
+Depending on your use case, you may need to package your Flink application in different ways before it
+gets deployed to a Flink environment.
+
+If you want to create a JAR for a Flink Job and use only Flink dependencies without any third-party
+dependencies (i.e. using the filesystem connector with JSON format), you do not need to create an
+uber/fat JAR or shade any dependencies.
+
+If you want to create a JAR for a Flink Job and use external dependencies not built into the Flink
+distribution, you can either add them to the classpath of the distribution or shade them into your
+uber/fat application JAR.
+
+With the generated uber/fat JAR, you can submit it to a local or remote cluster with:
+
+```sh
+bin/flink run -c org.example.MyJob myFatJar.jar
+```

Review comment:
       Based on the configuration described here: https://nightlies.apache.org/flink/flink-docs-release-1.14/docs/dev/datastream/project-configuration/#gradle, the user can run `gradle clean installDist` and `gradle clean installShadowDist`, or `./gradlew clean installDist` and `./gradlew clean installShadowDist` if they want to use the self-contained Gradle wrapper instead of a system-wide Gradle installation. A small sketch of these invocations follows below.
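       For illustration, a minimal sketch of those invocations (assuming the generated project ships the standard `gradlew` wrapper scripts next to `build.gradle`):

    ```bash
    # assemble and install the regular application distribution under build/install/
    gradle clean installDist

    # assemble and install the shaded (fat-JAR) distribution produced by the shadow plugin
    gradle clean installShadowDist

    # the same via the project-local Gradle wrapper, no system-wide Gradle installation needed
    ./gradlew clean installDist
    ./gradlew clean installShadowDist
    ```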
   




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495",
       "triggerID" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 50e221c6373f2774eeb8ec134df27caef83b2d40 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 1bc80c1579db9e7f2d57112fdb31f0190f71db67 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 1bc80c1579db9e7f2d57112fdb31f0190f71db67 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277) 
   * 274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312) 
   * 26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316) 
   * 6f44bc92aca6fc133b10cefd438f99810ce65293 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * d0b2b188c37443b7bbda39af499398326cd56979 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207) 
   * 1bc80c1579db9e7f2d57112fdb31f0190f71db67 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] MartijnVisser commented on a change in pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
MartijnVisser commented on a change in pull request #18353:
URL: https://github.com/apache/flink/pull/18353#discussion_r789763661



##########
File path: docs/content/docs/dev/configuration/advanced.md
##########
@@ -0,0 +1,203 @@
+---
+title: "Advanced Configuration"
+weight: 10
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Dependencies: Flink Core and User Application
+
+There are two broad categories of dependencies and libraries in Flink, which are explained below.
+
+## Flink Core Dependencies
+
+Flink itself consists of a set of classes and dependencies that form the core of Flink's runtime
+and must be present when a Flink application is started. The classes and dependencies needed to run
+the system handle areas such as coordination, networking, checkpointing, failover, APIs,
+operations (such as windowing), resource management, etc.

Review comment:
       I'm wondering if it's operations or operators. I think the latter might be more correct.

##########
File path: docs/content/docs/connectors/table/elasticsearch.md
##########
@@ -40,6 +40,9 @@ Dependencies
 
 {{< sql_download_table "elastic" >}}
 
+The Elastic connector is not currently part of the binary distribution.

Review comment:
       Should be Elasticsearch

##########
File path: docs/content/docs/dev/configuration/advanced.md
##########
@@ -0,0 +1,203 @@
+---
+title: "Advanced Configuration"
+weight: 10
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Dependencies: Flink Core and User Application
+
+There are two broad categories of dependencies and libraries in Flink, which are explained below.
+
+## Flink Core Dependencies
+
+Flink itself consists of a set of classes and dependencies that form the core of Flink's runtime
+and must be present when a Flink application is started. The classes and dependencies needed to run
+the system handle areas such as coordination, networking, checkpointing, failover, APIs,
+operations (such as windowing), resource management, etc.
+
+These core classes and dependencies are packaged in the `flink-dist` jar, are part of Flink's `lib`
+folder, and part of the basic Flink container images. You can think of these dependencies as similar
+to Java's core library (i.e. `rt.jar`, `charsets.jar`), which contains classes like `String` and `List`.
+
+In order to keep the core dependencies as small as possible and avoid dependency clashes, the
+Flink Core Dependencies do not contain any connectors or libraries (i.e. CEP, SQL, ML) in order to
+avoid having an excessive default number of classes and dependencies in the classpath.
+
+## User Application Dependencies
+
+These dependencies include all connectors, formats, or libraries that a specific user application
+needs and explicitly do not include the Flink DataStream APIs and runtime dependencies since those
+are already part of the Flink Core Dependencies.
+
+The user application is typically packaged into an *application jar*, which contains the application
+code and the required connector and library dependencies.
+
+## IDE configuration
+
+The default JVM heap size for Java may be too small for Flink and you have to manually increase it.
+In Eclipse, choose `Run Configurations -> Arguments` and write `-Xmx800m` into the `VM Arguments` box.
+In IntelliJ IDEA, the recommended way to change JVM options is from the `Help | Edit Custom VM Options` menu.
+See [this article](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties) for details.
+
+**Note on IntelliJ:** To make the applications run within IntelliJ IDEA, it is necessary to tick the
+`Include dependencies with "Provided" scope` box in the run configuration. If this option is not available
+(possibly due to using an older IntelliJ IDEA version), then a workaround is to create a test that
+calls the application's `main()` method.
+
+# Scala Versions
+
+Different Scala versions are not binary compatible with one another. For that reason, Flink for 
+Scala 2.11 cannot be used with an application that uses Scala 2.12. All Flink dependencies that 
+(transitively) depend on Scala are suffixed with the Scala version that they are built for 
+(i.e. `flink-streaming-scala_2.12`).
+
+If you are only using Java, you can use any Scala version. If you are using Scala, you need to pick 
+the Scala version that matches the application's Scala version.
+
+Please refer to the [build guide]({{< ref "docs/flinkDev/building" >}}#scala-versions) for details 
+on how to build Flink for a specific Scala version.
+
+Scala versions after 2.12.8 are not binary compatible with previous 2.12.x versions. This prevents
+the Flink project from upgrading its 2.12.x builds beyond 2.12.8. You can build Flink locally for
+later Scala versions by following the [build guide]({{< ref "docs/flinkDev/building" >}}#scala-versions).
+For this to work, you will need to add `-Djapicmp.skip` to skip binary compatibility checks when building.
+
+See the [Scala 2.12.8 release notes](https://github.com/scala/scala/releases/tag/v2.12.8) for more details.
+The relevant section states:
+
+> The second fix is not binary compatible: the 2.12.8 compiler omits certain
+> methods that are generated by earlier 2.12 compilers. However, we believe
+> that these methods are never used and existing compiled code will continue to
+> work.  See the [pull request
+> description](https://github.com/scala/scala/pull/7469) for more details.
+
+# Distribution
+
+The Flink distribution contains by default the required JARs to execute Flink SQL Jobs in `/lib`, in particular:
+
+-`flink-table-api-java-uber-{{< version >}}.jar` containing all the Java APIs
+-`flink-table-runtime-{{< version >}}.jar` containing the runtime
+-`flink-table-planner-loader-{{< version >}}.jar` containing the query planner
+
+When using formats and connectors with the Scala API, you need to either download and manually include 

Review comment:
       Flink Scala API

##########
File path: docs/content/docs/dev/configuration/advanced.md
##########
@@ -0,0 +1,203 @@
+---
+title: "Advanced Configuration"
+weight: 10
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Dependencies: Flink Core and User Application
+
+There are two broad categories of dependencies and libraries in Flink, which are explained below.
+
+## Flink Core Dependencies
+
+Flink itself consists of a set of classes and dependencies that form the core of Flink's runtime
+and must be present when a Flink application is started. The classes and dependencies needed to run
+the system handle areas such as coordination, networking, checkpointing, failover, APIs,
+operations (such as windowing), resource management, etc.
+
+These core classes and dependencies are packaged in the `flink-dist` jar, are part of Flink's `lib`
+folder, and part of the basic Flink container images. You can think of these dependencies as similar
+to Java's core library (i.e. `rt.jar`, `charsets.jar`), which contains classes like `String` and `List`.
+
+In order to keep the core dependencies as small as possible and avoid dependency clashes, the
+Flink Core Dependencies do not contain any connectors or libraries (i.e. CEP, SQL, ML) in order to
+avoid having an excessive default number of classes and dependencies in the classpath.
+
+## User Application Dependencies
+
+These dependencies include all connectors, formats, or libraries that a specific user application
+needs and explicitly do not include the Flink DataStream APIs and runtime dependencies since those
+are already part of the Flink Core Dependencies.
+
+The user application is typically packaged into an *application jar*, which contains the application
+code and the required connector and library dependencies.
+
+## IDE configuration
+
+The default JVM heap size for Java may be too small for Flink and you have to manually increase it.
+In Eclipse, choose `Run Configurations -> Arguments` and write `-Xmx800m` into the `VM Arguments` box.
+In IntelliJ IDEA, the recommended way to change JVM options is from the `Help | Edit Custom VM Options` menu.
+See [this article](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties) for details.
+
+**Note on IntelliJ:** To make the applications run within IntelliJ IDEA, it is necessary to tick the
+`Include dependencies with "Provided" scope` box in the run configuration. If this option is not available
+(possibly due to using an older IntelliJ IDEA version), then a workaround is to create a test that
+calls the application's `main()` method.
+
+# Scala Versions
+
+Different Scala versions are not binary compatible with one another. For that reason, Flink for 
+Scala 2.11 cannot be used with an application that uses Scala 2.12. All Flink dependencies that 
+(transitively) depend on Scala are suffixed with the Scala version that they are built for 
+(e.g. `flink-streaming-scala_2.12`).
+
+If you are only using Java, you can use any Scala version of Flink. If you are using Scala, you need to pick 
+the Flink dependencies whose Scala version matches your application's Scala version.
+
+Please refer to the [build guide]({{< ref "docs/flinkDev/building" >}}#scala-versions) for details 
+on how to build Flink for a specific Scala version.
+
+Scala versions after 2.12.8 are not binary compatible with previous 2.12.x versions. This prevents
+the Flink project from upgrading its 2.12.x builds beyond 2.12.8. You can build Flink locally for
+later Scala versions by following the [build guide]({{< ref "docs/flinkDev/building" >}}#scala-versions).
+For this to work, you will need to add `-Djapicmp.skip` to skip binary compatibility checks when building.
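+
+For illustration, such a local build could be invoked roughly like this (a sketch only; the exact Maven
+properties for selecting the Scala version are described in the build guide, and the patch version shown
+is just an example):
+
+```bash
+# Sketch: build Flink from source against a newer Scala 2.12 patch version.
+# -Djapicmp.skip disables the binary compatibility checks mentioned above;
+# the scala.version property and the version number are illustrative assumptions.
+mvn clean install -DskipTests -Dscala.version=2.12.15 -Djapicmp.skip
+```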
+
+See the [Scala 2.12.8 release notes](https://github.com/scala/scala/releases/tag/v2.12.8) for more details.
+The relevant section states:
+
+> The second fix is not binary compatible: the 2.12.8 compiler omits certain
+> methods that are generated by earlier 2.12 compilers. However, we believe
+> that these methods are never used and existing compiled code will continue to
+> work.  See the [pull request
+> description](https://github.com/scala/scala/pull/7469) for more details.
+
+# Distribution
+
+The Flink distribution contains by default the required JARs to execute Flink SQL Jobs in `/lib`, in particular:
+
+- `flink-table-api-java-uber-{{< version >}}.jar` containing all the Java APIs
+- `flink-table-runtime-{{< version >}}.jar` containing the runtime
+- `flink-table-planner-loader-{{< version >}}.jar` containing the query planner
+
+When using formats and connectors with the Scala API, you either need to download the JARs and manually include 
+them in the `/lib` folder (recommended), or shade them into the uber JAR of your Flink SQL Jobs.
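+
+For illustration, making a downloaded format or connector JAR available to the distribution could look
+like this (a sketch only; the artifact name below is a hypothetical placeholder, and `FLINK_HOME` is
+assumed to point at the unpacked distribution):
+
+```bash
+# Sketch: copy a downloaded connector/format JAR into the distribution's lib folder.
+# "flink-sql-connector-example-1.0.jar" is a hypothetical placeholder artifact.
+cp ./flink-sql-connector-example-1.0.jar "$FLINK_HOME"/lib/
+```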
+
+For more details, check out [Connect to External Systems]({{< ref "docs/connectors/table/overview" >}}).
+
+# Table Planner and Table Planner Loader
+
+Starting from Flink 1.15, the distribution contains two planners:
+
+- `flink-table-planner{{< scala_version >}}-{{< version >}}.jar`, in `/opt`, contains the query planner
+- `flink-table-planner-loader-{{< version >}}.jar`, loaded by default in `/lib`, contains the query planner 
+  hidden behind an isolated classpath (you won't be able to address any `org.apache.flink.table.planner` class directly)
+
+The two planners contain the same code, but they are packaged differently. With `flink-table-planner{{< scala_version >}}`, 
+you must use a JAR that matches your Scala version. With `flink-table-planner-loader`, you do not need to 
+worry about the Scala version, since it is hidden inside the JAR.
+
+If you need to access and use the internals of the query planner, you can swap the two JARs (by copying 
+them into the corresponding folders of the downloaded distribution). Be aware that you will then be 
+constrained to the Scala version that your Flink distribution was built with.
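+
+A sketch of what that swap could look like on an unpacked distribution (paths, Scala suffix, and
+version are illustrative; adjust them to your download):
+
+```bash
+# Sketch: expose the planner internals by swapping the two JARs.
+# Moves the loader out of lib/ and puts the Scala-suffixed planner in its place.
+mv lib/flink-table-planner-loader-*.jar opt/
+cp opt/flink-table-planner_2.12-*.jar lib/
+```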
+
+**Note:** The two planners cannot co-exist in the classpath. If you load both of them
+in `/lib`, your Table Jobs will fail.
+
+# Hadoop Dependencies
+
+**General rule:** It should not be necessary to add Hadoop dependencies directly to your application.
+The only exception is when you use existing Hadoop input/output formats with [Flink's Hadoop compatibility 
+wrappers](https://nightlies.apache.org/flink/flink-docs-master/docs/dev/dataset/hadoop_compatibility/).
+
+If you want to use Flink with Hadoop, you need to have a Flink setup that includes the Hadoop dependencies, 
+rather than adding Hadoop as an application dependency. In other words, Hadoop must be a dependency 
+of the Flink system itself and not of the user code that contains the application. Flink will use the
+Hadoop dependencies specified by the `HADOOP_CLASSPATH` environment variable, which can be set like this:
+
+```bash
+export HADOOP_CLASSPATH=`hadoop classpath`
+```
+
+There are two main reasons for this design:
+
+- Some Hadoop interaction happens in Flink's core, possibly before the user application is started. 

Review comment:
       interactions

##########
File path: docs/content/docs/dev/configuration/advanced.md
##########
@@ -0,0 +1,203 @@
+---
+title: "Advanced Configuration"
+weight: 10
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Dependencies: Flink Core and User Application
+
+There are two broad categories of dependencies and libraries in Flink, which are explained below.
+
+## Flink Core Dependencies
+
+Flink itself consists of a set of classes and dependencies that form the core of Flink's runtime
+and must be present when a Flink application is started. The classes and dependencies needed to run
+the system handle areas such as coordination, networking, checkpointing, failover, APIs,
+operations (such as windowing), resource management, etc.
+
+These core classes and dependencies are packaged in the `flink-dist` jar, are part of Flink's `lib`
+folder, and are part of the basic Flink container images. You can think of these dependencies as similar
+to Java's core library (e.g. `rt.jar`, `charsets.jar`), which contains classes like `String` and `List`.
+
+To keep the core dependencies as small as possible and to avoid dependency clashes, the
+Flink Core Dependencies do not contain any connectors or libraries (e.g. CEP, SQL, ML), which would
+otherwise add an excessive number of classes and dependencies to the default classpath.
+
+## User Application Dependencies
+
+These dependencies include all connectors, formats, or libraries that a specific user application
+needs and explicitly do not include the Flink DataStream APIs and runtime dependencies since those
+are already part of the Flink Core Dependencies.
+
+The user application is typically packaged into an *application jar*, which contains the application
+code and the required connector and library dependencies.
+
+## IDE configuration
+
+The default JVM heap size for Java may be too small for Flink and you have to manually increase it.
+In Eclipse, choose `Run Configurations -> Arguments` and write `-Xmx800m` into the `VM Arguments` box.
+In IntelliJ IDEA, the recommended way to change JVM options is from the `Help | Edit Custom VM Options` menu.
+See [this article](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties) for details.
+
+**Note on IntelliJ:** To make the applications run within IntelliJ IDEA, it is necessary to tick the
+`Include dependencies with "Provided" scope` box in the run configuration. If this option is not available
+(possibly due to using an older IntelliJ IDEA version), then a workaround is to create a test that
+calls the application's `main()` method.
+
+# Scala Versions
+
+Different Scala versions are not binary compatible with one another. For that reason, Flink for 
+Scala 2.11 cannot be used with an application that uses Scala 2.12. All Flink dependencies that 

Review comment:
       Flink doesn't support Scala 2.11 anymore since the next release

##########
File path: docs/content/docs/dev/configuration/advanced.md
##########
@@ -0,0 +1,203 @@
+---
+title: "Advanced Configuration"
+weight: 10
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Dependencies: Flink Core and User Application
+
+There are two broad categories of dependencies and libraries in Flink, which are explained below.
+
+## Flink Core Dependencies
+
+Flink itself consists of a set of classes and dependencies that form the core of Flink's runtime
+and must be present when a Flink application is started. The classes and dependencies needed to run
+the system handle areas such as coordination, networking, checkpointing, failover, APIs,
+operations (such as windowing), resource management, etc.
+
+These core classes and dependencies are packaged in the `flink-dist` jar, are part of Flink's `lib`
+folder, and are part of the basic Flink container images. You can think of these dependencies as similar
+to Java's core library (e.g. `rt.jar`, `charsets.jar`), which contains classes like `String` and `List`.
+
+To keep the core dependencies as small as possible and to avoid dependency clashes, the
+Flink Core Dependencies do not contain any connectors or libraries (e.g. CEP, SQL, ML), which would
+otherwise add an excessive number of classes and dependencies to the default classpath.
+
+## User Application Dependencies
+
+These dependencies include all connectors, formats, or libraries that a specific user application
+needs and explicitly do not include the Flink DataStream APIs and runtime dependencies since those
+are already part of the Flink Core Dependencies.
+
+The user application is typically packaged into an *application jar*, which contains the application
+code and the required connector and library dependencies.
+
+## IDE configuration
+
+The default JVM heap size for Java may be too small for Flink and you have to manually increase it.
+In Eclipse, choose `Run Configurations -> Arguments` and write `-Xmx800m` into the `VM Arguments` box.
+In IntelliJ IDEA, the recommended way to change JVM options is from the `Help | Edit Custom VM Options` menu.
+See [this article](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties) for details.
+
+**Note on IntelliJ:** To make the applications run within IntelliJ IDEA, it is necessary to tick the
+`Include dependencies with "Provided" scope` box in the run configuration. If this option is not available
+(possibly due to using an older IntelliJ IDEA version), then a workaround is to create a test that
+calls the application's `main()` method.
+
+# Scala Versions
+
+Different Scala versions are not binary compatible with one another. For that reason, Flink for 
+Scala 2.11 cannot be used with an application that uses Scala 2.12. All Flink dependencies that 
+(transitively) depend on Scala are suffixed with the Scala version that they are built for 
+(i.e. `flink-streaming-scala_2.12`).
+
+If you are only using Java, you can use any Scala version. If you are using Scala, you need to pick 

Review comment:
       I would say only use Flink's Java APIs and Flink's Scala APIs

##########
File path: docs/content/docs/dev/configuration/advanced.md
##########
@@ -0,0 +1,203 @@
+---
+title: "Advanced Configuration"
+weight: 10
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Dependencies: Flink Core and User Application
+
+There are two broad categories of dependencies and libraries in Flink, which are explained below.
+
+## Flink Core Dependencies
+
+Flink itself consists of a set of classes and dependencies that form the core of Flink's runtime
+and must be present when a Flink application is started. The classes and dependencies needed to run
+the system handle areas such as coordination, networking, checkpointing, failover, APIs,
+operations (such as windowing), resource management, etc.
+
+These core classes and dependencies are packaged in the `flink-dist` jar, are part of Flink's `lib`
+folder, and are part of the basic Flink container images. You can think of these dependencies as similar
+to Java's core library (e.g. `rt.jar`, `charsets.jar`), which contains classes like `String` and `List`.
+
+To keep the core dependencies as small as possible and to avoid dependency clashes, the
+Flink Core Dependencies do not contain any connectors or libraries (e.g. CEP, SQL, ML), which would
+otherwise add an excessive number of classes and dependencies to the default classpath.
+
+## User Application Dependencies
+
+These dependencies include all connectors, formats, or libraries that a specific user application
+needs and explicitly do not include the Flink DataStream APIs and runtime dependencies since those

Review comment:
       This probably needs to be both DataStream and Table API




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 988fcbec1b60588c4c0cb2249a94f258145f7040 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420) 
   * 80fd50ad46865be06e2c83b2470fb3eb2d35cd96 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * d0b2b188c37443b7bbda39af499398326cd56979 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207) 
   * 1bc80c1579db9e7f2d57112fdb31f0190f71db67 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] MartijnVisser commented on a change in pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
MartijnVisser commented on a change in pull request #18353:
URL: https://github.com/apache/flink/pull/18353#discussion_r790812045



##########
File path: docs/content/docs/dev/configuration/maven.md
##########
@@ -0,0 +1,106 @@
+---

Review comment:
       That's a fair point but I'm also not sure if we provide perfect support for both Gradle and SBT. I would lean towards integrating those in one page. 




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495",
       "triggerID" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516",
       "triggerID" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "triggerType" : "PUSH"
     }, {
       "hash" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528",
       "triggerID" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 8e5ae4098896a176d49d4b7319f69b6529412549 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516) 
   * 532af8b3b761f7b589500e5ead9b888f2370675a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528) 
   * 3de39d65f624fb6cf075e9e5ddfe005c197ee3dc UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495",
       "triggerID" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516",
       "triggerID" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "triggerType" : "PUSH"
     }, {
       "hash" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528",
       "triggerID" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30537",
       "triggerID" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 3de39d65f624fb6cf075e9e5ddfe005c197ee3dc Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30537) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312) 
   * 26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316) 
   * 6f44bc92aca6fc133b10cefd438f99810ce65293 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321) 
   * 1a637d30a2332b6c7059be2118a3cb8d40718eb9 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495",
       "triggerID" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516",
       "triggerID" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "triggerType" : "PUSH"
     }, {
       "hash" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528",
       "triggerID" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 50e221c6373f2774eeb8ec134df27caef83b2d40 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495) 
   * 8e5ae4098896a176d49d4b7319f69b6529412549 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516) 
   * 532af8b3b761f7b589500e5ead9b888f2370675a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528) 
   * 3de39d65f624fb6cf075e9e5ddfe005c197ee3dc UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] matriv commented on a change in pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
matriv commented on a change in pull request #18353:
URL: https://github.com/apache/flink/pull/18353#discussion_r797664576



##########
File path: docs/content/docs/dev/configuration/gradle.md
##########
@@ -0,0 +1,110 @@
+---
+title: "Using Gradle"
+weight: 3
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# How to use Gradle to configure your project

Review comment:
       I've opened 2 PRs to upgrade to 7.3.3: 
   https://github.com/apache/flink-web/pull/504
   https://github.com/apache/flink/pull/18609




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 1bc80c1579db9e7f2d57112fdb31f0190f71db67 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277) 
   * 274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312) 
   * 26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316) 
   * 6f44bc92aca6fc133b10cefd438f99810ce65293 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 1bc80c1579db9e7f2d57112fdb31f0190f71db67 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277) 
   * 274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312) 
   * 26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316) 
   * 6f44bc92aca6fc133b10cefd438f99810ce65293 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321) 
   * 1a637d30a2332b6c7059be2118a3cb8d40718eb9 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 1bc80c1579db9e7f2d57112fdb31f0190f71db67 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277) 
   * 274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 1bc80c1579db9e7f2d57112fdb31f0190f71db67 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277) 
   * 274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312) 
   * 26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 1bc80c1579db9e7f2d57112fdb31f0190f71db67 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277) 
   * 274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312) 
   * 26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316) 
   * 6f44bc92aca6fc133b10cefd438f99810ce65293 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321) 
   * 1a637d30a2332b6c7059be2118a3cb8d40718eb9 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 988fcbec1b60588c4c0cb2249a94f258145f7040 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 988fcbec1b60588c4c0cb2249a94f258145f7040 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420) 
   * 80fd50ad46865be06e2c83b2470fb3eb2d35cd96 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] slinkydeveloper commented on a change in pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
slinkydeveloper commented on a change in pull request #18353:
URL: https://github.com/apache/flink/pull/18353#discussion_r790867925



##########
File path: docs/content/docs/dev/configuration/maven.md
##########
@@ -0,0 +1,106 @@
+---

Review comment:
       @MartijnVisser you mean something like this? https://github.com/apache/flink/pull/18353#discussion_r789423489




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495",
       "triggerID" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 50e221c6373f2774eeb8ec134df27caef83b2d40 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495) 
   * 8e5ae4098896a176d49d4b7319f69b6529412549 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495",
       "triggerID" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516",
       "triggerID" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "triggerType" : "PUSH"
     }, {
       "hash" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528",
       "triggerID" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 50e221c6373f2774eeb8ec134df27caef83b2d40 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495) 
   * 8e5ae4098896a176d49d4b7319f69b6529412549 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516) 
   * 532af8b3b761f7b589500e5ead9b888f2370675a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495",
       "triggerID" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516",
       "triggerID" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "triggerType" : "PUSH"
     }, {
       "hash" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528",
       "triggerID" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 50e221c6373f2774eeb8ec134df27caef83b2d40 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495) 
   * 8e5ae4098896a176d49d4b7319f69b6529412549 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516) 
   * 532af8b3b761f7b589500e5ead9b888f2370675a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495",
       "triggerID" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516",
       "triggerID" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "triggerType" : "PUSH"
     }, {
       "hash" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528",
       "triggerID" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30537",
       "triggerID" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 8e5ae4098896a176d49d4b7319f69b6529412549 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516) 
   * 532af8b3b761f7b589500e5ead9b888f2370675a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528) 
   * 3de39d65f624fb6cf075e9e5ddfe005c197ee3dc Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30537) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] matriv commented on a change in pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
matriv commented on a change in pull request #18353:
URL: https://github.com/apache/flink/pull/18353#discussion_r798503652



##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -0,0 +1,219 @@
+---
+title: "Overview"
+weight: 1
+type: docs
+aliases:
+- /dev/project-configuration.html
+- /start/dependencies.html
+- /getting-started/project-setup/dependencies.html
+- /quickstart/java_api_quickstart.html
+- /dev/projectsetup/java_api_quickstart.html
+- /dev/linking_with_flink.html
+- /dev/linking.html
+- /dev/projectsetup/dependencies.html
+- /dev/projectsetup/java_api_quickstart.html
+- /getting-started/project-setup/java_api_quickstart.html
+- /dev/getting-started/project-setup/scala_api_quickstart.html
+- /getting-started/project-setup/scala_api_quickstart.html
+- /quickstart/scala_api_quickstart.html
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Project Configuration
+
+Every Flink application depends on a set of Flink libraries. At a minimum, the application depends
+on the Flink APIs and, in addition, on certain connector libraries (e.g. Kafka, Cassandra).
+When running Flink applications (either in a distributed deployment or locally for testing),
+the [Flink runtime library](https://ossindex.sonatype.org/component/pkg:maven/org.apache.flink/flink-runtime@1.14.3) 
+must be available.
+
+The guides in this section will show you how to configure your projects via popular build tools
+([Maven]({{< ref "docs/dev/configuration/maven" >}}), [Gradle]({{< ref "docs/dev/configuration/gradle" >}})),
+add the necessary dependencies (e.g. [connectors and formats]({{< ref "docs/dev/configuration/connector" >}}), 
+[testing]({{< ref "docs/dev/configuration/testing" >}})), and cover some 
+[advanced]({{< ref "docs/dev/configuration/advanced" >}}) configuration topics. 
+
+## Getting started
+
+To get started working on your Flink application, use the following commands, scripts, and templates 
+to create a Flink project.  
+
+{{< tabs "creating project" >}}
+{{< tab "Maven" >}}
+
+You can create a project based on an [Archetype](https://maven.apache.org/guides/introduction/introduction-to-archetypes.html)
+with the Maven command below or use the provided quickstart bash script.
+
+### Maven command
+```bash
+$ mvn archetype:generate                \
+  -DarchetypeGroupId=org.apache.flink   \
+  -DarchetypeArtifactId=flink-quickstart-java \
+  -DarchetypeVersion={{< version >}}
+```
+This allows you to name your newly created project and will interactively ask you for the groupId,
+artifactId, and package name.
+
+### Quickstart script
+```bash
+$ curl https://flink.apache.org/q/quickstart.sh | bash -s {{< version >}}
+```
+
+{{< /tab >}}
+{{< tab "Gradle" >}}
+You can create a project with a Gradle build script or use the provided quickstart bash script.
+
+### Gradle build script
+
+To execute these build configuration scripts, run the `gradle` command in the directory with these scripts.
+
+**build.gradle**
+
+```gradle
+plugins {
+    id 'java'
+    id 'application'
+    // shadow plugin to produce fat JARs
+    id 'com.github.johnrengelman.shadow' version '7.1.2'
+}
+// artifact properties
+group = 'org.myorg.quickstart'
+version = '0.1-SNAPSHOT'
+mainClassName = 'org.myorg.quickstart.StreamingJob'
+description = """Flink Quickstart Job"""
+ext {
+    javaVersion = '1.8'
+    flinkVersion = '{{< version >}}'
+    scalaBinaryVersion = '{{< scala_version >}}'
+    slf4jVersion = '1.7.32'
+    log4jVersion = '2.17.1'
+}
+sourceCompatibility = javaVersion
+targetCompatibility = javaVersion
+tasks.withType(JavaCompile) {
+	options.encoding = 'UTF-8'
+}
+applicationDefaultJvmArgs = ["-Dlog4j.configurationFile=log4j2.properties"]
+
+// declare where to find the dependencies of your project
+repositories {
+    mavenCentral()
+    maven { url "https://repository.apache.org/content/repositories/snapshots/" }

Review comment:
       Ok, because in the docs/examples we are using `<<flinkVersion>>`, which for master is `1.15-SNAPSHOT`
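
For the Maven side of the same point: a `pom.xml` that builds against a SNAPSHOT Flink version (such as `1.15-SNAPSHOT` on master) also needs the Apache snapshots repository declared. The snippet below is only an illustrative sketch that reuses the repository URL from the Gradle script quoted above; it is not part of the proposed docs.

```xml
<!-- Illustrative sketch only: resolve SNAPSHOT Flink artifacts from the Apache snapshots repository -->
<repositories>
    <repository>
        <id>apache-snapshots</id>
        <url>https://repository.apache.org/content/repositories/snapshots/</url>
        <releases>
            <enabled>false</enabled>
        </releases>
        <snapshots>
            <enabled>true</enabled>
        </snapshots>
    </repository>
</repositories>
```

Released Flink versions are available on Maven Central, so a repository entry like this is only needed when building the examples against a snapshot.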




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] infoverload commented on a change in pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
infoverload commented on a change in pull request #18353:
URL: https://github.com/apache/flink/pull/18353#discussion_r798691818



##########
File path: docs/content/docs/dev/configuration/maven.md
##########
@@ -0,0 +1,175 @@
+---
+title: "Using Maven"
+weight: 2
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# How to use Maven to configure your project
+
+You will likely need a build tool to configure your Flink project. This guide will show you how to 
+do so with [Maven](https://maven.apache.org), an open-source build automation tool developed by the 
+Apache Software Foundation that enables you to build, publish, and deploy projects. You can use it to manage the 
+entire lifecycle of your software project.
+
+## Requirements
+
+- Maven 3.0.4 (or higher)
+- Java 8.x
+
+## Importing the project into your IDE
+
+Once the project folder and files have been created, we recommend that you import this project into
+your IDE for developing and testing.
+
+IntelliJ IDEA supports Maven projects out-of-the-box. Eclipse offers the [m2e plugin](http://www.eclipse.org/m2e/) 
+to [import Maven projects](http://books.sonatype.com/m2eclipse-book/reference/creating-sect-importing-projects.html#fig-creating-import).
+
+**Note**: The default JVM heap size for Java may be too small for Flink, and you may have to increase it manually.
+In Eclipse, choose `Run Configurations -> Arguments` and write into the `VM Arguments` box: `-Xmx800m`.
+In IntelliJ IDEA, the recommended way to change JVM options is via the `Help | Edit Custom VM Options` menu.
+See [this article](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties) for details.
+
+**Note on IntelliJ:** To make the applications run within IntelliJ IDEA, it is necessary to tick the
+`Include dependencies with "Provided" scope` box in the run configuration. If this option is not available
+(possibly due to using an older IntelliJ IDEA version), then a workaround is to create a test that
+calls the application's `main()` method.
+
+## Building the project
+
+If you want to build/package your project, navigate to your project directory and run the
+`mvn clean package` command. You will find a JAR file that contains your application (plus connectors
+and libraries that you may have added as dependencies to the application) here: `target/<artifact-id>-<version>.jar`.
+
+__Note:__ If you used a different class than `DataStreamJob` as the application's main class / entry point,
+we recommend you change the `mainClass` setting in the `pom.xml` file accordingly so that Flink
+can run the application from the JAR file without additionally specifying the main class.
+
+## Adding dependencies to the project
+
+Open the `pom.xml` file in your project directory and add the dependency inside
+the `<dependencies>` tag.  
+
+For example, you can add the Kafka connector as a dependency like this:
+
+```xml
+<dependencies>
+    
+    <dependency>
+        <groupId>org.apache.flink</groupId>
+        <artifactId>flink-connector-kafka</artifactId>
+        <version>{{< version >}}</version>
+    </dependency>
+    
+</dependencies>
+```
+
+Then execute `mvn install` on the command line. 
+
+Projects created from the `Java Project Template`, the `Scala Project Template`, or Gradle are configured
+to automatically include the application dependencies into the application JAR when you run `mvn clean package`.
+For projects that are not set up from those templates, we recommend adding the Maven Shade Plugin to
+build the application jar with all required dependencies.
+
+**Important:** Note that all these core API dependencies should have their scope set to [*provided*](https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html#dependency-scope). This means that
+they are needed to compile against, but that they should not be packaged into the project's resulting
+application JAR file. If not set to *provided*, the best case scenario is that the resulting JAR
+becomes excessively large, because it also contains all Flink core dependencies. The worst case scenario
+is that the Flink core dependencies that are added to the application's JAR file clash with some of
+your own dependency versions (which is normally avoided through inverted classloading).
+
+To correctly package the dependencies into the application JAR, the Flink API dependencies must 
+be set to the *compile* scope.
+
+## Packaging the application
+
+Depending on your use case, you may need to package your Flink application in different ways before it
+gets deployed to a Flink environment. 
+
+If you want to create a JAR for a Flink Job and use only Flink dependencies without any third-party 
+dependencies (e.g. using the filesystem connector with the JSON format), you do not need to create an 
+uber/fat JAR or shade any dependencies.
+
+If you want to create a JAR for a Flink Job and use external dependencies not built into the Flink 
+distribution, you can either add them to the classpath of the distribution or shade them into your 
+uber/fat application JAR.
+
+With the generated uber/fat JAR, you can submit it to a local or remote cluster with:
+
+```sh
+bin/flink run -c org.example.MyJob myFatJar.jar
+```
+
+To learn more about how to deploy Flink jobs, check out the [deployment guide]({{< ref "docs/deployment/cli" >}}).
+
+## Template for creating an uber/fat JAR with dependencies
+
+To build an application JAR that contains all dependencies required for declared connectors and libraries,
+you can use the following shade plugin definition:
+
+```xml

Review comment:
       After some discussion, I think we can improve it after the PR. 
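
For reference, a shade plugin definition of the kind referenced above generally has the shape sketched below. This is only an illustration of a typical `maven-shade-plugin` setup, not the exact template proposed in the PR; the main class `org.example.MyJob` is a hypothetical placeholder and the plugin version is just one known release.

```xml
<!-- Sketch of a typical maven-shade-plugin setup for building an uber/fat JAR.
     Dependencies with "provided" scope (the Flink core APIs) are not bundled by Maven. -->
<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <!-- any recent plugin release works; 3.2.4 is one known version -->
            <version>3.2.4</version>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                    <configuration>
                        <transformers>
                            <!-- sets Main-Class in the JAR manifest; replace with your own entry point -->
                            <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                                <mainClass>org.example.MyJob</mainClass>
                            </transformer>
                        </transformers>
                    </configuration>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
```

With a definition like this in place, `mvn clean package` produces a shaded JAR in `target/`, which can then be submitted with `bin/flink run` as described in the quoted section.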




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 1bc80c1579db9e7f2d57112fdb31f0190f71db67 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277) 
   * 274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312) 
   * 26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316) 
   * 6f44bc92aca6fc133b10cefd438f99810ce65293 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 2e955fd5db3754a069a1ed8c48ce2e581aaef51b Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495",
       "triggerID" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516",
       "triggerID" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "triggerType" : "PUSH"
     }, {
       "hash" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528",
       "triggerID" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30537",
       "triggerID" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1845471184d68e8edd89fd19a591030290695cf3",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "1845471184d68e8edd89fd19a591030290695cf3",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 3de39d65f624fb6cf075e9e5ddfe005c197ee3dc Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30537) 
   * 1845471184d68e8edd89fd19a591030290695cf3 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] slinkydeveloper commented on a change in pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
slinkydeveloper commented on a change in pull request #18353:
URL: https://github.com/apache/flink/pull/18353#discussion_r784668771



##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -0,0 +1,97 @@
+---
+title: "Overview"
+weight: 1
+type: docs
+aliases:
+- /dev/project-configuration.html
+- /start/dependencies.html
+- /getting-started/project-setup/dependencies.html
+- /quickstart/java_api_quickstart.html
+- /dev/projectsetup/java_api_quickstart.html
+- /dev/linking_with_flink.html
+- /dev/linking.html
+- /dev/projectsetup/dependencies.html
+- /dev/projectsetup/java_api_quickstart.html
+- /getting-started/project-setup/java_api_quickstart.html
+- /dev/getting-started/project-setup/scala_api_quickstart.html
+- /getting-started/project-setup/scala_api_quickstart.html
+- /quickstart/scala_api_quickstart.html
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Project Configuration
+
+Every Flink application depends on a set of Flink libraries. At a minimum, the application depends
+on the Flink APIs and, in addition, on certain connector libraries (i.e. Kafka, Cassandra).
+When running Flink applications (either in a distributed deployment or in the IDE for testing),
+the Flink runtime library must be available.
+
+## Setting up a Flink project: Getting started
+
+Every Flink application needs, at a minimum, the API dependencies to develop against. When setting up
+a project manually, you need to add the following dependencies for the Java/Scala API.
+
+In Maven syntax, it would look like:
+
+{{< tabs "a49d57a4-27ee-4dd3-a2b8-a673b99b011e" >}}
+{{< tab "Java" >}}
+```xml
+<dependency>
+  <groupId>org.apache.flink</groupId>
+  <artifactId>flink-streaming-java</artifactId>
+  <version>{{< version >}}</version>
+  <scope>provided</scope>
+</dependency>
+```
+{{< /tab >}}
+{{< tab "Scala" >}}
+```xml
+<dependency>
+  <groupId>org.apache.flink</groupId>
+  <artifactId>flink-streaming-scala{{< scala_version >}}</artifactId>
+  <version>{{< version >}}</version>
+  <scope>provided</scope>
+</dependency>
+```
+{{< /tab >}}
+{{< /tabs >}}
+
+**Important:** Note that all these dependencies have their scope set to *provided*. This means that
+they are needed to compile against, but that they should not be packaged into the project's resulting
+application JAR file. If not set to *provided*, the best case scenario is that the resulting JAR
+becomes excessively large, because it also contains all Flink core dependencies. The worst case scenario
+is that the Flink core dependencies that are added to the application's JAR file clash with some of
+your own dependency versions (which is normally avoided through inverted classloading).
+
+**Note on IntelliJ:** To make the applications run within IntelliJ IDEA, it is necessary to tick the
+`Include dependencies with "Provided" scope` box in the run configuration. If this option is not available
+(possibly due to using an older IntelliJ IDEA version), then a workaround is to create a test that
+calls the application's `main()` method.
+
+## Which dependencies do you need?
+
+| APIs you want to use              | Dependency you need to add    |
+|-----------------------------------|-------------------------------|
+| DataStream                        | flink-streaming-java          |  
+| DataStream with Scala             | flink-streaming-scala         |   

Review comment:
       Every time you refer to a Scala artifact you need the Scala binary version, as done above 
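
As an illustration of that point, a Scala-dependent artifact is always referenced with its Scala binary version suffix, so the table entry would need to read `flink-streaming-scala{{< scala_version >}}`. Assuming Scala 2.12 and the same version shortcode used in the quoted docs, such a Maven dependency would look roughly like this:

```xml
<!-- Example only: Scala-dependent artifacts carry the Scala binary version suffix (here 2.12) -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-streaming-scala_2.12</artifactId>
    <version>{{< version >}}</version>
    <scope>provided</scope>
</dependency>
```

The table rows for `flink-streaming-scala` and other Scala-dependent modules would then carry the same suffix.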

##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -0,0 +1,97 @@
+---
+title: "Overview"
+weight: 1
+type: docs
+aliases:
+- /dev/project-configuration.html
+- /start/dependencies.html
+- /getting-started/project-setup/dependencies.html
+- /quickstart/java_api_quickstart.html
+- /dev/projectsetup/java_api_quickstart.html
+- /dev/linking_with_flink.html
+- /dev/linking.html
+- /dev/projectsetup/dependencies.html
+- /dev/projectsetup/java_api_quickstart.html
+- /getting-started/project-setup/java_api_quickstart.html
+- /dev/getting-started/project-setup/scala_api_quickstart.html
+- /getting-started/project-setup/scala_api_quickstart.html
+- /quickstart/scala_api_quickstart.html
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Project Configuration
+
+Every Flink application depends on a set of Flink libraries. At a minimum, the application depends
+on the Flink APIs and, in addition, on certain connector libraries (e.g. Kafka, Cassandra).
+When running Flink applications (either in a distributed deployment or in the IDE for testing),
+the Flink runtime library must be available.
+
+## Setting up a Flink project: Getting started

Review comment:
       Just Getting started?




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316) 
   * 6f44bc92aca6fc133b10cefd438f99810ce65293 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321) 
   * 1a637d30a2332b6c7059be2118a3cb8d40718eb9 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 2e955fd5db3754a069a1ed8c48ce2e581aaef51b Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417) 
   * 50e221c6373f2774eeb8ec134df27caef83b2d40 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495",
       "triggerID" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516",
       "triggerID" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "triggerType" : "PUSH"
     }, {
       "hash" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528",
       "triggerID" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 8e5ae4098896a176d49d4b7319f69b6529412549 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516) 
   * 532af8b3b761f7b589500e5ead9b888f2370675a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528) 
   * 3de39d65f624fb6cf075e9e5ddfe005c197ee3dc UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 1a637d30a2332b6c7059be2118a3cb8d40718eb9 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334) 
   * 2e955fd5db3754a069a1ed8c48ce2e581aaef51b UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495",
       "triggerID" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516",
       "triggerID" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "triggerType" : "PUSH"
     }, {
       "hash" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528",
       "triggerID" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 50e221c6373f2774eeb8ec134df27caef83b2d40 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495) 
   * 8e5ae4098896a176d49d4b7319f69b6529412549 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516) 
   * 532af8b3b761f7b589500e5ead9b888f2370675a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 1bc80c1579db9e7f2d57112fdb31f0190f71db67 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277) 
   * 274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312) 
   * 26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316) 
   * 6f44bc92aca6fc133b10cefd438f99810ce65293 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321) 
   * 1a637d30a2332b6c7059be2118a3cb8d40718eb9 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 1bc80c1579db9e7f2d57112fdb31f0190f71db67 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277) 
   * 274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312) 
   * 26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 1a637d30a2332b6c7059be2118a3cb8d40718eb9 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] matriv commented on a change in pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
matriv commented on a change in pull request #18353:
URL: https://github.com/apache/flink/pull/18353#discussion_r798344325



##########
File path: docs/content/docs/dev/configuration/gradle.md
##########
@@ -0,0 +1,120 @@
+---
+title: "Using Gradle"
+weight: 3
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# How to use Gradle to configure your project
+
+You will likely need a build tool to configure your Flink project. This guide will show you how to
+do so with [Gradle](https://gradle.org), an open-source general-purpose build tool that can be used 
+to automate tasks in the development process.
+
+## Requirements
+
+- Gradle 7.x 
+- Java 8.x
+
+## Importing the project into your IDE
+
+Once the project folder and files have been created, we recommend that you import this project into
+your IDE for developing and testing.
+
+IntelliJ IDEA supports Gradle projects via the `Gradle` plugin.
+
+Eclipse does so via the [Eclipse Buildship](https://projects.eclipse.org/projects/tools.buildship)
+plugin (make sure to specify a Gradle version >= 3.0 in the last step of the import wizard; the `shadow`
+plugin requires it). You may also use [Gradle's IDE integration](https://docs.gradle.org/current/userguide/userguide.html#ide-integration)
+to create project files with Gradle.
+
+**Note**: The default JVM heap size for Java may be too small for Flink, and you may have to increase it manually.
+In Eclipse, choose `Run Configurations -> Arguments` and enter `-Xmx800m` in the `VM Arguments` box.
+In IntelliJ IDEA, the recommended way to change the JVM options is via the `Help | Edit Custom VM Options` menu.
+See [this article](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties) for details.
+
+**Note on IntelliJ:** To make the applications run within IntelliJ IDEA, it is necessary to tick the
+`Include dependencies with "Provided" scope` box in the run configuration. If this option is not available
+(possibly due to using an older IntelliJ IDEA version), then a workaround is to create a test that
+calls the application's `main()` method.
+
+## Building the project
+
+If you want to __build/package your project__, go to your project directory and
+run the `gradle clean shadowJar` command.
+You will __find a JAR file__ that contains your application, plus any connectors and libraries
+that you may have added as dependencies, at `build/libs/<project-name>-<version>-all.jar`.
+
+__Note:__ If you use a different class than *StreamingJob* as the application's main class / entry point,
+we recommend you change the `mainClassName` setting in the `build.gradle` file accordingly. That way, Flink
+can run the application from the JAR file without additionally specifying the main class.
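+
+For reference, a minimal sketch of the relevant `build.gradle` fragment (the class name below is a
+placeholder for your own entry point):
+
+```gradle
+// build.gradle (fragment): entry point used when the packaged JAR is executed
+mainClassName = 'org.example.StreamingJob'
+```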
+
+## Adding dependencies to the project
+
+Specify a dependency configuration in the dependencies block of your `build.gradle` file.
+
+For example, you can add the Kafka connector as a dependency like this:
+
+**build.gradle**
+
+```gradle
+...
+dependencies {
+    ...  
+    flinkShadowJar "org.apache.flink:flink-connector-kafka:${flinkVersion}"
+    ...
+}
+...
+```
+
+**Important:** Note that all these (core) dependencies should have their scope set to [*provided*](https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html#dependency-scope). This means that
+they are needed to compile against, but that they should not be packaged into the project's resulting
+application JAR file. If not set to *provided*, the best case scenario is that the resulting JAR
+becomes excessively large, because it also contains all Flink core dependencies. The worst case scenario
+is that the Flink core dependencies that are added to the application's JAR file clash with some of
+your own dependency versions (which is normally avoided through inverted classloading).
+
+To correctly package the dependencies into the application JAR, these application dependencies must
+be set to the *compile* scope.
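+
+As a sketch of that distinction (assuming the quickstart template, where only the `flinkShadowJar`
+configuration is bundled into the shadow JAR):
+
+```gradle
+dependencies {
+    // Core Flink API: needed to compile against, but not packaged into the application JAR
+    implementation "org.apache.flink:flink-streaming-java:${flinkVersion}"
+    // Application dependency (connector): packaged into the shadow JAR
+    flinkShadowJar "org.apache.flink:flink-connector-kafka:${flinkVersion}"
+}
+```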
+
+## Packaging the application
+
+Depending on your use case, you may need to package your Flink application in different ways before 
+it gets deployed to a Flink environment.
+
+If you want to create a JAR for a Flink Job and use only Flink dependencies without any third-party
+dependencies (e.g. using the filesystem connector with JSON format), you do not need to create an
+uber/fat JAR or shade any dependencies.
+
+If you want to create a JAR for a Flink Job and use external dependencies not built into the Flink

Review comment:
       And here the installShadowDist, so the next paragraph can be removed, thx!




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495",
       "triggerID" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516",
       "triggerID" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "triggerType" : "PUSH"
     }, {
       "hash" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528",
       "triggerID" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30537",
       "triggerID" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1845471184d68e8edd89fd19a591030290695cf3",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30593",
       "triggerID" : "1845471184d68e8edd89fd19a591030290695cf3",
       "triggerType" : "PUSH"
     }, {
       "hash" : "e33ad31f47e113b480de1c4f6ac6efc40992084b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30651",
       "triggerID" : "e33ad31f47e113b480de1c4f6ac6efc40992084b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "78c5075dd300c8e74705afbb13b10377da61865a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30675",
       "triggerID" : "78c5075dd300c8e74705afbb13b10377da61865a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ddabc7bb39a84b79407d5c9b85de9c83d0959de2",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30708",
       "triggerID" : "ddabc7bb39a84b79407d5c9b85de9c83d0959de2",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * ddabc7bb39a84b79407d5c9b85de9c83d0959de2 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30708) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 6f44bc92aca6fc133b10cefd438f99810ce65293 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321) 
   * 1a637d30a2332b6c7059be2118a3cb8d40718eb9 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495",
       "triggerID" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516",
       "triggerID" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "triggerType" : "PUSH"
     }, {
       "hash" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528",
       "triggerID" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30537",
       "triggerID" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1845471184d68e8edd89fd19a591030290695cf3",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30593",
       "triggerID" : "1845471184d68e8edd89fd19a591030290695cf3",
       "triggerType" : "PUSH"
     }, {
       "hash" : "e33ad31f47e113b480de1c4f6ac6efc40992084b",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "e33ad31f47e113b480de1c4f6ac6efc40992084b",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 1845471184d68e8edd89fd19a591030290695cf3 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30593) 
   * e33ad31f47e113b480de1c4f6ac6efc40992084b UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495",
       "triggerID" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516",
       "triggerID" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "triggerType" : "PUSH"
     }, {
       "hash" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528",
       "triggerID" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30537",
       "triggerID" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1845471184d68e8edd89fd19a591030290695cf3",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30593",
       "triggerID" : "1845471184d68e8edd89fd19a591030290695cf3",
       "triggerType" : "PUSH"
     }, {
       "hash" : "e33ad31f47e113b480de1c4f6ac6efc40992084b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30651",
       "triggerID" : "e33ad31f47e113b480de1c4f6ac6efc40992084b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "78c5075dd300c8e74705afbb13b10377da61865a",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30675",
       "triggerID" : "78c5075dd300c8e74705afbb13b10377da61865a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ddabc7bb39a84b79407d5c9b85de9c83d0959de2",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "ddabc7bb39a84b79407d5c9b85de9c83d0959de2",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 78c5075dd300c8e74705afbb13b10377da61865a Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30675) 
   * ddabc7bb39a84b79407d5c9b85de9c83d0959de2 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495",
       "triggerID" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516",
       "triggerID" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "triggerType" : "PUSH"
     }, {
       "hash" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528",
       "triggerID" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30537",
       "triggerID" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1845471184d68e8edd89fd19a591030290695cf3",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30593",
       "triggerID" : "1845471184d68e8edd89fd19a591030290695cf3",
       "triggerType" : "PUSH"
     }, {
       "hash" : "e33ad31f47e113b480de1c4f6ac6efc40992084b",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30651",
       "triggerID" : "e33ad31f47e113b480de1c4f6ac6efc40992084b",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * e33ad31f47e113b480de1c4f6ac6efc40992084b Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30651) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] MartijnVisser commented on a change in pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
MartijnVisser commented on a change in pull request #18353:
URL: https://github.com/apache/flink/pull/18353#discussion_r790740403



##########
File path: docs/content/docs/connectors/datastream/cassandra.md
##########
@@ -37,7 +37,7 @@ To use this connector, add the following dependency to your project:
 
 {{< artifact flink-connector-cassandra withScalaVersion >}}
 
-Note that the streaming connectors are currently __NOT__ part of the binary distribution. See how to link with them for cluster execution [here]({{< ref "docs/dev/datastream/project-configuration" >}}).
+Note that the streaming connectors are currently __NOT__ part of the binary distribution. See how to link with them for cluster execution [here]({{< ref "docs/dev/configuration" >}}).

Review comment:
       NB: All these links currently send me to http://localhost:1313/flink/flink-docs-master/docs/dev/configuration/. That page is blank; I think it should be http://localhost:1313/flink/flink-docs-master/docs/dev/configuration/overview/ instead. So:
   
   ```suggestion
   Note that the streaming connectors are currently __NOT__ part of the binary distribution. See how to link with them for cluster execution [here]({{< ref "docs/dev/configuration/overview" >}}).
   ```
   

##########
File path: docs/content/docs/dev/configuration/maven.md
##########
@@ -0,0 +1,106 @@
+---

Review comment:
       I'm leaning towards moving the content for Maven, Gradle, and SBT all to sections under the Overview. The reason is that if something changes in the project configuration, you can change it in one go and keep everything in sync. Probably with the tabs option to show the different setups (a tab for a Maven example, a tab for Gradle, etc.). Something like that?
   
   What made you think about splitting them to separate pages? 
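
   For illustration, a minimal sketch of what such a tabbed setup could look like, using the Hugo tab shortcodes that these docs already use elsewhere; the tab id and wording are placeholders, not the final content:

   ```markdown
   {{< tabs "adding-connector-dependencies" >}}
   {{< tab "Maven" >}}
   Add the connector to the `dependencies` section of your `pom.xml`.
   {{< /tab >}}
   {{< tab "Gradle" >}}
   Add the connector to the `dependencies` block of your `build.gradle`.
   {{< /tab >}}
   {{< /tabs >}}
   ```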







[GitHub] [flink] infoverload commented on a change in pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
infoverload commented on a change in pull request #18353:
URL: https://github.com/apache/flink/pull/18353#discussion_r790802957



##########
File path: docs/content/docs/dev/configuration/maven.md
##########
@@ -0,0 +1,106 @@
+---

Review comment:
       In the first iteration, I thought that it made the page too long and that too much scrolling makes for bad UX, but I can try the tabs idea. 







[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495",
       "triggerID" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516",
       "triggerID" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "triggerType" : "PUSH"
     }, {
       "hash" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528",
       "triggerID" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 50e221c6373f2774eeb8ec134df27caef83b2d40 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495) 
   * 8e5ae4098896a176d49d4b7319f69b6529412549 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516) 
   * 532af8b3b761f7b589500e5ead9b888f2370675a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528) 
   * 3de39d65f624fb6cf075e9e5ddfe005c197ee3dc UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495",
       "triggerID" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516",
       "triggerID" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "triggerType" : "PUSH"
     }, {
       "hash" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528",
       "triggerID" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30537",
       "triggerID" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1845471184d68e8edd89fd19a591030290695cf3",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30593",
       "triggerID" : "1845471184d68e8edd89fd19a591030290695cf3",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 3de39d65f624fb6cf075e9e5ddfe005c197ee3dc Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30537) 
   * 1845471184d68e8edd89fd19a591030290695cf3 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30593) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495",
       "triggerID" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516",
       "triggerID" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "triggerType" : "PUSH"
     }, {
       "hash" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528",
       "triggerID" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 50e221c6373f2774eeb8ec134df27caef83b2d40 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495) 
   * 8e5ae4098896a176d49d4b7319f69b6529412549 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516) 
   * 532af8b3b761f7b589500e5ead9b888f2370675a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495",
       "triggerID" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 2e955fd5db3754a069a1ed8c48ce2e581aaef51b Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417) 
   * 50e221c6373f2774eeb8ec134df27caef83b2d40 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 988fcbec1b60588c4c0cb2249a94f258145f7040 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot commented on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot commented on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 988fcbec1b60588c4c0cb2249a94f258145f7040 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] infoverload commented on a change in pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
infoverload commented on a change in pull request #18353:
URL: https://github.com/apache/flink/pull/18353#discussion_r799252625



##########
File path: docs/content/docs/dev/configuration/overview.md
##########
@@ -181,22 +171,6 @@ rootProject.name = 'quickstart'
 bash -c "$(curl https://flink.apache.org/q/gradle-quickstart.sh)" -- {{< version >}} {{< scala_version >}}
 ```
 {{< /tab >}}
-{{< tab "sbt" >}}
-You can scaffold a new Flink project with the following [giter8 template](https://github.com/tillrohrmann/flink-project.g8)
-and the `sbt new` command (which creates new build definitions from a template) or use the provided quickstart bash script.
-
-### sbt template
-
-```bash
-$ sbt new tillrohrmann/flink-project.g8

Review comment:
       I had already removed the standalone page on SBT and had forgotten to remove this section.  







[GitHub] [flink] matriv commented on a change in pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
matriv commented on a change in pull request #18353:
URL: https://github.com/apache/flink/pull/18353#discussion_r798344074



##########
File path: docs/content/docs/dev/configuration/gradle.md
##########
@@ -0,0 +1,120 @@
+---
+title: "Using Gradle"
+weight: 3
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# How to use Gradle to configure your project
+
+You will likely need a build tool to configure your Flink project. This guide will show you how to
+do so with [Gradle](https://gradle.org), an open-source general-purpose build tool that can be used 
+to automate tasks in the development process.
+
+## Requirements
+
+- Gradle 7.x 
+- Java 8.x
+
+## Importing the project into your IDE
+
+Once the project folder and files have been created, we recommend that you import this project into
+your IDE for developing and testing.
+
+IntelliJ IDEA supports Gradle projects via the `Gradle` plugin.
+
+Eclipse does so via the [Eclipse Buildship](https://projects.eclipse.org/projects/tools.buildship)
+plugin (make sure to specify a Gradle version >= 3.0 in the last step of the import wizard; the `shadow`
+plugin requires it). You may also use [Gradle's IDE integration](https://docs.gradle.org/current/userguide/userguide.html#ide-integration)
+to create project files with Gradle.
+
+**Note**: The default JVM heap size for Java may be too small for Flink and you have to manually increase it.
+In Eclipse, choose `Run Configurations -> Arguments` and write into the `VM Arguments` box: `-Xmx800m`.
+In IntelliJ IDEA, the recommended way to change JVM options is from the `Help | Edit Custom VM Options` menu.
+See [this article](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties) for details.
+
+**Note on IntelliJ:** To make the applications run within IntelliJ IDEA, it is necessary to tick the
+`Include dependencies with "Provided" scope` box in the run configuration. If this option is not available
+(possibly due to using an older IntelliJ IDEA version), then a workaround is to create a test that
+calls the application's `main()` method.
+
+## Building the project
+
+If you want to __build/package your project__, go to your project directory and
+run the '`gradle clean shadowJar`' command.
+You will __find a JAR file__ that contains your application, plus connectors and libraries
+that you may have added as dependencies to the application: `build/libs/<project-name>-<version>-all.jar`.
+
+__Note:__ If you use a different class than *StreamingJob* as the application's main class / entry point,
+we recommend you change the `mainClassName` setting in the `build.gradle` file accordingly. That way, Flink
+can run the application from the JAR file without additionally specifying the main class.
+
+## Adding dependencies to the project
+
+Specify a dependency configuration in the dependencies block of your `build.gradle` file.
+
+For example, you can add the Kafka connector as a dependency like this:
+
+**build.gradle**
+
+```gradle
+...
+dependencies {
+    ...  
+    flinkShadowJar "org.apache.flink:flink-connector-kafka:${flinkVersion}"
+    ...
+}
+...
+```
+
+**Important:** Note that all these (core) dependencies should have their scope set to [*provided*](https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html#dependency-scope). This means that
+they are needed to compile against, but that they should not be packaged into the project's resulting
+application JAR file. If not set to *provided*, the best case scenario is that the resulting JAR
+becomes excessively large, because it also contains all Flink core dependencies. The worst case scenario
+is that the Flink core dependencies that are added to the application's JAR file clash with some of
+your own dependency versions (which is normally avoided through inverted classloading).
+
+To correctly package the dependencies into the application JAR, these application dependencies must
+be set to the *compile* scope.
+
+## Packaging the application
+
+Depending on your use case, you may need to package your Flink application in different ways before 
+it gets deployed to a Flink environment.
+
+If you want to create a JAR for a Flink Job and use only Flink dependencies without any third-party
+dependencies (i.e. using the filesystem connector with JSON format), you do not need to create an
+uber/fat JAR or shade any dependencies.

Review comment:
       @infoverload Please mention here the `installDist` task, both with and without the Gradle wrapper.
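
   For context, a hedged sketch of the commands presumably meant here. `installDist` is the standard task contributed by Gradle's application/distribution plugin support; with the shadow plugin, the analogous task is typically `installShadowDist`. The exact task used in the Flink Gradle quickstart may differ:

   ```sh
   # With the Gradle wrapper checked into the project
   ./gradlew clean installDist

   # Or with a locally installed Gradle
   gradle clean installDist
   ```

   Both variants install a runnable layout of the application (start scripts plus dependency JARs) under `build/install/` by default.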







[GitHub] [flink] slinkydeveloper commented on a change in pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
slinkydeveloper commented on a change in pull request #18353:
URL: https://github.com/apache/flink/pull/18353#discussion_r798580117



##########
File path: docs/content/docs/dev/configuration/maven.md
##########
@@ -0,0 +1,175 @@
+---
+title: "Using Maven"
+weight: 2
+type: docs
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# How to use Maven to configure your project
+
+You will likely need a build tool to configure your Flink project. This guide will show you how to 
+do so with [Maven](https://maven.apache.org), an open-source build automation tool developed by the 
+Apache Group that enables you to build, publish, and deploy projects. You can use it to manage the 
+entire lifecycle of your software project.
+
+## Requirements
+
+- Maven 3.0.4 (or higher)
+- Java 8.x
+
+## Importing the project into your IDE
+
+Once the project folder and files have been created, we recommend that you import this project into
+your IDE for developing and testing.
+
+IntelliJ IDEA supports Maven projects out-of-the-box. Eclipse offers the [m2e plugin](http://www.eclipse.org/m2e/) 
+to [import Maven projects](http://books.sonatype.com/m2eclipse-book/reference/creating-sect-importing-projects.html#fig-creating-import).
+
+**Note**: The default JVM heap size for Java may be too small for Flink and you have to manually increase it.
+In Eclipse, choose `Run Configurations -> Arguments` and write into the `VM Arguments` box: `-Xmx800m`.
+In IntelliJ IDEA, the recommended way to change JVM options is from the `Help | Edit Custom VM Options` menu.
+See [this article](https://intellij-support.jetbrains.com/hc/en-us/articles/206544869-Configuring-JVM-options-and-platform-properties) for details.
+
+**Note on IntelliJ:** To make the applications run within IntelliJ IDEA, it is necessary to tick the
+`Include dependencies with "Provided" scope` box in the run configuration. If this option is not available
+(possibly due to using an older IntelliJ IDEA version), then a workaround is to create a test that
+calls the application's `main()` method.
+
+## Building the project
+
+If you want to build/package your project, navigate to your project directory and run the
+'`mvn clean package`' command. You will find a JAR file that contains your application (plus connectors
+and libraries that you may have added as dependencies to the application) here:`target/<artifact-id>-<version>.jar`.
+
+__Note:__ If you used a different class than `DataStreamJob` as the application's main class / entry point,
+we recommend you change the `mainClass` setting in the `pom.xml` file accordingly so that Flink
+can run the application from the JAR file without additionally specifying the main class.
+
+## Adding dependencies to the project
+
+Open the `pom.xml` file in your project directory and add the dependency between
+the `dependencies` tags.
+
+For example, you can add the Kafka connector as a dependency like this:
+
+```xml
+<dependencies>
+    
+    <dependency>
+        <groupId>org.apache.flink</groupId>
+        <artifactId>flink-connector-kafka</artifactId>
+        <version>{{< version >}}</version>
+    </dependency>
+    
+</dependencies>
+```
+
+Then execute `mvn install` on the command line. 
+
+Projects created from the `Java Project Template`, the `Scala Project Template`, or Gradle are configured
+to automatically include the application dependencies into the application JAR when you run `mvn clean package`.
+For projects that are not set up from those templates, we recommend adding the Maven Shade Plugin to
+build the application jar with all required dependencies.
+
+**Important:** Note that all these core API dependencies should have their scope set to [*provided*](https://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html#dependency-scope). This means that
+they are needed to compile against, but that they should not be packaged into the project's resulting
+application JAR file. If not set to *provided*, the best case scenario is that the resulting JAR
+becomes excessively large, because it also contains all Flink core dependencies. The worst case scenario
+is that the Flink core dependencies that are added to the application's JAR file clash with some of
+your own dependency versions (which is normally avoided through inverted classloading).
+
+To correctly package the dependencies into the application JAR, the Flink API dependencies must 
+be set to the *compile* scope.
+
+## Packaging the application
+
+Depending on your use case, you may need to package your Flink application in different ways before it
+gets deployed to a Flink environment. 
+
+If you want to create a JAR for a Flink Job and use only Flink dependencies without any third-party 
+dependencies (i.e. using the filesystem connector with JSON format), you do not need to create an 
+uber/fat JAR or shade any dependencies.
+
+If you want to create a JAR for a Flink Job and use external dependencies not built into the Flink 
+distribution, you can either add them to the classpath of the distribution or shade them into your 
+uber/fat application JAR.
+
+With the generated uber/fat JAR, you can submit it to a local or remote cluster with:
+
+```sh
+bin/flink run -c org.example.MyJob myFatJar.jar
+```
+
+To learn more about how to deploy Flink jobs, check out the [deployment guide]({{< ref "docs/deployment/cli" >}}).
+
+## Template for creating an uber/fat JAR with dependencies
+
+To build an application JAR that contains all dependencies required for declared connectors and libraries,
+you can use the following shade plugin definition:
+
+```xml

Review comment:
       I think this page is more for detailed/advanced Maven topics. This is a bit duplicated, but I think it should be kept in sync with what we have in the quickstarts. We can also improve it later, after this PR.
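
   For reference, a minimal sketch of a typical Maven Shade Plugin definition of the kind being discussed; this is not necessarily the exact definition used in the Flink quickstarts, and `org.example.MyJob` is an illustrative main class:

   ```xml
   <build>
       <plugins>
           <plugin>
               <groupId>org.apache.maven.plugins</groupId>
               <artifactId>maven-shade-plugin</artifactId>
               <version>3.2.4</version>
               <executions>
                   <execution>
                       <!-- Run shading as part of the package phase -->
                       <phase>package</phase>
                       <goals>
                           <goal>shade</goal>
                       </goals>
                       <configuration>
                           <filters>
                               <filter>
                                   <!-- Drop signature files so the shaded JAR does not fail verification -->
                                   <artifact>*:*</artifact>
                                   <excludes>
                                       <exclude>META-INF/*.SF</exclude>
                                       <exclude>META-INF/*.DSA</exclude>
                                       <exclude>META-INF/*.RSA</exclude>
                                   </excludes>
                               </filter>
                           </filters>
                           <transformers>
                               <!-- Set the entry point in the JAR manifest -->
                               <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                                   <mainClass>org.example.MyJob</mainClass>
                               </transformer>
                           </transformers>
                       </configuration>
                   </execution>
               </executions>
           </plugin>
       </plugins>
   </build>
   ```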







[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30334",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     }, {
       "hash" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30417",
       "triggerID" : "2e955fd5db3754a069a1ed8c48ce2e581aaef51b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30495",
       "triggerID" : "50e221c6373f2774eeb8ec134df27caef83b2d40",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30516",
       "triggerID" : "8e5ae4098896a176d49d4b7319f69b6529412549",
       "triggerType" : "PUSH"
     }, {
       "hash" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30528",
       "triggerID" : "532af8b3b761f7b589500e5ead9b888f2370675a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30537",
       "triggerID" : "3de39d65f624fb6cf075e9e5ddfe005c197ee3dc",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1845471184d68e8edd89fd19a591030290695cf3",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30593",
       "triggerID" : "1845471184d68e8edd89fd19a591030290695cf3",
       "triggerType" : "PUSH"
     }, {
       "hash" : "e33ad31f47e113b480de1c4f6ac6efc40992084b",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30651",
       "triggerID" : "e33ad31f47e113b480de1c4f6ac6efc40992084b",
       "triggerType" : "PUSH"
     }, {
       "hash" : "78c5075dd300c8e74705afbb13b10377da61865a",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30675",
       "triggerID" : "78c5075dd300c8e74705afbb13b10377da61865a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "ddabc7bb39a84b79407d5c9b85de9c83d0959de2",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30708",
       "triggerID" : "ddabc7bb39a84b79407d5c9b85de9c83d0959de2",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 78c5075dd300c8e74705afbb13b10377da61865a Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30675) 
   * ddabc7bb39a84b79407d5c9b85de9c83d0959de2 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30708) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 1bc80c1579db9e7f2d57112fdb31f0190f71db67 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277) 
   * 274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312) 
   * 26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "1a637d30a2332b6c7059be2118a3cb8d40718eb9",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 1bc80c1579db9e7f2d57112fdb31f0190f71db67 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277) 
   * 274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312) 
   * 26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316) 
   * 6f44bc92aca6fc133b10cefd438f99810ce65293 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321) 
   * 1a637d30a2332b6c7059be2118a3cb8d40718eb9 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 1bc80c1579db9e7f2d57112fdb31f0190f71db67 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277) 
   * 274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312) 
   * 26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316) 
   * 6f44bc92aca6fc133b10cefd438f99810ce65293 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 1bc80c1579db9e7f2d57112fdb31f0190f71db67 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277) 
   * 274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312) 
   * 26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316) 
   * 6f44bc92aca6fc133b10cefd438f99810ce65293 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>





[GitHub] [flink] flinkbot edited a comment on pull request #18353: [FLINK-25129][docs]project configuation changes in docs

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18353:
URL: https://github.com/apache/flink/pull/18353#issuecomment-1012981377


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29420",
       "triggerID" : "988fcbec1b60588c4c0cb2249a94f258145f7040",
       "triggerType" : "PUSH"
     }, {
       "hash" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29823",
       "triggerID" : "80fd50ad46865be06e2c83b2470fb3eb2d35cd96",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30207",
       "triggerID" : "d0b2b188c37443b7bbda39af499398326cd56979",
       "triggerType" : "PUSH"
     }, {
       "hash" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277",
       "triggerID" : "1bc80c1579db9e7f2d57112fdb31f0190f71db67",
       "triggerType" : "PUSH"
     }, {
       "hash" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312",
       "triggerID" : "274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f",
       "triggerType" : "PUSH"
     }, {
       "hash" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316",
       "triggerID" : "26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5",
       "triggerType" : "PUSH"
     }, {
       "hash" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321",
       "triggerID" : "6f44bc92aca6fc133b10cefd438f99810ce65293",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 1bc80c1579db9e7f2d57112fdb31f0190f71db67 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30277) 
   * 274e87d5fab469dcfeab1b1eccf99ea6d4d84d3f Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30312) 
   * 26c44fa0b0b49e93bfe0404f5ab6db3fad9bf8a5 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30316) 
   * 6f44bc92aca6fc133b10cefd438f99810ce65293 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=30321) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>

