Posted to commits@flink.apache.org by ja...@apache.org on 2019/08/30 02:13:17 UTC

[flink] branch release-1.9 updated: [hotfix][FLINK-13901][docs] Fix documentation links check errors

This is an automated email from the ASF dual-hosted git repository.

jark pushed a commit to branch release-1.9
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.9 by this push:
     new c175cc4  [hotfix][FLINK-13901][docs] Fix documentation links check errors
c175cc4 is described below

commit c175cc424458fd2425cd8e8463e4f6cb7bd228f5
Author: Jark Wu <wu...@alibaba-inc.com>
AuthorDate: Fri Aug 30 10:11:53 2019 +0800

    [hotfix][FLINK-13901][docs] Fix documentation links check errors
---
 docs/dev/table/config.zh.md | 103 ++++++++++++++++++++++++++++++++++++++++++++
 docs/dev/table/sql.zh.md    |   2 +-
 2 files changed, 104 insertions(+), 1 deletion(-)

diff --git a/docs/dev/table/config.zh.md b/docs/dev/table/config.zh.md
new file mode 100644
index 0000000..fa1f849
--- /dev/null
+++ b/docs/dev/table/config.zh.md
@@ -0,0 +1,103 @@
+---
+title: "配置"
+nav-parent_id: tableapi
+nav-pos: 150
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+
+  http://www.apache.org/licenses/LICENSE-2.0
+
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+By default, the Table & SQL API is preconfigured to produce accurate results with acceptable
+performance.
+
+Depending on the requirements of a table program, it might be necessary to adjust
+certain parameters for optimization. For example, unbounded streaming programs may need to ensure
+that the required state size is capped (see [streaming concepts](./streaming/query_configuration.html)).
+
+* This will be replaced by the TOC
+{:toc}
+
+### Overview
+
+In every table environment, the `TableConfig` offers options for configuring the current session.
+
+For common or important configuration options, the `TableConfig` provides getter and setter methods
+with detailed inline documentation.
+
+For more advanced configuration, users can directly access the underlying key-value map. The following
+sections list all available options that can be used to adjust Flink Table & SQL API programs.
+
+<span class="label label-danger">Attention</span> Because options are read at different points in time
+while operations are performed, it is recommended to set configuration options as early as possible
+after instantiating a table environment.
+
+<div class="codetabs" markdown="1">
+<div data-lang="java" markdown="1">
+{% highlight java %}
+// instantiate table environment
+TableEnvironment tEnv = ...
+
+// access high-level configuration
+Configuration configuration = tEnv.getConfig().getConfiguration();
+
+// set low-level key-value options
+configuration.setString("table.exec.mini-batch.enabled", "true");
+configuration.setString("table.exec.mini-batch.allow-latency", "5 s");
+configuration.setString("table.exec.mini-batch.size", "5000");
+{% endhighlight %}
+</div>
+
+<div data-lang="scala" markdown="1">
+{% highlight scala %}
+// instantiate table environment
+val tEnv: TableEnvironment = ...
+
+// access high-level configuration
+val configuration = tEnv.getConfig.getConfiguration
+
+// set low-level key-value options
+configuration.setString("table.exec.mini-batch.enabled", "true")
+configuration.setString("table.exec.mini-batch.allow-latency", "5 s")
+configuration.setString("table.exec.mini-batch.size", "5000")
+{% endhighlight %}
+</div>
+
+<div data-lang="python" markdown="1">
+{% highlight python %}
+# instantiate table environment
+t_env = ...
+
+# access high-level configuration
+configuration = t_env.get_config().get_configuration()
+
+# set low-level key-value options
+configuration.set_string("table.exec.mini-batch.enabled", "true")
+configuration.set_string("table.exec.mini-batch.allow-latency", "5 s")
+configuration.set_string("table.exec.mini-batch.size", "5000")
+{% endhighlight %}
+</div>
+</div>
+
+<span class="label label-danger">Attention</span> Currently, key-value options are only supported
+for the Blink planner.
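+
+For example, a table environment that uses the Blink planner (and therefore supports these
+key-value options) can be created as follows; this is a minimal sketch that assumes the Blink
+planner dependency (`flink-table-planner-blink`) is on the classpath:
+
+{% highlight java %}
+// build environment settings that select the Blink planner in streaming mode
+EnvironmentSettings settings = EnvironmentSettings
+  .newInstance()
+  .useBlinkPlanner()
+  .inStreamingMode()
+  .build();
+
+// create a table environment backed by the Blink planner
+TableEnvironment tEnv = TableEnvironment.create(settings);
+{% endhighlight %}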
+
+### Execution Options
+
+The following options can be used to tune the performance of the query execution.
+
+{% include generated/execution_config_configuration.html %}
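+
+As a small illustration (assuming the `tEnv` instance from the overview example above), a
+previously configured execution option can also be read back through the same key-value
+configuration:
+
+{% highlight java %}
+// read a low-level option; the second argument is the default returned when the option is unset
+String miniBatchSize = tEnv.getConfig()
+  .getConfiguration()
+  .getString("table.exec.mini-batch.size", "-1");
+{% endhighlight %}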
+
+### Optimizer Options
+
+The following options can be used to adjust the behavior of the query optimizer to get a better execution plan.
+
+{% include generated/optimizer_config_configuration.html %}
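+
+For example, assuming the option name listed in the table above and the `tEnv` instance from the
+overview example, an optimizer option such as join reordering is set like any other key-value
+option:
+
+{% highlight java %}
+// enable join reordering in the query optimizer
+tEnv.getConfig()
+  .getConfiguration()
+  .setString("table.optimizer.join-reorder-enabled", "true");
+{% endhighlight %}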
diff --git a/docs/dev/table/sql.zh.md b/docs/dev/table/sql.zh.md
index 2d7975e..86e9d42 100644
--- a/docs/dev/table/sql.zh.md
+++ b/docs/dev/table/sql.zh.md
@@ -1081,7 +1081,7 @@ If the table does not exist, nothing happens.
 
 ## DDL
 
-DDLs are specified with the `sqlUpdate()` method of the `TableEnvironment`. The method returns nothing for a success table creation. A `Table` can be register into the [Catalog](catalog.html) with a `CREATE TABLE` statement, then be referenced in the SQL queries in method `sqlQuery()` of `TableEnvironment`.
+DDLs are specified with the `sqlUpdate()` method of the `TableEnvironment`. The method returns nothing for a successful table creation. A `Table` can be registered into the [Catalog](catalogs.html) with a `CREATE TABLE` statement and can then be referenced in SQL queries via the `sqlQuery()` method of the `TableEnvironment`.
 
 **Note:** Flink's DDL support is not yet feature complete. Queries that include unsupported SQL features cause a `TableException`. The supported features of SQL DDL on batch and streaming tables are listed in the following sections.
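
For readers following the changed paragraph above, a minimal Java sketch of the described workflow, i.e. registering a table with a `CREATE TABLE` DDL via `sqlUpdate()` and then querying it via `sqlQuery()`; the table name, schema, and connector properties below are illustrative and not part of this commit:

{% highlight java %}
// register a table in the catalog using DDL (connector properties are illustrative)
tEnv.sqlUpdate(
  "CREATE TABLE Orders (" +
  "  product VARCHAR," +
  "  amount INT" +
  ") WITH (" +
  "  'connector.type' = 'filesystem'," +
  "  'connector.path' = '/path/to/orders.csv'," +
  "  'format.type' = 'csv'" +
  ")");

// reference the registered table in a SQL query
Table result = tEnv.sqlQuery(
  "SELECT product, SUM(amount) FROM Orders GROUP BY product");
{% endhighlight %}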