Posted to commits@flink.apache.org by lz...@apache.org on 2022/08/19 05:44:06 UTC

[flink-table-store] branch release-0.2 updated: [hotfix] Document HDFS env required

This is an automated email from the ASF dual-hosted git repository.

lzljs3620320 pushed a commit to branch release-0.2
in repository https://gitbox.apache.org/repos/asf/flink-table-store.git


The following commit(s) were added to refs/heads/release-0.2 by this push:
     new 1063dced [hotfix] Document HDFS env required
1063dced is described below

commit 1063dced477ab79ddd86d79ff8a46195f785901d
Author: JingsongLi <lz...@aliyun.com>
AuthorDate: Fri Aug 19 13:43:01 2022 +0800

    [hotfix] Document HDFS env required
---
 docs/content/docs/engines/hive.md   | 4 ++++
 docs/content/docs/engines/spark2.md | 4 ++++
 docs/content/docs/engines/spark3.md | 4 ++++
 3 files changed, 12 insertions(+)

diff --git a/docs/content/docs/engines/hive.md b/docs/content/docs/engines/hive.md
index 07cb7983..a51feb7e 100644
--- a/docs/content/docs/engines/hive.md
+++ b/docs/content/docs/engines/hive.md
@@ -85,6 +85,10 @@ There are several ways to add this jar to Hive.
 * You can create an `auxlib` folder under the root directory of Hive, and copy `flink-table-store-hive-connector-{{< version >}}.jar` into `auxlib`.
 * You can also copy this jar to a path accessible by Hive, then use `add jar /path/to/flink-table-store-hive-connector-{{< version >}}.jar` to enable table store support in Hive. Note that this method is not recommended. If you're using the MR execution engine and running a join statement, you may be faced with the exception `org.apache.hive.com.esotericsoftware.kryo.KryoException: unable to find class`.
 
+{{< hint info >}}
+__Note:__ If you are using HDFS, make sure that the environment variable `HADOOP_HOME` or `HADOOP_CONF_DIR` is set.
+{{< /hint >}}
+
 ## Using Table Store Hive Catalog
 
 By using the table store Hive catalog, you can create, drop, and insert into table store tables from Flink. These operations directly affect the corresponding Hive metastore. Tables created in this way can also be accessed directly from Hive.
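
A minimal shell sketch of what the added hint plus the jar setup above amount to, assuming a Hadoop client configuration under /etc/hadoop/conf and a Hive root of /opt/hive (both paths hypothetical):

    # Either variable lets the connector locate the HDFS client configuration;
    # set whichever matches your deployment.
    export HADOOP_CONF_DIR=/etc/hadoop/conf   # directory holding core-site.xml / hdfs-site.xml
    # export HADOOP_HOME=/opt/hadoop          # or: a full Hadoop installation

    # The jar placement recommended by the docs above: an auxlib folder
    # under the root directory of Hive.
    mkdir -p /opt/hive/auxlib
    cp flink-table-store-hive-connector-{{< version >}}.jar /opt/hive/auxlib/
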
diff --git a/docs/content/docs/engines/spark2.md b/docs/content/docs/engines/spark2.md
index f97b1d1c..863ca5bf 100644
--- a/docs/content/docs/engines/spark2.md
+++ b/docs/content/docs/engines/spark2.md
@@ -48,6 +48,10 @@ spark-shell ... --jars flink-table-store-spark2-{{< version >}}.jar
 
 Alternatively, you can copy `flink-table-store-spark2-{{< version >}}.jar` under `spark/jars` in your Spark installation.
 
+{{< hint info >}}
+__Note:__ If you are using HDFS, make sure that the environment variable `HADOOP_HOME` or `HADOOP_CONF_DIR` is set.
+{{< /hint >}}
+
 ## Read
 
 Table store with Spark 2.4 does not support DDL; you can use the Dataset reader
diff --git a/docs/content/docs/engines/spark3.md b/docs/content/docs/engines/spark3.md
index 7dea8380..a48b4fc0 100644
--- a/docs/content/docs/engines/spark3.md
+++ b/docs/content/docs/engines/spark3.md
@@ -48,6 +48,10 @@ spark-sql ... --jars flink-table-store-spark-{{< version >}}.jar
 
 Alternatively, you can copy `flink-table-store-spark-{{< version >}}.jar` under `spark/jars` in your Spark installation.
 
+{{< hint info >}}
+__Note:__ If you are using HDFS, make sure that the environment variable `HADOOP_HOME` or `HADOOP_CONF_DIR` is set.
+{{< /hint >}}
+
 ## Catalog
 
 The following command registers the Table Store's Spark catalog with the name `tablestore`:
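
(The catalog-registration command itself falls outside the quoted hunk context.) The same HDFS hint was added to all three engine pages; for Spark the environment setup looks like the sketch below, assuming the client configuration lives under /etc/hadoop/conf (hypothetical path). The spark-sql invocation is the one already shown in this diff; Spark 2 is identical except that it uses spark-shell and the spark2 jar.

    # Point Spark at the HDFS client configuration before launching:
    export HADOOP_CONF_DIR=/etc/hadoop/conf   # or: export HADOOP_HOME=/opt/hadoop

    # Then start the session with the table store jar, as documented above:
    spark-sql ... --jars flink-table-store-spark-{{< version >}}.jar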