Posted to issues@flink.apache.org by GitBox <gi...@apache.org> on 2022/07/26 09:08:19 UTC

[GitHub] [flink-table-store] JingsongLi opened a new pull request, #241: [FLINK-28689] Optimize Spark documentation to Catalog and Dataset

JingsongLi opened a new pull request, #241:
URL: https://github.com/apache/flink-table-store/pull/241

   - Introduce Dataset API.
   - Unify table_store and tablestore (see the sketch below).
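
As a rough illustration of the unified `tablestore` name (a sketch only, not part of the PR): the catalog class, warehouse path, and table name below are taken from the documentation diff further down in this thread; everything else is illustrative.

```scala
// Sketch only: registering the Table Store Spark catalog under the unified
// name "tablestore" from code rather than via spark-sql flags.
// The catalog class, warehouse path, and table name come from the docs diff
// below; the application name is illustrative.
import org.apache.spark.sql.SparkSession

object TableStoreCatalogSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("table-store-catalog-sketch")
      .config("spark.sql.catalog.tablestore",
        "org.apache.flink.table.store.spark.SparkCatalog")
      .config("spark.sql.catalog.tablestore.warehouse", "file:/tmp/warehouse")
      .getOrCreate()

    // Query an existing Table Store table through the registered catalog.
    spark.sql("SELECT * FROM tablestore.default.myTable").show()

    spark.stop()
  }
}
```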


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [flink-table-store] JingsongLi merged pull request #241: [FLINK-28689] Optimize Spark documentation to Catalog and Dataset

Posted by GitBox <gi...@apache.org>.
JingsongLi merged PR #241:
URL: https://github.com/apache/flink-table-store/pull/241




[GitHub] [flink-table-store] LadyForest commented on a diff in pull request #241: [FLINK-28689] Optimize Spark documentation to Catalog and Dataset

Posted by GitBox <gi...@apache.org>.
LadyForest commented on code in PR #241:
URL: https://github.com/apache/flink-table-store/pull/241#discussion_r929725419


##########
docs/content/docs/engines/spark.md:
##########
@@ -50,42 +50,37 @@ Alternatively, you can copy `flink-table-store-spark-{{< version >}}.jar` under
 
 ## Catalog
 
-The following command registers the Table Store's Spark catalog with the name `table_store`:
+The following command registers the Table Store's Spark catalog with the name `tablestore`:
 
 ```bash
 spark-sql ... \
-    --conf spark.sql.catalog.table_store=org.apache.flink.table.store.spark.SparkCatalog \
-    --conf spark.sql.catalog.table_store.warehouse=file:/tmp/warehouse
+    --conf spark.sql.catalog.tablestore=org.apache.flink.table.store.spark.SparkCatalog \
+    --conf spark.sql.catalog.tablestore.warehouse=file:/tmp/warehouse
 ```
 
 Some extra configurations are needed if your Table Store Catalog uses the Hive
 Metastore (No extra configuration is required for read-only).
 
 ```bash
 spark-sql ... \
-    --conf spark.sql.catalog.table_store=org.apache.flink.table.store.spark.SparkCatalog \
-    --conf spark.sql.catalog.table_store.warehouse=file:/tmp/warehouse \
-    --conf spark.sql.catalog.table_store.metastore=hive \
-    --conf spark.sql.catalog.table_store.uri=thrift://...
+    --conf spark.sql.catalog.tablestore=org.apache.flink.table.store.spark.SparkCatalog \
+    --conf spark.sql.catalog.tablestore.warehouse=file:/tmp/warehouse \
+    --conf spark.sql.catalog.tablestore.metastore=hive \
+    --conf spark.sql.catalog.tablestore.uri=thrift://...
 ```
 
-## Create Temporary View
-
-Use the `CREATE TEMPORARY VIEW` command to create a Spark mapping table on top of
-an existing Table Store table if you don't want to use Table Store Catalog.
+## Query Table
 
 ```sql
-CREATE TEMPORARY VIEW myTable
-USING tablestore
-OPTIONS (
-  path "file:/tmp/warehouse/default.db/myTable"
-)
+SELECT * FROM tablestore.default.myTable;
 ```
 
-## Query Table
+## DataSet
+
+You can load a mapping table as DataSet on top of an existing Table Store table if you don't want to use Table Store Catalog.
 
 ```sql

Review Comment:
   ```suggestion
   ```spark
   ```
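
The suggestion above appears to change the fence language of the new DataSet snippet from `sql` to `spark`, since that section shows Spark code rather than SQL. The snippet itself is cut off in this review excerpt; as a rough sketch only (assuming the `tablestore` data source short name and the example path from the docs above), loading a table as a Dataset without registering the Table Store catalog could look like this:

```scala
// Sketch only; the actual snippet from the PR is not visible in this review
// excerpt. Assumes the "tablestore" data source short name and the example
// path file:/tmp/warehouse/default.db/myTable used elsewhere in the docs.
import org.apache.spark.sql.{DataFrame, SparkSession}

object TableStoreDatasetSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("table-store-dataset-sketch")
      .getOrCreate()

    // Load an existing Table Store table as a Dataset (DataFrame = Dataset[Row])
    // without registering the Table Store catalog.
    val dataset: DataFrame = spark.read
      .format("tablestore")
      .load("file:/tmp/warehouse/default.db/myTable")

    dataset.show()
    spark.stop()
  }
}
```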


