Posted to issues@flink.apache.org by GitBox <gi...@apache.org> on 2022/07/11 11:43:34 UTC

[GitHub] [flink-table-store] JingsongLi opened a new pull request, #208: [FLINK-28483] Basic schema evolution for table store

JingsongLi opened a new pull request, #208:
URL: https://github.com/apache/flink-table-store/pull/208

   Currently, Flink 1.15 does not provide DDL for modifying a table schema, but we can expose the most basic schema modifications, such as adding fields, at the Catalog level and on the Spark read side.
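The simplest evolution described above, appending a field, can be sketched standalone. This is a hedged illustration of the general idea only, not the actual table store read path; `readWithSchema` is a hypothetical helper showing how rows written under an older, narrower schema read back with null padding:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class AddFieldSketch {
    // Pad a row written under an old schema up to the new schema's width;
    // fields added after the row was written read as null.
    static List<Object> readWithSchema(List<Object> row, int newWidth) {
        List<Object> padded = new ArrayList<>(row);
        while (padded.size() < newWidth) {
            padded.add(null);
        }
        return padded;
    }

    public static void main(String[] args) {
        List<Object> oldRow = Arrays.asList(1, "a");   // written with schema (id, name)
        System.out.println(readWithSchema(oldRow, 3)); // read with schema (id, name, age)
    }
}
```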


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [flink-table-store] LadyForest commented on a diff in pull request #208: [FLINK-28483] Basic schema evolution for table store

Posted by GitBox <gi...@apache.org>.
LadyForest commented on code in PR #208:
URL: https://github.com/apache/flink-table-store/pull/208#discussion_r918611551


##########
flink-table-store-connector/src/main/java/org/apache/flink/table/store/connector/FlinkCatalog.java:
##########
@@ -178,40 +206,75 @@ public void createTable(ObjectPath tablePath, CatalogBaseTable table, boolean ig
     public void alterTable(
             ObjectPath tablePath, CatalogBaseTable newTable, boolean ignoreIfNotExists)
             throws TableNotExistException, CatalogException {
+        if (ignoreIfNotExists && !tableExists(tablePath)) {
+            return;
+        }
+
+        CatalogTable table = getTable(tablePath);
+
+        // Currently, Flink SQL only supports altering table properties.
+        validateAlterTable(table, (CatalogTable) newTable);
+
+        List<SchemaChange> changes = new ArrayList<>();
+        Map<String, String> oldProperties = table.getOptions();
+        for (Map.Entry<String, String> entry : newTable.getOptions().entrySet()) {
+            String key = entry.getKey();
+            String value = entry.getValue();
+
+            if (Objects.equals(value, oldProperties.get(key))) {
+                continue;
+            }
+
+            if (PATH.key().equalsIgnoreCase(key)) {
+                throw new IllegalArgumentException("Illegal table path in table options: " + value);
+            }
+
+            changes.add(SchemaChange.setOption(key, value));
+        }
+
+        oldProperties
+                .keySet()
+                .forEach(
+                        k -> {
+                            if (!newTable.getOptions().containsKey(k)) {
+                                changes.add(SchemaChange.removeOption(k));
+                            }
+                        });
+
         try {
-            catalog.alterTable(
-                    tablePath, convertTableToSchema(tablePath, newTable), ignoreIfNotExists);
+            catalog.alterTable(tablePath, changes, ignoreIfNotExists);
         } catch (Catalog.TableNotExistException e) {
             throw new TableNotExistException(getName(), e.tablePath());
         }
     }
 
-    private UpdateSchema convertTableToSchema(ObjectPath tablePath, CatalogBaseTable baseTable) {
-        if (!(baseTable instanceof CatalogTable)) {
-            throw new UnsupportedOperationException(
-                    "Only support CatalogTable, but is: " + baseTable.getClass());
-        }
-        CatalogTable table = (CatalogTable) baseTable;
-        Map<String, String> options = table.getOptions();
-        if (options.containsKey(CONNECTOR.key())) {
-            throw new CatalogException(
-                    String.format(
-                            "Table Store Catalog only supports table store tables, not '%s' connector."
-                                    + " You can create TEMPORARY table instead.",
-                            options.get(CONNECTOR.key())));
+    private static void validateAlterTable(CatalogTable ct1, CatalogTable ct2) {
+        org.apache.flink.table.api.TableSchema ts1 = ct1.getSchema();
+        org.apache.flink.table.api.TableSchema ts2 = ct2.getSchema();
+        boolean equalsPrimary = false;

Review Comment:
   Nit: `pkEquals`? 
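The property-diff logic in the `alterTable` hunk above can be sketched in isolation. This is a standalone illustration; `diffOptions` and the string encoding of changes are hypothetical, while the real code emits `SchemaChange.setOption`/`SchemaChange.removeOption`:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Objects;

public class OptionDiff {
    // Mirror of the PR's diff: changed or new keys become "set" changes,
    // keys absent from the new options become "remove" changes,
    // unchanged keys are skipped.
    static List<String> diffOptions(Map<String, String> oldOpts, Map<String, String> newOpts) {
        List<String> changes = new ArrayList<>();
        for (Map.Entry<String, String> e : newOpts.entrySet()) {
            if (!Objects.equals(e.getValue(), oldOpts.get(e.getKey()))) {
                changes.add("set " + e.getKey() + "=" + e.getValue());
            }
        }
        for (String k : oldOpts.keySet()) {
            if (!newOpts.containsKey(k)) {
                changes.add("remove " + k);
            }
        }
        return changes;
    }

    public static void main(String[] args) {
        Map<String, String> oldOpts = new LinkedHashMap<>();
        oldOpts.put("bucket", "4");
        oldOpts.put("write-buffer-size", "256mb");
        Map<String, String> newOpts = new LinkedHashMap<>();
        newOpts.put("bucket", "4");
        newOpts.put("write-buffer-size", "512mb");
        // unchanged "bucket" is skipped; only the changed value is emitted
        System.out.println(diffOptions(oldOpts, newOpts));
    }
}
```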





[GitHub] [flink-table-store] LadyForest commented on a diff in pull request #208: [FLINK-28483] Basic schema evolution for table store

Posted by GitBox <gi...@apache.org>.
LadyForest commented on code in PR #208:
URL: https://github.com/apache/flink-table-store/pull/208#discussion_r918664592


##########
flink-table-store-spark/src/main/java/org/apache/flink/table/store/spark/SparkTable.java:
##########
@@ -61,4 +62,8 @@ public Set<TableCapability> capabilities() {
         capabilities.add(TableCapability.BATCH_READ);
         return capabilities;
     }
+
+    public TableSchema tableSchema() {

Review Comment:
   This method is never used?





[GitHub] [flink-table-store] JingsongLi merged pull request #208: [FLINK-28483] Basic schema evolution for table store

Posted by GitBox <gi...@apache.org>.
JingsongLi merged PR #208:
URL: https://github.com/apache/flink-table-store/pull/208




[GitHub] [flink-table-store] JingsongLi commented on a diff in pull request #208: [FLINK-28483] Basic schema evolution for table store

Posted by GitBox <gi...@apache.org>.
JingsongLi commented on code in PR #208:
URL: https://github.com/apache/flink-table-store/pull/208#discussion_r917854006


##########
flink-table-store-spark/src/main/java/org/apache/flink/table/store/spark/SparkCatalog.java:
##########
@@ -150,12 +200,12 @@ public Table createTable(
     }
 
     @Override
-    public Table alterTable(Identifier ident, TableChange... changes) {

Review Comment:
   We can add more support for SparkCatalog, like database-related operations and table creation.







[GitHub] [flink-table-store] LadyForest commented on a diff in pull request #208: [FLINK-28483] Basic schema evolution for table store

Posted by GitBox <gi...@apache.org>.
LadyForest commented on code in PR #208:
URL: https://github.com/apache/flink-table-store/pull/208#discussion_r918622451


##########
flink-table-store-connector/src/test/java/org/apache/flink/table/store/connector/SchemaChangeITCase.java:
##########
@@ -0,0 +1,45 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.store.connector;
+
+import org.junit.Test;
+
+import java.util.Map;
+
+import static org.assertj.core.api.Assertions.assertThat;
+
+/** ITCase for schema changes. */
+public class SchemaChangeITCase extends CatalogITCaseBase {

Review Comment:
   Nit: How about adding a TODO to note that this test should cover more cases once Flink supports more ALTER operations?





[GitHub] [flink-table-store] LadyForest commented on a diff in pull request #208: [FLINK-28483] Basic schema evolution for table store

Posted by GitBox <gi...@apache.org>.
LadyForest commented on code in PR #208:
URL: https://github.com/apache/flink-table-store/pull/208#discussion_r918514001


##########
flink-table-store-core/src/main/java/org/apache/flink/table/store/file/schema/SchemaChange.java:
##########
@@ -0,0 +1,206 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.store.file.schema;
+
+import org.apache.flink.table.types.logical.LogicalType;
+
+import javax.annotation.Nullable;
+
+import java.util.Objects;
+
+/** Schema change to table. */
+public interface SchemaChange {

Review Comment:
   > Lacks `updateColumnNullability` and `updateColumnComment`.
   
   What about adding a TODO to track them?





[GitHub] [flink-table-store] JingsongLi commented on a diff in pull request #208: [FLINK-28483] Basic schema evolution for table store

Posted by GitBox <gi...@apache.org>.
JingsongLi commented on code in PR #208:
URL: https://github.com/apache/flink-table-store/pull/208#discussion_r917853557


##########
flink-table-store-core/src/main/java/org/apache/flink/table/store/file/schema/SchemaChange.java:
##########
@@ -0,0 +1,206 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.store.file.schema;
+
+import org.apache.flink.table.types.logical.LogicalType;
+
+import javax.annotation.Nullable;
+
+import java.util.Objects;
+
+/** Schema change to table. */
+public interface SchemaChange {

Review Comment:
   Lacks `updateColumnNullability` and `updateColumnComment`.
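A hedged sketch of what the two missing changes could look like, modeled on the factory style of the existing `SchemaChange` methods (`setOption`, `removeOption`). The class names and fields below are hypothetical illustrations, not the actual flink-table-store API:

```java
public class SchemaChangeSketch {
    interface SchemaChange {}

    // Hypothetical change: flip a column between NULL and NOT NULL.
    static final class UpdateColumnNullability implements SchemaChange {
        final String fieldName;
        final boolean nullable;
        UpdateColumnNullability(String fieldName, boolean nullable) {
            this.fieldName = fieldName;
            this.nullable = nullable;
        }
    }

    // Hypothetical change: replace a column's comment.
    static final class UpdateColumnComment implements SchemaChange {
        final String fieldName;
        final String comment;
        UpdateColumnComment(String fieldName, String comment) {
            this.fieldName = fieldName;
            this.comment = comment;
        }
    }

    // Static factories, following the style of SchemaChange.setOption(...).
    static SchemaChange updateColumnNullability(String field, boolean nullable) {
        return new UpdateColumnNullability(field, nullable);
    }

    static SchemaChange updateColumnComment(String field, String comment) {
        return new UpdateColumnComment(field, comment);
    }

    public static void main(String[] args) {
        SchemaChange c = updateColumnComment("user_id", "primary key of the user table");
        System.out.println(c.getClass().getSimpleName());
    }
}
```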


