You are viewing a plain text version of this content. The canonical link for it is here.
Posted to issues@flink.apache.org by GitBox <gi...@apache.org> on 2022/01/05 15:45:49 UTC

[GitHub] [flink] twalthr opened a new pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

twalthr opened a new pull request #18274:
URL: https://github.com/apache/flink/pull/18274


   ## What is the purpose of the change
   
   This refactors the serialization for types in the JSON plan. It enables both `DataType` and `LogicalType` for all non-legacy flavors.
   
   Also data views are represented as regular structured types with custom conversion class if necessary.
   
   ## Brief change log
   
   - Use official serializable representation if possible (`LogicalType.asSerializableString`) for compact JSON
   - Omit default values for compact JSON
   - Introduce a data type serializer for data views and other use cases of FLIP-190
   
   ## Verifying this change
   
   This change is already covered by existing tests.
   
   ## Does this pull request potentially affect one of the following parts:
   
     - Dependencies (does it add or upgrade a dependency): no
     - The public API, i.e., is any changed class annotated with `@Public(Evolving)`: yes
     - The serializers: no
     - The runtime per-record code paths (performance sensitive): no
     - Anything that affects deployment or recovery: JobManager (and its components), Checkpointing, Kubernetes/Yarn, ZooKeeper: no
     - The S3 file system connector: no
   
   ## Documentation
   
     - Does this pull request introduce a new feature? yes
     - If yes, how is the feature documented? JavaDocs
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18274:
URL: https://github.com/apache/flink/pull/18274#issuecomment-1005845834


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=28985",
       "triggerID" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29041",
       "triggerID" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29092",
       "triggerID" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "triggerType" : "PUSH"
     }, {
       "hash" : "718ea97838ff10191544cc7460d3c18380b0e119",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29112",
       "triggerID" : "718ea97838ff10191544cc7460d3c18380b0e119",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 9f800b453598a8cbf015583929e5fabda9fedbf6 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29092) 
   * 718ea97838ff10191544cc7460d3c18380b0e119 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29112) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18274:
URL: https://github.com/apache/flink/pull/18274#issuecomment-1005845834


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=28985",
       "triggerID" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29041",
       "triggerID" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 8b30dbb1acd4bffe0c4c5d25b669705deb19463e Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29041) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] slinkydeveloper commented on a change in pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
slinkydeveloper commented on a change in pull request #18274:
URL: https://github.com/apache/flink/pull/18274#discussion_r780310451



##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/FlinkSerializationProvider.java
##########
@@ -0,0 +1,54 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.SerializationConfig;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.SerializerProvider;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ser.DefaultSerializerProvider;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ser.SerializerFactory;
+
+/** {@link SerializerProvider} that offers a Flink-specific {@link SerdeContext}. */
+class FlinkSerializationProvider extends DefaultSerializerProvider {

Review comment:
       You don't need this one anymore

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/DataTypeJsonDeserializer.java
##########
@@ -0,0 +1,190 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.annotation.Internal;
+import org.apache.flink.table.api.DataTypes;
+import org.apache.flink.table.api.TableException;
+import org.apache.flink.table.types.CollectionDataType;
+import org.apache.flink.table.types.DataType;
+import org.apache.flink.table.types.FieldsDataType;
+import org.apache.flink.table.types.KeyValueDataType;
+import org.apache.flink.table.types.extraction.ExtractionUtils;
+import org.apache.flink.table.types.logical.DistinctType;
+import org.apache.flink.table.types.logical.LogicalType;
+import org.apache.flink.table.types.logical.MapType;
+import org.apache.flink.table.types.logical.utils.LogicalTypeChecks;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonParser;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.JsonNode;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.StdDeserializer;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ArrayNode;
+
+import javax.annotation.Nullable;
+
+import java.io.IOException;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.stream.Collectors;
+import java.util.stream.IntStream;
+
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_CONVERSION_CLASS;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_ELEMENT_CLASS;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_FIELDS;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_FIELD_NAME;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_KEY_CLASS;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_TYPE;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_VALUE_CLASS;
+
+/**
+ * JSON deserializer for {@link DataType}.
+ *
+ * @see DataTypeJsonSerializer for the reverse operation
+ */
+@Internal
+public class DataTypeJsonDeserializer extends StdDeserializer<DataType> {
+
+    public DataTypeJsonDeserializer() {
+        super(DataType.class);
+    }
+
+    @Override
+    public DataType deserialize(JsonParser jsonParser, DeserializationContext ctx)
+            throws IOException {
+        final JsonNode dataTypeNode = jsonParser.readValueAsTree();
+        final SerdeContext serdeContext = SerdeContext.get(ctx);
+        return deserialize(dataTypeNode, serdeContext);
+    }
+
+    public static DataType deserialize(JsonNode dataTypeNode, SerdeContext serdeContext) {
+        if (dataTypeNode.isTextual()) {
+            return deserializeWithInternalClass(dataTypeNode, serdeContext);
+        } else {
+            return deserializeWithExternalClass(dataTypeNode, serdeContext);
+        }
+    }
+
+    private static DataType deserializeWithInternalClass(
+            JsonNode logicalTypeNode, SerdeContext serdeContext) {
+        final LogicalType logicalType =
+                LogicalTypeJsonDeserializer.deserialize(logicalTypeNode, serdeContext);
+        return DataTypes.of(logicalType).toInternal();
+    }
+
+    private static DataType deserializeWithExternalClass(
+            JsonNode dataTypeNode, SerdeContext serdeContext) {
+        final LogicalType logicalType =
+                LogicalTypeJsonDeserializer.deserialize(
+                        dataTypeNode.get(FIELD_NAME_TYPE), serdeContext);
+        return deserializeClass(logicalType, dataTypeNode, serdeContext);
+    }
+
+    private static DataType deserializeClass(
+            LogicalType logicalType, @Nullable JsonNode classNode, SerdeContext serdeContext) {
+        if (classNode == null) {
+            return DataTypes.of(logicalType).toInternal();
+        }
+
+        final DataType dataType;
+        switch (logicalType.getTypeRoot()) {
+            case ARRAY:
+            case MULTISET:
+                final DataType elementDataType =
+                        deserializeClass(
+                                logicalType.getChildren().get(0),
+                                classNode.get(FIELD_NAME_ELEMENT_CLASS),
+                                serdeContext);
+                dataType = new CollectionDataType(logicalType, elementDataType);
+                break;
+
+            case MAP:
+                final MapType mapType = (MapType) logicalType;
+                final DataType keyDataType =
+                        deserializeClass(
+                                mapType.getKeyType(),
+                                classNode.get(FIELD_NAME_KEY_CLASS),
+                                serdeContext);
+                final DataType valueDataType =
+                        deserializeClass(
+                                mapType.getValueType(),
+                                classNode.get(FIELD_NAME_VALUE_CLASS),
+                                serdeContext);
+                dataType = new KeyValueDataType(mapType, keyDataType, valueDataType);
+                break;
+
+            case ROW:
+            case STRUCTURED_TYPE:
+                final List<String> fieldNames = LogicalTypeChecks.getFieldNames(logicalType);
+                final List<LogicalType> fieldTypes = LogicalTypeChecks.getFieldTypes(logicalType);
+
+                final ArrayNode fieldNodes = (ArrayNode) classNode.get(FIELD_NAME_FIELDS);
+                final Map<String, JsonNode> fieldNodesByName = new HashMap<>();
+                if (fieldNodes != null) {
+                    fieldNodes.forEach(
+                            fieldNode ->
+                                    fieldNodesByName.put(
+                                            fieldNode.get(FIELD_NAME_FIELD_NAME).asText(),
+                                            fieldNode));
+                }
+
+                final List<DataType> fieldDataTypes =
+                        IntStream.range(0, fieldNames.size())
+                                .mapToObj(
+                                        i -> {
+                                            final String fieldName = fieldNames.get(i);
+                                            final LogicalType fieldType = fieldTypes.get(i);
+                                            return deserializeClass(
+                                                    fieldType,
+                                                    fieldNodesByName.get(fieldName),
+                                                    serdeContext);
+                                        })
+                                .collect(Collectors.toList());
+
+                dataType = new FieldsDataType(logicalType, fieldDataTypes);
+                break;
+
+            case DISTINCT_TYPE:
+                final DistinctType distinctType = (DistinctType) logicalType;
+                dataType = deserializeClass(distinctType.getSourceType(), classNode, serdeContext);
+                break;
+
+            default:
+                dataType = DataTypes.of(logicalType).toInternal();
+        }
+
+        final Class<?> conversionClass =
+                loadClass(
+                        classNode.get(FIELD_NAME_CONVERSION_CLASS).asText(),
+                        serdeContext,
+                        String.format("conversion class of data type '%s'", dataType));
+        return dataType.bridgedTo(conversionClass);
+    }
+
+    private static Class<?> loadClass(
+            String className, SerdeContext serdeContext, String explanation) {
+        try {
+            return ExtractionUtils.classForName(className, true, serdeContext.getClassLoader());
+        } catch (ClassNotFoundException e) {
+            throw new TableException(
+                    String.format("Could not load class '%s' for %s.", className, explanation));

Review comment:
       Can you propagate `e` in the exception cause?

##########
File path: flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/serde/LogicalTypeJsonSerdeTest.java
##########
@@ -249,28 +331,44 @@ public void testLogicalTypeSerde() throws IOException {
                                         ObjectIdentifier.of("cat", "db", "distinctType"),
                                         new VarCharType(false, 5))
                                 .build(),
+                        // custom RawType
+                        new RawType<>(Integer.class, IntSerializer.INSTANCE),

Review comment:
       Can you add a test case here for a "real" raw type? For example take `RAW(LocalDateTime.class, LocalDateTimeSerializer.INSTANCE)` (I use it in `CastRulesTest`)

##########
File path: flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/serde/LogicalTypeJsonSerdeTest.java
##########
@@ -60,68 +58,142 @@
 import org.apache.flink.table.types.logical.TimestampKind;
 import org.apache.flink.table.types.logical.TimestampType;
 import org.apache.flink.table.types.logical.TinyIntType;
-import org.apache.flink.table.types.logical.TypeInformationRawType;
 import org.apache.flink.table.types.logical.VarBinaryType;
 import org.apache.flink.table.types.logical.VarCharType;
 import org.apache.flink.table.types.logical.YearMonthIntervalType;
 import org.apache.flink.table.types.logical.ZonedTimestampType;
-import org.apache.flink.table.types.utils.DataTypeUtils;
+import org.apache.flink.table.types.utils.DataTypeFactoryMock;
+import org.apache.flink.table.utils.CatalogManagerMocks;
+import org.apache.flink.types.Row;
 
-import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonGenerator;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonProcessingException;
 import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectReader;
 import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectWriter;
 
 import org.junit.Test;
-import org.junit.runner.RunWith;
-import org.junit.runners.Parameterized;
+import org.junit.jupiter.params.ParameterizedTest;
+import org.junit.jupiter.params.provider.MethodSource;
+import org.junit.runners.Parameterized.Parameters;
 
 import java.io.IOException;
-import java.io.StringWriter;
 import java.util.ArrayList;
 import java.util.Arrays;
 import java.util.Collections;
 import java.util.List;
+import java.util.Optional;
 
-import static org.apache.flink.table.types.utils.LogicalTypeDataTypeConverter.toDataType;
-import static org.apache.flink.table.types.utils.LogicalTypeDataTypeConverter.toLogicalType;
-import static org.junit.Assert.assertEquals;
+import static org.apache.flink.core.testutils.FlinkAssertions.anyCauseMatches;
+import static org.apache.flink.table.api.config.TableConfigOptions.CatalogPlanCompilation.ALL;
+import static org.apache.flink.table.api.config.TableConfigOptions.CatalogPlanCompilation.IDENTIFIER;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerdeTest.configuredSerdeContext;
+import static org.apache.flink.table.utils.CatalogManagerMocks.preparedCatalogManager;
+import static org.assertj.core.api.Assertions.assertThat;
+import static org.assertj.core.api.Assertions.assertThatThrownBy;
 
 /** Tests for {@link LogicalType} serialization and deserialization. */
-@RunWith(Parameterized.class)
-public class LogicalTypeSerdeTest {
+public class LogicalTypeJsonSerdeTest {
 
-    @Parameterized.Parameter public LogicalType logicalType;
+    @ParameterizedTest
+    @MethodSource("testLogicalTypeSerde")
+    public void testLogicalTypeSerde(LogicalType logicalType) throws IOException {
+        final SerdeContext serdeContext = configuredSerdeContext();
+
+        final String json = toJson(serdeContext, logicalType);
+        final LogicalType actual = toLogicalType(serdeContext, json);
+
+        assertThat(actual).isEqualTo(logicalType);
+    }
 
     @Test
-    public void testLogicalTypeSerde() throws IOException {
-        SerdeContext serdeCtx =
-                new SerdeContext(
-                        new FlinkContextImpl(
-                                false,
-                                TableConfig.getDefault(),
-                                new ModuleManager(),
-                                null,
-                                null,
-                                null),
-                        Thread.currentThread().getContextClassLoader(),
-                        FlinkTypeFactory.INSTANCE(),
-                        FlinkSqlOperatorTable.instance());
-        ObjectReader objectReader = JsonSerdeUtil.createObjectReader(serdeCtx);
-        ObjectWriter objectWriter = JsonSerdeUtil.createObjectWriter(serdeCtx);
-
-        StringWriter writer = new StringWriter(100);
-        try (JsonGenerator gen = objectWriter.getFactory().createGenerator(writer)) {
-            gen.writeObject(logicalType);
-        }
-        String json = writer.toString();
-        LogicalType actual = objectReader.readValue(json, LogicalType.class);
-        assertEquals(logicalType, actual);
-        assertEquals(logicalType.asSummaryString(), actual.asSummaryString());
+    public void testIdentifierSerde() {
+        final DataTypeFactoryMock dataTypeFactoryMock = new DataTypeFactoryMock();
+        final TableConfig tableConfig = TableConfig.getDefault();
+        final Configuration config = tableConfig.getConfiguration();
+        final CatalogManager catalogManager =
+                preparedCatalogManager().dataTypeFactory(dataTypeFactoryMock).build();
+        final SerdeContext serdeContext = configuredSerdeContext(catalogManager, tableConfig);
+
+        // minimal plan content
+        config.set(TableConfigOptions.PLAN_COMPILE_CATALOG_OBJECTS, IDENTIFIER);
+        final String minimalJson = toJson(serdeContext, STRUCTURED_TYPE);
+        assertThat(minimalJson).isEqualTo("\"`default_catalog`.`default_database`.`MyType`\"");
+
+        // catalog lookup with miss
+        config.set(
+                TableConfigOptions.PLAN_RESTORE_CATALOG_OBJECTS,
+                TableConfigOptions.CatalogPlanRestore.IDENTIFIER);
+        dataTypeFactoryMock.logicalType = Optional.empty();
+        assertThatThrownBy(() -> toLogicalType(serdeContext, minimalJson))
+                .satisfies(anyCauseMatches(ValidationException.class, "No type found."));
+
+        // catalog lookup
+        config.set(
+                TableConfigOptions.PLAN_RESTORE_CATALOG_OBJECTS,
+                TableConfigOptions.CatalogPlanRestore.IDENTIFIER);
+        dataTypeFactoryMock.logicalType = Optional.of(STRUCTURED_TYPE);
+        assertThat(toLogicalType(serdeContext, minimalJson)).isEqualTo(STRUCTURED_TYPE);
+
+        // maximum plan content
+        config.set(TableConfigOptions.PLAN_COMPILE_CATALOG_OBJECTS, ALL);
+        final String maximumJson = toJson(serdeContext, STRUCTURED_TYPE);
+        assertThat(maximumJson)
+                .isEqualTo(
+                        "{\"type\":\"STRUCTURED_TYPE\","
+                                + "\"objectIdentifier\":"
+                                + "{\"catalogName\":\"default_catalog\","
+                                + "\"databaseName\":\"default_database\","
+                                + "\"tableName\":\"MyType\"},"
+                                + "\"description\":\"My original type.\","
+                                + "\"attributes\":[]}");

Review comment:
       Let's avoid these tests, but instead execute equalities on `ObjectNode`




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18274:
URL: https://github.com/apache/flink/pull/18274#issuecomment-1005845834


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=28985",
       "triggerID" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29041",
       "triggerID" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 8b30dbb1acd4bffe0c4c5d25b669705deb19463e Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29041) 
   * 9f800b453598a8cbf015583929e5fabda9fedbf6 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18274:
URL: https://github.com/apache/flink/pull/18274#issuecomment-1005845834


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=28985",
       "triggerID" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29041",
       "triggerID" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29092",
       "triggerID" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "triggerType" : "PUSH"
     }, {
       "hash" : "718ea97838ff10191544cc7460d3c18380b0e119",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29112",
       "triggerID" : "718ea97838ff10191544cc7460d3c18380b0e119",
       "triggerType" : "PUSH"
     }, {
       "hash" : "c445781fa99453f887f9d91978e5bb9ca9d1f91a",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29114",
       "triggerID" : "c445781fa99453f887f9d91978e5bb9ca9d1f91a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a872e8c5d8aa822a0358d859a462c979a4965750",
       "status" : "CANCELED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29181",
       "triggerID" : "a872e8c5d8aa822a0358d859a462c979a4965750",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ce106cfbce02a10dcddc3a02ddb56a2183a763",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29194",
       "triggerID" : "d9ce106cfbce02a10dcddc3a02ddb56a2183a763",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * a872e8c5d8aa822a0358d859a462c979a4965750 Azure: [CANCELED](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29181) 
   * d9ce106cfbce02a10dcddc3a02ddb56a2183a763 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29194) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18274:
URL: https://github.com/apache/flink/pull/18274#issuecomment-1005845834


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=28985",
       "triggerID" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29041",
       "triggerID" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29092",
       "triggerID" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 9f800b453598a8cbf015583929e5fabda9fedbf6 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29092) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot commented on pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
flinkbot commented on pull request #18274:
URL: https://github.com/apache/flink/pull/18274#issuecomment-1005845834


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * dd2c1d149708b916cb05bd2b0580015ae2e1f889 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18274:
URL: https://github.com/apache/flink/pull/18274#issuecomment-1005845834


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=28985",
       "triggerID" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29041",
       "triggerID" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29092",
       "triggerID" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "triggerType" : "PUSH"
     }, {
       "hash" : "718ea97838ff10191544cc7460d3c18380b0e119",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29112",
       "triggerID" : "718ea97838ff10191544cc7460d3c18380b0e119",
       "triggerType" : "PUSH"
     }, {
       "hash" : "c445781fa99453f887f9d91978e5bb9ca9d1f91a",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29114",
       "triggerID" : "c445781fa99453f887f9d91978e5bb9ca9d1f91a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a872e8c5d8aa822a0358d859a462c979a4965750",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "a872e8c5d8aa822a0358d859a462c979a4965750",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * c445781fa99453f887f9d91978e5bb9ca9d1f91a Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29114) 
   * a872e8c5d8aa822a0358d859a462c979a4965750 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] slinkydeveloper commented on a change in pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
slinkydeveloper commented on a change in pull request #18274:
URL: https://github.com/apache/flink/pull/18274#discussion_r779451037



##########
File path: flink-table/flink-table-common/src/main/java/org/apache/flink/table/api/DataTypes.java
##########
@@ -104,6 +106,9 @@
      * @see LogicalType#getDefaultConversion()
      */
     public static DataType of(LogicalType logicalType) {
+        Preconditions.checkArgument(
+                !LogicalTypeChecks.hasNested(logicalType, t -> t.is(LogicalTypeRoot.UNRESOLVED)),
+                "Unresolved logical types cannot be used to create a data type at this location.");
         return TypeConversions.fromLogicalToDataType(logicalType);

Review comment:
       I think we should do this check inside `fromLogicalToDataType`?

##########
File path: flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/utils/LogicalTypeDataTypeConverter.java
##########
@@ -201,9 +200,29 @@ public DataType visit(RowType rowType) {
 
         @Override
         public DataType visit(DistinctType distinctType) {
-            return new FieldsDataType(
-                    distinctType,
-                    Collections.singletonList(distinctType.getSourceType().accept(this)));
+            final DataType sourceDataType = distinctType.getSourceType().accept(this);
+            if (sourceDataType instanceof AtomicDataType) {
+                return new AtomicDataType(distinctType, sourceDataType.getConversionClass());
+            } else if (sourceDataType instanceof CollectionDataType) {
+                final CollectionDataType collectionDataType = (CollectionDataType) sourceDataType;
+                return new CollectionDataType(
+                        distinctType,
+                        collectionDataType.getConversionClass(),
+                        collectionDataType.getElementDataType());
+            } else if (sourceDataType instanceof KeyValueDataType) {
+                final KeyValueDataType keyValueDataType = (KeyValueDataType) sourceDataType;
+                return new KeyValueDataType(
+                        distinctType,
+                        keyValueDataType.getConversionClass(),
+                        keyValueDataType.getKeyDataType(),
+                        keyValueDataType.getValueDataType());
+            } else if (sourceDataType instanceof FieldsDataType) {
+                return new FieldsDataType(
+                        distinctType,
+                        sourceDataType.getConversionClass(),
+                        sourceDataType.getChildren());
+            }
+            throw new IllegalStateException("Unexpected data type instance.");

Review comment:
       Why `distinctType.getSourceType().accept(this)` isn't enough? Why preserving the `distinctType`? If we need to preserve distinct type, then shouldn't exist some DataType implementation corresponding to distinct type?

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/DataTypeJsonDeserializer.java
##########
@@ -0,0 +1,191 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.annotation.Internal;
+import org.apache.flink.table.api.DataTypes;
+import org.apache.flink.table.api.TableException;
+import org.apache.flink.table.types.CollectionDataType;
+import org.apache.flink.table.types.DataType;
+import org.apache.flink.table.types.FieldsDataType;
+import org.apache.flink.table.types.KeyValueDataType;
+import org.apache.flink.table.types.logical.DistinctType;
+import org.apache.flink.table.types.logical.LogicalType;
+import org.apache.flink.table.types.logical.MapType;
+import org.apache.flink.table.types.logical.utils.LogicalTypeChecks;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonParser;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.JsonNode;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.StdDeserializer;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ArrayNode;
+
+import org.apache.commons.lang3.ClassUtils;
+
+import javax.annotation.Nullable;
+
+import java.io.IOException;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.stream.Collectors;
+import java.util.stream.IntStream;
+
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_CONVERSION_CLASS;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_ELEMENT_CLASS;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_FIELDS;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_FIELD_NAME;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_KEY_CLASS;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_TYPE;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_VALUE_CLASS;
+
+/**
+ * JSON deserializer for {@link DataType}.
+ *
+ * @see DataTypeJsonSerializer for the reverse operation
+ */
+@Internal
+public class DataTypeJsonDeserializer extends StdDeserializer<DataType> {
+
+    public DataTypeJsonDeserializer() {
+        super(DataType.class);
+    }
+
+    @Override
+    public DataType deserialize(JsonParser jsonParser, DeserializationContext ctx)
+            throws IOException {
+        final JsonNode dataTypeNode = jsonParser.readValueAsTree();
+        final SerdeContext serdeContext = SerdeContext.from(ctx);
+        return deserialize(dataTypeNode, serdeContext);
+    }
+
+    public static DataType deserialize(JsonNode dataTypeNode, SerdeContext serdeContext) {
+        if (dataTypeNode.isTextual()) {
+            return deserializeWithInternalClass(dataTypeNode, serdeContext);
+        } else {
+            return deserializeWithExternalClass(dataTypeNode, serdeContext);
+        }
+    }
+
+    private static DataType deserializeWithInternalClass(
+            JsonNode logicalTypeNode, SerdeContext serdeContext) {
+        final LogicalType logicalType =
+                LogicalTypeJsonDeserializer.deserialize(logicalTypeNode, serdeContext);
+        return DataTypes.of(logicalType).toInternal();
+    }
+
+    private static DataType deserializeWithExternalClass(
+            JsonNode dataTypeNode, SerdeContext serdeContext) {
+        final LogicalType logicalType =
+                LogicalTypeJsonDeserializer.deserialize(
+                        dataTypeNode.get(FIELD_NAME_TYPE), serdeContext);
+        return deserializeClass(logicalType, dataTypeNode, serdeContext);
+    }
+
+    private static DataType deserializeClass(
+            LogicalType logicalType, @Nullable JsonNode classNode, SerdeContext serdeContext) {
+        if (classNode == null) {
+            return DataTypes.of(logicalType).toInternal();
+        }
+
+        final DataType dataType;
+        switch (logicalType.getTypeRoot()) {
+            case ARRAY:
+            case MULTISET:
+                final DataType elementDataType =
+                        deserializeClass(
+                                logicalType.getChildren().get(0),
+                                classNode.get(FIELD_NAME_ELEMENT_CLASS),
+                                serdeContext);
+                dataType = new CollectionDataType(logicalType, elementDataType);
+                break;
+
+            case MAP:
+                final MapType mapType = (MapType) logicalType;
+                final DataType keyDataType =
+                        deserializeClass(
+                                mapType.getKeyType(),
+                                classNode.get(FIELD_NAME_KEY_CLASS),
+                                serdeContext);
+                final DataType valueDataType =
+                        deserializeClass(
+                                mapType.getValueType(),
+                                classNode.get(FIELD_NAME_VALUE_CLASS),
+                                serdeContext);
+                dataType = new KeyValueDataType(mapType, keyDataType, valueDataType);
+                break;
+
+            case ROW:
+            case STRUCTURED_TYPE:
+                final List<String> fieldNames = LogicalTypeChecks.getFieldNames(logicalType);
+                final List<LogicalType> fieldTypes = LogicalTypeChecks.getFieldTypes(logicalType);
+
+                final ArrayNode fieldNodes = (ArrayNode) classNode.get(FIELD_NAME_FIELDS);
+                final Map<String, JsonNode> fieldNodesByName = new HashMap<>();
+                if (fieldNodes != null) {
+                    fieldNodes.forEach(
+                            fieldNode ->
+                                    fieldNodesByName.put(
+                                            fieldNode.get(FIELD_NAME_FIELD_NAME).asText(),
+                                            fieldNode));
+                }
+
+                final List<DataType> fieldDataTypes =
+                        IntStream.range(0, fieldNames.size())
+                                .mapToObj(
+                                        i -> {
+                                            final String fieldName = fieldNames.get(i);
+                                            final LogicalType fieldType = fieldTypes.get(i);
+                                            return deserializeClass(
+                                                    fieldType,
+                                                    fieldNodesByName.get(fieldName),
+                                                    serdeContext);
+                                        })
+                                .collect(Collectors.toList());
+
+                dataType = new FieldsDataType(logicalType, fieldDataTypes);
+                break;
+
+            case DISTINCT_TYPE:
+                final DistinctType distinctType = (DistinctType) logicalType;
+                dataType = deserializeClass(distinctType.getSourceType(), classNode, serdeContext);
+                break;
+
+            default:
+                dataType = DataTypes.of(logicalType).toInternal();
+        }
+
+        final Class<?> conversionClass =
+                loadClass(
+                        classNode.get(FIELD_NAME_CONVERSION_CLASS).asText(),
+                        serdeContext,
+                        String.format("conversion class of data type '%s'", dataType));
+        return dataType.bridgedTo(conversionClass);
+    }
+
+    private static Class<?> loadClass(
+            String className, SerdeContext serdeContext, String explanation) {
+        try {
+            return ClassUtils.getClass(serdeContext.getClassLoader(), className, true);

Review comment:
       Can you avoid using this utility from commons?

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/DataTypeJsonSerializer.java
##########
@@ -0,0 +1,165 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.annotation.Internal;
+import org.apache.flink.table.api.DataTypes.Field;
+import org.apache.flink.table.types.CollectionDataType;
+import org.apache.flink.table.types.DataType;
+import org.apache.flink.table.types.KeyValueDataType;
+import org.apache.flink.table.types.logical.LogicalType;
+import org.apache.flink.table.types.utils.DataTypeUtils;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonGenerator;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.SerializerProvider;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ser.std.StdSerializer;
+
+import java.io.IOException;
+import java.util.List;
+import java.util.stream.Collectors;
+
+/**
+ * JSON serializer for {@link LogicalType}.
+ *
+ * @see DataTypeJsonDeserializer for the reverse operation
+ */
+@Internal
+public final class DataTypeJsonSerializer extends StdSerializer<DataType> {
+    private static final long serialVersionUID = 1L;
+
+    /*
+    Example generated JSON for
+

Review comment:
       Can you add an example for a DataType not bridged, the ones using only the text representation of the `LogicalType`?

##########
File path: flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/serde/LogicalTypeCatalogJsonSerdeTest.java
##########
@@ -0,0 +1,124 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.configuration.Configuration;
+import org.apache.flink.table.api.TableConfig;
+import org.apache.flink.table.api.ValidationException;
+import org.apache.flink.table.api.config.TableConfigOptions;
+import org.apache.flink.table.api.config.TableConfigOptions.CatalogPlanRestore;
+import org.apache.flink.table.catalog.CatalogManager;
+import org.apache.flink.table.catalog.ObjectIdentifier;
+import org.apache.flink.table.types.logical.StructuredType;
+import org.apache.flink.table.types.logical.UserDefinedType;
+import org.apache.flink.table.types.utils.DataTypeFactoryMock;
+import org.apache.flink.table.utils.CatalogManagerMocks;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper;
+
+import org.junit.Test;

Review comment:
       Same, use JUnit 5

##########
File path: flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/serde/LogicalTypeJsonSerdeTest.java
##########
@@ -60,20 +53,21 @@
 import org.apache.flink.table.types.logical.TimestampKind;
 import org.apache.flink.table.types.logical.TimestampType;
 import org.apache.flink.table.types.logical.TinyIntType;
-import org.apache.flink.table.types.logical.TypeInformationRawType;
 import org.apache.flink.table.types.logical.VarBinaryType;
 import org.apache.flink.table.types.logical.VarCharType;
 import org.apache.flink.table.types.logical.YearMonthIntervalType;
 import org.apache.flink.table.types.logical.ZonedTimestampType;
-import org.apache.flink.table.types.utils.DataTypeUtils;
+import org.apache.flink.types.Row;
 
 import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonGenerator;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonProcessingException;
 import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper;
-import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.module.SimpleModule;
 
 import org.junit.Test;
 import org.junit.runner.RunWith;
 import org.junit.runners.Parameterized;
+import org.junit.runners.Parameterized.Parameter;
+import org.junit.runners.Parameterized.Parameters;

Review comment:
       JUnit 5

##########
File path: flink-table/flink-table-planner/src/test/resources/jsonplan/testGetJsonPlan.out
##########
@@ -21,21 +21,7 @@
             }
          },
          "id":1,
-         "outputType":{
-            "type":"ROW",
-            "nullable":true,
-            "fields":[
-               {
-                  "a":"BIGINT"
-               },
-               {
-                  "b":"INT"
-               },
-               {
-                  "c":"VARCHAR(2147483647)"
-               }
-            ]
-         },
+         "outputType":"ROW<`a` BIGINT, `b` INT, `c` VARCHAR(2147483647)>",

Review comment:
       Nice!

##########
File path: flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/serde/DataTypeJsonSerdeTest.java
##########
@@ -0,0 +1,152 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.table.api.DataTypes;
+import org.apache.flink.table.api.TableConfig;
+import org.apache.flink.table.catalog.CatalogManager;
+import org.apache.flink.table.module.ModuleManager;
+import org.apache.flink.table.planner.calcite.FlinkContextImpl;
+import org.apache.flink.table.planner.calcite.FlinkTypeFactory;
+import org.apache.flink.table.planner.functions.sql.FlinkSqlOperatorTable;
+import org.apache.flink.table.types.DataType;
+import org.apache.flink.table.types.logical.LogicalType;
+import org.apache.flink.table.utils.CatalogManagerMocks;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonGenerator;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonProcessingException;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.module.SimpleModule;
+
+import org.junit.Test;
+import org.junit.runner.RunWith;
+import org.junit.runners.Parameterized;
+import org.junit.runners.Parameterized.Parameter;
+import org.junit.runners.Parameterized.Parameters;
+
+import java.io.IOException;
+import java.io.StringWriter;
+import java.util.Arrays;
+import java.util.List;
+
+import static org.assertj.core.api.Assertions.assertThat;
+
+/** Tests for {@link DataType} serialization and deserialization. */
+@RunWith(Parameterized.class)
+public class DataTypeJsonSerdeTest {
+
+    @Parameter public DataType dataType;
+
+    @Test
+    public void testDataTypeSerde() throws IOException {
+        final ObjectMapper mapper = configuredObjectMapper();
+        final String json = toJson(mapper, dataType);
+        final DataType actual = toDataType(mapper, json);
+
+        if (json.contains("children")) {
+            System.out.println();
+        }

Review comment:
       Have you forgot to remove this after an intense debugging session?

##########
File path: flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/serde/DataTypeJsonSerdeTest.java
##########
@@ -0,0 +1,152 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.table.api.DataTypes;
+import org.apache.flink.table.api.TableConfig;
+import org.apache.flink.table.catalog.CatalogManager;
+import org.apache.flink.table.module.ModuleManager;
+import org.apache.flink.table.planner.calcite.FlinkContextImpl;
+import org.apache.flink.table.planner.calcite.FlinkTypeFactory;
+import org.apache.flink.table.planner.functions.sql.FlinkSqlOperatorTable;
+import org.apache.flink.table.types.DataType;
+import org.apache.flink.table.types.logical.LogicalType;
+import org.apache.flink.table.utils.CatalogManagerMocks;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonGenerator;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonProcessingException;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.module.SimpleModule;
+
+import org.junit.Test;
+import org.junit.runner.RunWith;
+import org.junit.runners.Parameterized;
+import org.junit.runners.Parameterized.Parameter;
+import org.junit.runners.Parameterized.Parameters;

Review comment:
       Please use JUnit 5 with `ParametrizedTest` and `MethodSource`. We should not add JUnit 4 tests anymore.

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/DataTypeJsonDeserializer.java
##########
@@ -0,0 +1,191 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.annotation.Internal;
+import org.apache.flink.table.api.DataTypes;
+import org.apache.flink.table.api.TableException;
+import org.apache.flink.table.types.CollectionDataType;
+import org.apache.flink.table.types.DataType;
+import org.apache.flink.table.types.FieldsDataType;
+import org.apache.flink.table.types.KeyValueDataType;
+import org.apache.flink.table.types.logical.DistinctType;
+import org.apache.flink.table.types.logical.LogicalType;
+import org.apache.flink.table.types.logical.MapType;
+import org.apache.flink.table.types.logical.utils.LogicalTypeChecks;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonParser;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.JsonNode;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.StdDeserializer;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ArrayNode;
+
+import org.apache.commons.lang3.ClassUtils;
+
+import javax.annotation.Nullable;
+
+import java.io.IOException;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.stream.Collectors;
+import java.util.stream.IntStream;
+
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_CONVERSION_CLASS;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_ELEMENT_CLASS;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_FIELDS;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_FIELD_NAME;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_KEY_CLASS;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_TYPE;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_VALUE_CLASS;
+
+/**
+ * JSON deserializer for {@link DataType}.
+ *
+ * @see DataTypeJsonSerializer for the reverse operation
+ */
+@Internal
+public class DataTypeJsonDeserializer extends StdDeserializer<DataType> {
+
+    public DataTypeJsonDeserializer() {
+        super(DataType.class);
+    }
+
+    @Override
+    public DataType deserialize(JsonParser jsonParser, DeserializationContext ctx)
+            throws IOException {
+        final JsonNode dataTypeNode = jsonParser.readValueAsTree();
+        final SerdeContext serdeContext = SerdeContext.from(ctx);
+        return deserialize(dataTypeNode, serdeContext);
+    }
+
+    public static DataType deserialize(JsonNode dataTypeNode, SerdeContext serdeContext) {
+        if (dataTypeNode.isTextual()) {

Review comment:
       Ignore this, I see you have it in Serializer

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/DataTypeJsonDeserializer.java
##########
@@ -0,0 +1,191 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.annotation.Internal;
+import org.apache.flink.table.api.DataTypes;
+import org.apache.flink.table.api.TableException;
+import org.apache.flink.table.types.CollectionDataType;
+import org.apache.flink.table.types.DataType;
+import org.apache.flink.table.types.FieldsDataType;
+import org.apache.flink.table.types.KeyValueDataType;
+import org.apache.flink.table.types.logical.DistinctType;
+import org.apache.flink.table.types.logical.LogicalType;
+import org.apache.flink.table.types.logical.MapType;
+import org.apache.flink.table.types.logical.utils.LogicalTypeChecks;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonParser;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.JsonNode;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.StdDeserializer;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ArrayNode;
+
+import org.apache.commons.lang3.ClassUtils;
+
+import javax.annotation.Nullable;
+
+import java.io.IOException;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.stream.Collectors;
+import java.util.stream.IntStream;
+
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_CONVERSION_CLASS;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_ELEMENT_CLASS;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_FIELDS;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_FIELD_NAME;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_KEY_CLASS;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_TYPE;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_VALUE_CLASS;
+
+/**
+ * JSON deserializer for {@link DataType}.
+ *
+ * @see DataTypeJsonSerializer for the reverse operation
+ */
+@Internal
+public class DataTypeJsonDeserializer extends StdDeserializer<DataType> {
+
+    public DataTypeJsonDeserializer() {
+        super(DataType.class);
+    }
+
+    @Override
+    public DataType deserialize(JsonParser jsonParser, DeserializationContext ctx)
+            throws IOException {
+        final JsonNode dataTypeNode = jsonParser.readValueAsTree();
+        final SerdeContext serdeContext = SerdeContext.from(ctx);
+        return deserialize(dataTypeNode, serdeContext);
+    }
+
+    public static DataType deserialize(JsonNode dataTypeNode, SerdeContext serdeContext) {
+        if (dataTypeNode.isTextual()) {

Review comment:
       I see how it works here, but can you please explain it in the javadoc that we have 2 representations?

##########
File path: flink-table/flink-table-common/src/main/java/org/apache/flink/table/catalog/DataTypeFactory.java
##########
@@ -112,4 +112,23 @@
      * edges of the API.
      */
     <T> DataType createRawDataType(TypeInformation<T> typeInfo);
+
+    // --------------------------------------------------------------------------------------------
+    // LogicalType creation
+    // --------------------------------------------------------------------------------------------
+
+    /**
+     * Creates a {@link LogicalType} by a fully or partially defined name.
+     *
+     * <p>The factory will parse and resolve the name of a type to a {@link LogicalType}. This
+     * includes both built-in types and user-defined types (see {@link DistinctType} and {@link
+     * StructuredType}).
+     */
+    LogicalType createLogicalType(String name);

Review comment:
       `name` here is confusing, perhaps `typeString` as in the signature of `LogicalTypeParser` to specify that this is the serialized definition of the type?
   
   Same comment for `DataType createDataType(String name)`

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/LogicalTypeJsonSerializer.java
##########
@@ -109,349 +123,358 @@ public void serialize(
             JsonGenerator jsonGenerator,
             SerializerProvider serializerProvider)
             throws IOException {
-        if (logicalType instanceof CharType) {
-            // Zero-length character strings have no serializable string representation.
-            serializeRowType((CharType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof VarCharType) {
-            // Zero-length character strings have no serializable string representation.
-            serializeVarCharType((VarCharType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof BinaryType) {
-            // Zero-length binary strings have no serializable string representation.
-            serializeBinaryType((BinaryType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof VarBinaryType) {
-            // Zero-length binary strings have no serializable string representation.
-            serializeVarBinaryType((VarBinaryType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof SymbolType) {
-            // SymbolType does not support `asSerializableString`
-            serializeSymbolType((SymbolType<?>) logicalType, jsonGenerator);
-        } else if (logicalType instanceof TypeInformationRawType) {
-            // TypeInformationRawType does not support `asSerializableString`
-            serializeTypeInformationRawType((TypeInformationRawType<?>) logicalType, jsonGenerator);
-        } else if (logicalType instanceof StructuredType) {
-            //  StructuredType does not full support `asSerializableString`
-            serializeStructuredType((StructuredType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof DistinctType) {
-            //  DistinctType does not full support `asSerializableString`
-            serializeDistinctType((DistinctType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof TimestampType) {
-            // TimestampType does not consider `TimestampKind`
-            serializeTimestampType((TimestampType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof ZonedTimestampType) {
-            // ZonedTimestampType does not consider `TimestampKind`
-            serializeZonedTimestampType((ZonedTimestampType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof LocalZonedTimestampType) {
-            // LocalZonedTimestampType does not consider `TimestampKind`
-            serializeLocalZonedTimestampType((LocalZonedTimestampType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof RowType) {
-            serializeRowType((RowType) logicalType, jsonGenerator, serializerProvider);
-        } else if (logicalType instanceof MapType) {
-            serializeMapType((MapType) logicalType, jsonGenerator, serializerProvider);
-        } else if (logicalType instanceof ArrayType) {
-            serializeArrayType((ArrayType) logicalType, jsonGenerator, serializerProvider);
-        } else if (logicalType instanceof MultisetType) {
-            serializeMultisetType((MultisetType) logicalType, jsonGenerator, serializerProvider);
-        } else if (logicalType instanceof RawType) {
-            serializeRawType((RawType<?>) logicalType, jsonGenerator, serializerProvider);
-        } else if (logicalType instanceof UnresolvedUserDefinedType) {
-            throw new TableException(
-                    "Can not serialize an UnresolvedUserDefinedType instance. \n"
-                            + "It needs to be resolved into a proper user-defined type.\"");
-        } else {
-            jsonGenerator.writeObject(logicalType.asSerializableString());
-        }
+        final ReadableConfig config = SerdeContext.from(serializerProvider).getConfiguration();
+        final boolean serializeCatalogObjects =
+                !config.get(TableConfigOptions.PLAN_COMPILE_CATALOG_OBJECTS)
+                        .equals(CatalogPlanCompilation.IDENTIFIER);
+        serializeInternal(logicalType, jsonGenerator, serializeCatalogObjects);
     }
 
-    private void serializeRowType(
-            RowType rowType, JsonGenerator jsonGenerator, SerializerProvider serializerProvider)
+    private static void serializeInternal(
+            LogicalType logicalType, JsonGenerator jsonGenerator, boolean serializeCatalogObjects)
             throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, rowType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, rowType.isNullable());
-        List<RowType.RowField> fields = rowType.getFields();
-        jsonGenerator.writeArrayFieldStart(FIELD_NAME_FIELDS);
-        for (RowType.RowField rowField : fields) {
-            jsonGenerator.writeStartObject();
-            jsonGenerator.writeFieldName(rowField.getName());
-            serialize(rowField.getType(), jsonGenerator, serializerProvider);
-            if (rowField.getDescription().isPresent()) {
-                jsonGenerator.writeStringField(
-                        FIELD_NAME_DESCRIPTION, rowField.getDescription().get());
-            }
-            jsonGenerator.writeEndObject();
+        if (supportsCompactSerialization(logicalType, serializeCatalogObjects)) {
+            serializeTypeWithCompactSerialization(logicalType, jsonGenerator);
+        } else {
+            // fallback to generic serialization that might still use compact serialization for
+            // individual fields
+            serializeTypeWithGenericSerialization(
+                    logicalType, jsonGenerator, serializeCatalogObjects);
         }
-        jsonGenerator.writeEndArray();
-        jsonGenerator.writeEndObject();
     }
 
-    private void serializeMapType(
-            MapType mapType, JsonGenerator jsonGenerator, SerializerProvider serializerProvider)
+    // --------------------------------------------------------------------------------------------
+    // Generic Serialization
+    // --------------------------------------------------------------------------------------------
+
+    private static void serializeTypeWithGenericSerialization(
+            LogicalType logicalType, JsonGenerator jsonGenerator, boolean serializeCatalogObjects)
             throws IOException {
         jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, mapType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, mapType.isNullable());
-        jsonGenerator.writeFieldName(FIELD_NAME_KEY_TYPE);
-        serialize(mapType.getKeyType(), jsonGenerator, serializerProvider);
-        jsonGenerator.writeFieldName(FIELD_NAME_VALUE_TYPE);
-        serialize(mapType.getValueType(), jsonGenerator, serializerProvider);
+
+        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, logicalType.getTypeRoot().name());
+        if (!logicalType.isNullable()) {
+            jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, false);
+        }
+
+        switch (logicalType.getTypeRoot()) {
+            case CHAR:
+            case VARCHAR:
+            case BINARY:
+            case VARBINARY:
+                serializeZeroLengthString(jsonGenerator);
+                break;
+            case TIMESTAMP_WITHOUT_TIME_ZONE:
+                final TimestampType timestampType = (TimestampType) logicalType;
+                serializeTimestamp(
+                        timestampType.getPrecision(), timestampType.getKind(), jsonGenerator);
+                break;
+            case TIMESTAMP_WITH_TIME_ZONE:
+                final ZonedTimestampType zonedTimestampType = (ZonedTimestampType) logicalType;
+                serializeTimestamp(
+                        zonedTimestampType.getPrecision(),
+                        zonedTimestampType.getKind(),
+                        jsonGenerator);
+                break;
+            case TIMESTAMP_WITH_LOCAL_TIME_ZONE:
+                final LocalZonedTimestampType localZonedTimestampType =
+                        (LocalZonedTimestampType) logicalType;
+                serializeTimestamp(
+                        localZonedTimestampType.getPrecision(),
+                        localZonedTimestampType.getKind(),
+                        jsonGenerator);
+                break;
+            case ARRAY:
+                serializeCollection(
+                        ((ArrayType) logicalType).getElementType(),
+                        jsonGenerator,
+                        serializeCatalogObjects);
+                break;
+            case MULTISET:
+                serializeCollection(
+                        ((MultisetType) logicalType).getElementType(),
+                        jsonGenerator,
+                        serializeCatalogObjects);
+                break;
+            case MAP:
+                serializeMap((MapType) logicalType, jsonGenerator, serializeCatalogObjects);
+                break;
+            case ROW:
+                serializeRow((RowType) logicalType, jsonGenerator, serializeCatalogObjects);
+                break;
+            case DISTINCT_TYPE:
+                serializeDistinctType(
+                        (DistinctType) logicalType, jsonGenerator, serializeCatalogObjects);
+                break;
+            case STRUCTURED_TYPE:
+                serializeStructuredType(
+                        (StructuredType) logicalType, jsonGenerator, serializeCatalogObjects);
+                break;
+            case SYMBOL:
+                // type root is enough
+                break;
+            case RAW:
+                if (logicalType instanceof RawType) {
+                    serializeSpecializedRaw((RawType<?>) logicalType, jsonGenerator);
+                    break;
+                }
+                // fall through
+            default:
+                throw new ValidationException(
+                        String.format(
+                                "Unable to serialize logical type '%s'. Please check the documentation for supported types.",
+                                logicalType.asSummaryString()));
+        }
+
         jsonGenerator.writeEndObject();
     }
 
-    private void serializeArrayType(
-            ArrayType arrayType, JsonGenerator jsonGenerator, SerializerProvider serializerProvider)
-            throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, arrayType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, arrayType.isNullable());
-        jsonGenerator.writeFieldName(FIELD_NAME_ELEMENT_TYPE);
-        serialize(arrayType.getElementType(), jsonGenerator, serializerProvider);
-        jsonGenerator.writeEndObject();
+    private static void serializeZeroLengthString(JsonGenerator jsonGenerator) throws IOException {
+        jsonGenerator.writeNumberField(FIELD_NAME_LENGTH, 0);
     }
 
-    private void serializeMultisetType(
-            MultisetType multisetType,
-            JsonGenerator jsonGenerator,
-            SerializerProvider serializerProvider)
-            throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, multisetType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, multisetType.isNullable());
-        jsonGenerator.writeFieldName(FIELD_NAME_ELEMENT_TYPE);
-        serialize(multisetType.getElementType(), jsonGenerator, serializerProvider);
-        jsonGenerator.writeEndObject();
+    private static void serializeTimestamp(
+            int precision, TimestampKind kind, JsonGenerator jsonGenerator) throws IOException {
+        jsonGenerator.writeNumberField(FIELD_NAME_PRECISION, precision);
+        jsonGenerator.writeObjectField(FIELD_NAME_TIMESTAMP_KIND, kind);
     }
 
-    private void serializeRowType(CharType charType, JsonGenerator jsonGenerator)
+    private static void serializeCollection(
+            LogicalType elementType, JsonGenerator jsonGenerator, boolean serializeCatalogObjects)
             throws IOException {
-        // Zero-length character strings have no serializable string representation.
-        if (charType.getLength() == CharType.EMPTY_LITERAL_LENGTH) {
-            jsonGenerator.writeStartObject();
-            jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, charType.getTypeRoot().name());
-            jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, charType.isNullable());
-            jsonGenerator.writeNumberField(FIELD_NAME_LENGTH, 0);
-            jsonGenerator.writeEndObject();
-        } else {
-            jsonGenerator.writeObject(charType.asSerializableString());
-        }
+        jsonGenerator.writeFieldName(FIELD_NAME_ELEMENT_TYPE);
+        serializeInternal(elementType, jsonGenerator, serializeCatalogObjects);
     }
 
-    private void serializeVarCharType(VarCharType varCharType, JsonGenerator jsonGenerator)
+    private static void serializeMap(
+            MapType mapType, JsonGenerator jsonGenerator, boolean serializeCatalogObjects)
             throws IOException {
-        // Zero-length character strings have no serializable string representation.
-        if (varCharType.getLength() == VarCharType.EMPTY_LITERAL_LENGTH) {
-            jsonGenerator.writeStartObject();
-            jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, varCharType.getTypeRoot().name());
-            jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, varCharType.isNullable());
-            jsonGenerator.writeNumberField(FIELD_NAME_LENGTH, 0);
-            jsonGenerator.writeEndObject();
-        } else {
-            jsonGenerator.writeObject(varCharType.asSerializableString());
-        }
+        jsonGenerator.writeFieldName(FIELD_NAME_KEY_TYPE);
+        serializeInternal(mapType.getKeyType(), jsonGenerator, serializeCatalogObjects);
+        jsonGenerator.writeFieldName(FIELD_NAME_VALUE_TYPE);
+        serializeInternal(mapType.getValueType(), jsonGenerator, serializeCatalogObjects);
     }
 
-    private void serializeBinaryType(BinaryType binaryType, JsonGenerator jsonGenerator)
+    private static void serializeRow(
+            RowType rowType, JsonGenerator jsonGenerator, boolean serializeCatalogObjects)
             throws IOException {
-        // Zero-length binary strings have no serializable string representation.
-        if (binaryType.getLength() == BinaryType.EMPTY_LITERAL_LENGTH) {
+        jsonGenerator.writeArrayFieldStart(FIELD_NAME_FIELDS);
+        for (RowType.RowField rowField : rowType.getFields()) {
             jsonGenerator.writeStartObject();
-            jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, binaryType.getTypeRoot().name());
-            jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, binaryType.isNullable());
-            jsonGenerator.writeNumberField(FIELD_NAME_LENGTH, 0);
+            jsonGenerator.writeStringField(FIELD_NAME_FIELD_NAME, rowField.getName());
+            jsonGenerator.writeFieldName(FIELD_NAME_FIELD_TYPE);
+            serializeInternal(rowField.getType(), jsonGenerator, serializeCatalogObjects);
+            if (rowField.getDescription().isPresent()) {
+                jsonGenerator.writeStringField(
+                        FIELD_NAME_FIELD_DESCRIPTION, rowField.getDescription().get());
+            }
             jsonGenerator.writeEndObject();
-        } else {
-            jsonGenerator.writeObject(binaryType.asSerializableString());
         }
+        jsonGenerator.writeEndArray();
     }
 
-    private void serializeVarBinaryType(VarBinaryType varBinaryType, JsonGenerator jsonGenerator)
+    private static void serializeDistinctType(
+            DistinctType distinctType, JsonGenerator jsonGenerator, boolean serializeCatalogObjects)
             throws IOException {
-        // Zero-length binary strings have no serializable string representation.
-        if (varBinaryType.getLength() == VarBinaryType.EMPTY_LITERAL_LENGTH) {
-            jsonGenerator.writeStartObject();
+        jsonGenerator.writeObjectField(
+                FIELD_NAME_OBJECT_IDENTIFIER,
+                distinctType.getObjectIdentifier().orElseThrow(IllegalStateException::new));
+        if (distinctType.getDescription().isPresent()) {
             jsonGenerator.writeStringField(
-                    FIELD_NAME_TYPE_NAME, varBinaryType.getTypeRoot().name());
-            jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, varBinaryType.isNullable());
-            jsonGenerator.writeNumberField(FIELD_NAME_LENGTH, 0);
-            jsonGenerator.writeEndObject();
-        } else {
-            jsonGenerator.writeObject(varBinaryType.asSerializableString());
+                    FIELD_NAME_FIELD_DESCRIPTION, distinctType.getDescription().get());
         }
+        jsonGenerator.writeFieldName(FIELD_NAME_SOURCE_TYPE);
+        serializeInternal(distinctType.getSourceType(), jsonGenerator, serializeCatalogObjects);
     }
 
-    private void serializeSymbolType(SymbolType<?> symbolType, JsonGenerator jsonGenerator)
-            throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, symbolType.isNullable());
-        jsonGenerator.writeStringField(
-                FIELD_NAME_SYMBOL_CLASS, symbolType.getDefaultConversion().getName());
-        jsonGenerator.writeEndObject();
-    }
-
-    private void serializeTypeInformationRawType(
-            TypeInformationRawType<?> rawType, JsonGenerator jsonGenerator) throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, rawType.isNullable());
-        jsonGenerator.writeStringField(
-                FIELD_NAME_TYPE_INFO,
-                EncodingUtils.encodeObjectToString(rawType.getTypeInformation()));
-        jsonGenerator.writeEndObject();
-    }
-
-    private void serializeStructuredType(StructuredType structuredType, JsonGenerator jsonGenerator)
+    private static void serializeStructuredType(
+            StructuredType structuredType,
+            JsonGenerator jsonGenerator,
+            boolean serializeCatalogObjects)
             throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(
-                FIELD_NAME_TYPE_NAME, LogicalTypeRoot.STRUCTURED_TYPE.name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, structuredType.isNullable());
         if (structuredType.getObjectIdentifier().isPresent()) {
             jsonGenerator.writeObjectField(
-                    FIELD_NAME_IDENTIFIER, structuredType.getObjectIdentifier().get());
+                    FIELD_NAME_OBJECT_IDENTIFIER, structuredType.getObjectIdentifier().get());
         }
-        if (structuredType.getImplementationClass().isPresent()) {
+        if (structuredType.getDescription().isPresent()) {
             jsonGenerator.writeStringField(
-                    FIELD_NAME_IMPLEMENTATION_CLASS,
-                    structuredType.getImplementationClass().get().getName());
+                    FIELD_NAME_DESCRIPTION, structuredType.getDescription().get());
+        }
+        if (structuredType.getImplementationClass().isPresent()) {
+            jsonGenerator.writeObjectField(
+                    FIELD_NAME_IMPLEMENTATION_CLASS, structuredType.getImplementationClass().get());
         }
         jsonGenerator.writeFieldName(FIELD_NAME_ATTRIBUTES);
         jsonGenerator.writeStartArray();
-        for (StructuredType.StructuredAttribute attribute : structuredType.getAttributes()) {
+        for (StructuredAttribute attribute : structuredType.getAttributes()) {
             jsonGenerator.writeStartObject();
-            jsonGenerator.writeStringField(FIELD_NAME_NAME, attribute.getName());
-            jsonGenerator.writeObjectField(FIELD_NAME_LOGICAL_TYPE, attribute.getType());
+            jsonGenerator.writeStringField(FIELD_NAME_ATTRIBUTE_NAME, attribute.getName());
+            jsonGenerator.writeFieldName(FIELD_NAME_ATTRIBUTE_TYPE);
+            serializeInternal(attribute.getType(), jsonGenerator, serializeCatalogObjects);
             if (attribute.getDescription().isPresent()) {
                 jsonGenerator.writeStringField(
-                        FIELD_NAME_DESCRIPTION, attribute.getDescription().get());
+                        FIELD_NAME_ATTRIBUTE_DESCRIPTION, attribute.getDescription().get());
             }
             jsonGenerator.writeEndObject();
         }
         jsonGenerator.writeEndArray();
-        jsonGenerator.writeBooleanField(FIELD_NAME_FINAL, structuredType.isFinal());
-        jsonGenerator.writeBooleanField(FIELD_NAME_INSTANTIABLE, structuredType.isInstantiable());
-        jsonGenerator.writeStringField(
-                FIELD_NAME_COMPARISON, structuredType.getComparison().name());
-        if (structuredType.getSuperType().isPresent()) {
-            jsonGenerator.writeObjectField(
-                    FIELD_NAME_SUPPER_TYPE, structuredType.getSuperType().get());
+        if (!structuredType.isFinal()) {
+            jsonGenerator.writeBooleanField(FIELD_NAME_FINAL, false);
         }
-        if (structuredType.getDescription().isPresent()) {
+        if (!structuredType.isInstantiable()) {
+            jsonGenerator.writeBooleanField(FIELD_NAME_INSTANTIABLE, false);
+        }
+        if (structuredType.getComparison() != StructuredComparison.NONE) {
             jsonGenerator.writeStringField(
-                    FIELD_NAME_DESCRIPTION, structuredType.getDescription().get());
+                    FIELD_NAME_COMPARISON, structuredType.getComparison().name());
+        }
+        if (structuredType.getSuperType().isPresent()) {
+            jsonGenerator.writeObjectField(
+                    FIELD_NAME_SUPER_TYPE, structuredType.getSuperType().get());
         }
-        jsonGenerator.writeEndObject();
     }
 
-    private void serializeDistinctType(DistinctType distinctType, JsonGenerator jsonGenerator)
+    private static void serializeSpecializedRaw(RawType<?> rawType, JsonGenerator jsonGenerator)
             throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, LogicalTypeRoot.DISTINCT_TYPE.name());
-        Preconditions.checkArgument(distinctType.getObjectIdentifier().isPresent());
-        jsonGenerator.writeObjectField(
-                FIELD_NAME_IDENTIFIER, distinctType.getObjectIdentifier().get());
-        jsonGenerator.writeObjectField(FIELD_NAME_SOURCE_TYPE, distinctType.getSourceType());
-        if (distinctType.getDescription().isPresent()) {
+        jsonGenerator.writeStringField(FIELD_NAME_CLASS, rawType.getOriginatingClass().getName());
+        final TypeSerializer<?> serializer = rawType.getTypeSerializer();
+        if (serializer.equals(NullSerializer.INSTANCE)) {
             jsonGenerator.writeStringField(
-                    FIELD_NAME_DESCRIPTION, distinctType.getDescription().get());
+                    FIELD_NAME_SPECIAL_SERIALIZER, FIELD_VALUE_EXTERNAL_SERIALIZER_NULL);
+        } else if (serializer instanceof ExternalSerializer) {
+            final ExternalSerializer<?, ?> externalSerializer =
+                    (ExternalSerializer<?, ?>) rawType.getTypeSerializer();
+            if (externalSerializer.isInternalInput()) {
+                throw new TableException(
+                        "Asymmetric external serializers are currently not supported. "
+                                + "The input must not be internal if the output is external.");
+            }
+            jsonGenerator.writeObjectField(
+                    FIELD_NAME_EXTERNAL_DATA_TYPE, externalSerializer.getDataType());
+        } else {
+            throw new TableException("Unsupported special case for RAW type.");
         }
-        jsonGenerator.writeEndObject();
     }
 
-    private void serializeTimestampType(TimestampType timestampType, JsonGenerator jsonGenerator)
-            throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, timestampType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, timestampType.isNullable());
-        jsonGenerator.writeNumberField(FIELD_NAME_PRECISION, timestampType.getPrecision());
-        jsonGenerator.writeObjectField(FIELD_NAME_TIMESTAMP_KIND, timestampType.getKind());
-        jsonGenerator.writeEndObject();
-    }
+    // --------------------------------------------------------------------------------------------
+    // Compact Serialization
+    // --------------------------------------------------------------------------------------------
 
-    private void serializeZonedTimestampType(
-            ZonedTimestampType timestampType, JsonGenerator jsonGenerator) throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, timestampType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, timestampType.isNullable());
-        jsonGenerator.writeNumberField(FIELD_NAME_PRECISION, timestampType.getPrecision());
-        jsonGenerator.writeObjectField(FIELD_NAME_TIMESTAMP_KIND, timestampType.getKind());
-        jsonGenerator.writeEndObject();
+    private static boolean supportsCompactSerialization(
+            LogicalType logicalType, boolean serializeCatalogObjects) {
+        return logicalType.accept(new CompactSerializationChecker(serializeCatalogObjects));
     }
 
-    private void serializeLocalZonedTimestampType(
-            LocalZonedTimestampType timestampType, JsonGenerator jsonGenerator) throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, timestampType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, timestampType.isNullable());
-        jsonGenerator.writeNumberField(FIELD_NAME_PRECISION, timestampType.getPrecision());
-        jsonGenerator.writeObjectField(FIELD_NAME_TIMESTAMP_KIND, timestampType.getKind());
-        jsonGenerator.writeEndObject();
+    private static void serializeTypeWithCompactSerialization(
+            LogicalType logicalType, JsonGenerator jsonGenerator) throws IOException {
+        final String compactString = logicalType.asSerializableString();
+        jsonGenerator.writeString(compactString);
     }
 
-    @SuppressWarnings("rawtypes")
-    private void serializeRawType(
-            RawType<?> rawType, JsonGenerator jsonGenerator, SerializerProvider serializerProvider)
-            throws IOException {
-        TypeSerializer<?> typeSer = rawType.getTypeSerializer();
-        if (typeSer instanceof ExternalSerializer) {
-            ExternalSerializer externalSer = (ExternalSerializer) typeSer;
-            // Currently, ExternalSerializer with `isInternalInput=false` will be serialized,
-            // Once `isInternalInput=true` needs to be serialized, we can add individual field in
-            // the json to support it, and the new json plan is compatible with the previous one.
-            if (externalSer.isInternalInput()) {
-                throw new TableException(
-                        "ExternalSerializer with `isInternalInput=true` is not supported.");
-            }
-            DataType dataType = externalSer.getDataType();
-            boolean isMapView = DataViewUtils.isMapViewDataType(dataType);
-            boolean isListView = DataViewUtils.isListViewDataType(dataType);
-            if (isMapView || isListView) {
-                jsonGenerator.writeStartObject();
-                jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, LogicalTypeRoot.RAW.name());
-                jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, rawType.isNullable());
-                if (isMapView) {
-                    jsonGenerator.writeStringField(
-                            FIELD_NAME_DATA_VIEW_CLASS, MapView.class.getName());
-                    KeyValueDataType keyValueDataType =
-                            DataViewUtils.extractKeyValueDataTypeForMapView(dataType);
-                    serializeDataTypeForDataView(
-                            FIELD_NAME_KEY_TYPE,
-                            keyValueDataType.getKeyDataType(),
-                            jsonGenerator,
-                            serializerProvider);
-                    serializeDataTypeForDataView(
-                            FIELD_NAME_VALUE_TYPE,
-                            keyValueDataType.getValueDataType(),
-                            jsonGenerator,
-                            serializerProvider);
-                } else {
-                    jsonGenerator.writeStringField(
-                            FIELD_NAME_DATA_VIEW_CLASS, ListView.class.getName());
-                    DataType elementType =
-                            DataViewUtils.extractElementDataTypeForListView(dataType);
-                    serializeDataTypeForDataView(
-                            FIELD_NAME_ELEMENT_TYPE,
-                            elementType,
-                            jsonGenerator,
-                            serializerProvider);
-                }
-                jsonGenerator.writeEndObject();
-                return;
-            }
+    /**
+     * Checks whether the given type can be serialized as a compact string created from {@link
+     * LogicalType#asSerializableString()}.
+     */
+    private static class CompactSerializationChecker extends LogicalTypeDefaultVisitor<Boolean> {
+
+        private final boolean serializeCatalogObjects;
+
+        CompactSerializationChecker(boolean serializeCatalogObjects) {
+            this.serializeCatalogObjects = serializeCatalogObjects;
         }
 
-        jsonGenerator.writeObject(rawType.asSerializableString());
-    }
+        @Override
+        public Boolean visit(CharType charType) {
+            return charType.getLength() > 0;
+        }
 
-    private void serializeDataTypeForDataView(
-            String key,
-            DataType dataType,
-            JsonGenerator jsonGenerator,
-            SerializerProvider serializerProvider)
-            throws IOException {
-        jsonGenerator.writeFieldName(key);
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeBooleanField(
-                FIELD_NAME_IS_INTERNAL_TYPE, DataTypeUtils.isInternal(dataType));
-        jsonGenerator.writeFieldName(FIELD_NAME_TYPE_NAME);
-        LogicalType logicalType = LogicalTypeDataTypeConverter.toLogicalType(dataType);
-        serialize(logicalType, jsonGenerator, serializerProvider);
-        jsonGenerator.writeEndObject();
+        @Override
+        public Boolean visit(VarCharType varCharType) {
+            return varCharType.getLength() > 0;
+        }
+
+        @Override
+        public Boolean visit(BinaryType binaryType) {
+            return binaryType.getLength() > 0;
+        }
+
+        @Override
+        public Boolean visit(VarBinaryType varBinaryType) {
+            return varBinaryType.getLength() > 0;
+        }
+
+        @Override
+        public Boolean visit(TimestampType timestampType) {
+            return timestampType.getKind() == TimestampKind.REGULAR;
+        }
+
+        @Override
+        public Boolean visit(ZonedTimestampType zonedTimestampType) {
+            return zonedTimestampType.getKind() == TimestampKind.REGULAR;
+        }
+
+        @Override
+        public Boolean visit(LocalZonedTimestampType localZonedTimestampType) {
+            return localZonedTimestampType.getKind() == TimestampKind.REGULAR;
+        }
+
+        @Override
+        public Boolean visit(DistinctType distinctType) {
+            // catalog-based distinct types are always string serializable,
+            // however, depending on the configuration, we serialize the entire type
+            return !serializeCatalogObjects;
+        }
+
+        @Override
+        public Boolean visit(StructuredType structuredType) {
+            // catalog-based structured types are always string serializable,
+            // however, depending on the configuration, we serialize the entire type
+            return structuredType.getObjectIdentifier().isPresent() && !serializeCatalogObjects;

Review comment:
       In case the structured type doesn't have an object identifier, aren't we able to compat serialize structured types which children are compat serializable? In the same fashion we do for row.

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/LogicalTypeJsonSerializer.java
##########
@@ -109,349 +123,358 @@ public void serialize(
             JsonGenerator jsonGenerator,
             SerializerProvider serializerProvider)
             throws IOException {
-        if (logicalType instanceof CharType) {
-            // Zero-length character strings have no serializable string representation.
-            serializeRowType((CharType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof VarCharType) {
-            // Zero-length character strings have no serializable string representation.
-            serializeVarCharType((VarCharType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof BinaryType) {
-            // Zero-length binary strings have no serializable string representation.
-            serializeBinaryType((BinaryType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof VarBinaryType) {
-            // Zero-length binary strings have no serializable string representation.
-            serializeVarBinaryType((VarBinaryType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof SymbolType) {
-            // SymbolType does not support `asSerializableString`
-            serializeSymbolType((SymbolType<?>) logicalType, jsonGenerator);
-        } else if (logicalType instanceof TypeInformationRawType) {
-            // TypeInformationRawType does not support `asSerializableString`
-            serializeTypeInformationRawType((TypeInformationRawType<?>) logicalType, jsonGenerator);
-        } else if (logicalType instanceof StructuredType) {
-            //  StructuredType does not full support `asSerializableString`
-            serializeStructuredType((StructuredType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof DistinctType) {
-            //  DistinctType does not full support `asSerializableString`
-            serializeDistinctType((DistinctType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof TimestampType) {
-            // TimestampType does not consider `TimestampKind`
-            serializeTimestampType((TimestampType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof ZonedTimestampType) {
-            // ZonedTimestampType does not consider `TimestampKind`
-            serializeZonedTimestampType((ZonedTimestampType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof LocalZonedTimestampType) {
-            // LocalZonedTimestampType does not consider `TimestampKind`
-            serializeLocalZonedTimestampType((LocalZonedTimestampType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof RowType) {
-            serializeRowType((RowType) logicalType, jsonGenerator, serializerProvider);
-        } else if (logicalType instanceof MapType) {
-            serializeMapType((MapType) logicalType, jsonGenerator, serializerProvider);
-        } else if (logicalType instanceof ArrayType) {
-            serializeArrayType((ArrayType) logicalType, jsonGenerator, serializerProvider);
-        } else if (logicalType instanceof MultisetType) {
-            serializeMultisetType((MultisetType) logicalType, jsonGenerator, serializerProvider);
-        } else if (logicalType instanceof RawType) {
-            serializeRawType((RawType<?>) logicalType, jsonGenerator, serializerProvider);
-        } else if (logicalType instanceof UnresolvedUserDefinedType) {
-            throw new TableException(
-                    "Can not serialize an UnresolvedUserDefinedType instance. \n"
-                            + "It needs to be resolved into a proper user-defined type.\"");
-        } else {
-            jsonGenerator.writeObject(logicalType.asSerializableString());
-        }
+        final ReadableConfig config = SerdeContext.from(serializerProvider).getConfiguration();
+        final boolean serializeCatalogObjects =
+                !config.get(TableConfigOptions.PLAN_COMPILE_CATALOG_OBJECTS)
+                        .equals(CatalogPlanCompilation.IDENTIFIER);
+        serializeInternal(logicalType, jsonGenerator, serializeCatalogObjects);
     }
 
-    private void serializeRowType(
-            RowType rowType, JsonGenerator jsonGenerator, SerializerProvider serializerProvider)
+    private static void serializeInternal(
+            LogicalType logicalType, JsonGenerator jsonGenerator, boolean serializeCatalogObjects)
             throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, rowType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, rowType.isNullable());
-        List<RowType.RowField> fields = rowType.getFields();
-        jsonGenerator.writeArrayFieldStart(FIELD_NAME_FIELDS);
-        for (RowType.RowField rowField : fields) {
-            jsonGenerator.writeStartObject();
-            jsonGenerator.writeFieldName(rowField.getName());
-            serialize(rowField.getType(), jsonGenerator, serializerProvider);
-            if (rowField.getDescription().isPresent()) {
-                jsonGenerator.writeStringField(
-                        FIELD_NAME_DESCRIPTION, rowField.getDescription().get());
-            }
-            jsonGenerator.writeEndObject();
+        if (supportsCompactSerialization(logicalType, serializeCatalogObjects)) {
+            serializeTypeWithCompactSerialization(logicalType, jsonGenerator);
+        } else {
+            // fallback to generic serialization that might still use compact serialization for
+            // individual fields
+            serializeTypeWithGenericSerialization(
+                    logicalType, jsonGenerator, serializeCatalogObjects);
         }
-        jsonGenerator.writeEndArray();
-        jsonGenerator.writeEndObject();
     }
 
-    private void serializeMapType(
-            MapType mapType, JsonGenerator jsonGenerator, SerializerProvider serializerProvider)
+    // --------------------------------------------------------------------------------------------
+    // Generic Serialization
+    // --------------------------------------------------------------------------------------------
+
+    private static void serializeTypeWithGenericSerialization(
+            LogicalType logicalType, JsonGenerator jsonGenerator, boolean serializeCatalogObjects)
             throws IOException {
         jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, mapType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, mapType.isNullable());
-        jsonGenerator.writeFieldName(FIELD_NAME_KEY_TYPE);
-        serialize(mapType.getKeyType(), jsonGenerator, serializerProvider);
-        jsonGenerator.writeFieldName(FIELD_NAME_VALUE_TYPE);
-        serialize(mapType.getValueType(), jsonGenerator, serializerProvider);
+
+        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, logicalType.getTypeRoot().name());
+        if (!logicalType.isNullable()) {
+            jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, false);
+        }
+
+        switch (logicalType.getTypeRoot()) {
+            case CHAR:
+            case VARCHAR:
+            case BINARY:
+            case VARBINARY:
+                serializeZeroLengthString(jsonGenerator);
+                break;
+            case TIMESTAMP_WITHOUT_TIME_ZONE:
+                final TimestampType timestampType = (TimestampType) logicalType;
+                serializeTimestamp(
+                        timestampType.getPrecision(), timestampType.getKind(), jsonGenerator);
+                break;
+            case TIMESTAMP_WITH_TIME_ZONE:
+                final ZonedTimestampType zonedTimestampType = (ZonedTimestampType) logicalType;
+                serializeTimestamp(
+                        zonedTimestampType.getPrecision(),
+                        zonedTimestampType.getKind(),
+                        jsonGenerator);
+                break;
+            case TIMESTAMP_WITH_LOCAL_TIME_ZONE:
+                final LocalZonedTimestampType localZonedTimestampType =
+                        (LocalZonedTimestampType) logicalType;
+                serializeTimestamp(
+                        localZonedTimestampType.getPrecision(),
+                        localZonedTimestampType.getKind(),
+                        jsonGenerator);
+                break;
+            case ARRAY:
+                serializeCollection(
+                        ((ArrayType) logicalType).getElementType(),
+                        jsonGenerator,
+                        serializeCatalogObjects);
+                break;
+            case MULTISET:
+                serializeCollection(
+                        ((MultisetType) logicalType).getElementType(),
+                        jsonGenerator,
+                        serializeCatalogObjects);
+                break;
+            case MAP:
+                serializeMap((MapType) logicalType, jsonGenerator, serializeCatalogObjects);
+                break;
+            case ROW:
+                serializeRow((RowType) logicalType, jsonGenerator, serializeCatalogObjects);
+                break;
+            case DISTINCT_TYPE:
+                serializeDistinctType(
+                        (DistinctType) logicalType, jsonGenerator, serializeCatalogObjects);
+                break;
+            case STRUCTURED_TYPE:
+                serializeStructuredType(
+                        (StructuredType) logicalType, jsonGenerator, serializeCatalogObjects);
+                break;
+            case SYMBOL:
+                // type root is enough
+                break;
+            case RAW:
+                if (logicalType instanceof RawType) {
+                    serializeSpecializedRaw((RawType<?>) logicalType, jsonGenerator);
+                    break;
+                }
+                // fall through
+            default:
+                throw new ValidationException(
+                        String.format(
+                                "Unable to serialize logical type '%s'. Please check the documentation for supported types.",
+                                logicalType.asSummaryString()));
+        }
+
         jsonGenerator.writeEndObject();
     }
 
-    private void serializeArrayType(
-            ArrayType arrayType, JsonGenerator jsonGenerator, SerializerProvider serializerProvider)
-            throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, arrayType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, arrayType.isNullable());
-        jsonGenerator.writeFieldName(FIELD_NAME_ELEMENT_TYPE);
-        serialize(arrayType.getElementType(), jsonGenerator, serializerProvider);
-        jsonGenerator.writeEndObject();
+    private static void serializeZeroLengthString(JsonGenerator jsonGenerator) throws IOException {
+        jsonGenerator.writeNumberField(FIELD_NAME_LENGTH, 0);
     }
 
-    private void serializeMultisetType(
-            MultisetType multisetType,
-            JsonGenerator jsonGenerator,
-            SerializerProvider serializerProvider)
-            throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, multisetType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, multisetType.isNullable());
-        jsonGenerator.writeFieldName(FIELD_NAME_ELEMENT_TYPE);
-        serialize(multisetType.getElementType(), jsonGenerator, serializerProvider);
-        jsonGenerator.writeEndObject();
+    private static void serializeTimestamp(
+            int precision, TimestampKind kind, JsonGenerator jsonGenerator) throws IOException {
+        jsonGenerator.writeNumberField(FIELD_NAME_PRECISION, precision);
+        jsonGenerator.writeObjectField(FIELD_NAME_TIMESTAMP_KIND, kind);
     }
 
-    private void serializeRowType(CharType charType, JsonGenerator jsonGenerator)
+    private static void serializeCollection(
+            LogicalType elementType, JsonGenerator jsonGenerator, boolean serializeCatalogObjects)
             throws IOException {
-        // Zero-length character strings have no serializable string representation.
-        if (charType.getLength() == CharType.EMPTY_LITERAL_LENGTH) {
-            jsonGenerator.writeStartObject();
-            jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, charType.getTypeRoot().name());
-            jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, charType.isNullable());
-            jsonGenerator.writeNumberField(FIELD_NAME_LENGTH, 0);
-            jsonGenerator.writeEndObject();
-        } else {
-            jsonGenerator.writeObject(charType.asSerializableString());
-        }
+        jsonGenerator.writeFieldName(FIELD_NAME_ELEMENT_TYPE);
+        serializeInternal(elementType, jsonGenerator, serializeCatalogObjects);
     }
 
-    private void serializeVarCharType(VarCharType varCharType, JsonGenerator jsonGenerator)
+    private static void serializeMap(
+            MapType mapType, JsonGenerator jsonGenerator, boolean serializeCatalogObjects)
             throws IOException {
-        // Zero-length character strings have no serializable string representation.
-        if (varCharType.getLength() == VarCharType.EMPTY_LITERAL_LENGTH) {
-            jsonGenerator.writeStartObject();
-            jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, varCharType.getTypeRoot().name());
-            jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, varCharType.isNullable());
-            jsonGenerator.writeNumberField(FIELD_NAME_LENGTH, 0);
-            jsonGenerator.writeEndObject();
-        } else {
-            jsonGenerator.writeObject(varCharType.asSerializableString());
-        }
+        jsonGenerator.writeFieldName(FIELD_NAME_KEY_TYPE);
+        serializeInternal(mapType.getKeyType(), jsonGenerator, serializeCatalogObjects);
+        jsonGenerator.writeFieldName(FIELD_NAME_VALUE_TYPE);
+        serializeInternal(mapType.getValueType(), jsonGenerator, serializeCatalogObjects);
     }
 
-    private void serializeBinaryType(BinaryType binaryType, JsonGenerator jsonGenerator)
+    private static void serializeRow(
+            RowType rowType, JsonGenerator jsonGenerator, boolean serializeCatalogObjects)
             throws IOException {
-        // Zero-length binary strings have no serializable string representation.
-        if (binaryType.getLength() == BinaryType.EMPTY_LITERAL_LENGTH) {
+        jsonGenerator.writeArrayFieldStart(FIELD_NAME_FIELDS);
+        for (RowType.RowField rowField : rowType.getFields()) {
             jsonGenerator.writeStartObject();
-            jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, binaryType.getTypeRoot().name());
-            jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, binaryType.isNullable());
-            jsonGenerator.writeNumberField(FIELD_NAME_LENGTH, 0);
+            jsonGenerator.writeStringField(FIELD_NAME_FIELD_NAME, rowField.getName());
+            jsonGenerator.writeFieldName(FIELD_NAME_FIELD_TYPE);
+            serializeInternal(rowField.getType(), jsonGenerator, serializeCatalogObjects);
+            if (rowField.getDescription().isPresent()) {
+                jsonGenerator.writeStringField(
+                        FIELD_NAME_FIELD_DESCRIPTION, rowField.getDescription().get());
+            }
             jsonGenerator.writeEndObject();
-        } else {
-            jsonGenerator.writeObject(binaryType.asSerializableString());
         }
+        jsonGenerator.writeEndArray();
     }
 
-    private void serializeVarBinaryType(VarBinaryType varBinaryType, JsonGenerator jsonGenerator)
+    private static void serializeDistinctType(
+            DistinctType distinctType, JsonGenerator jsonGenerator, boolean serializeCatalogObjects)
             throws IOException {
-        // Zero-length binary strings have no serializable string representation.
-        if (varBinaryType.getLength() == VarBinaryType.EMPTY_LITERAL_LENGTH) {
-            jsonGenerator.writeStartObject();
+        jsonGenerator.writeObjectField(
+                FIELD_NAME_OBJECT_IDENTIFIER,
+                distinctType.getObjectIdentifier().orElseThrow(IllegalStateException::new));
+        if (distinctType.getDescription().isPresent()) {
             jsonGenerator.writeStringField(
-                    FIELD_NAME_TYPE_NAME, varBinaryType.getTypeRoot().name());
-            jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, varBinaryType.isNullable());
-            jsonGenerator.writeNumberField(FIELD_NAME_LENGTH, 0);
-            jsonGenerator.writeEndObject();
-        } else {
-            jsonGenerator.writeObject(varBinaryType.asSerializableString());
+                    FIELD_NAME_FIELD_DESCRIPTION, distinctType.getDescription().get());
         }
+        jsonGenerator.writeFieldName(FIELD_NAME_SOURCE_TYPE);
+        serializeInternal(distinctType.getSourceType(), jsonGenerator, serializeCatalogObjects);
     }
 
-    private void serializeSymbolType(SymbolType<?> symbolType, JsonGenerator jsonGenerator)
-            throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, symbolType.isNullable());
-        jsonGenerator.writeStringField(
-                FIELD_NAME_SYMBOL_CLASS, symbolType.getDefaultConversion().getName());
-        jsonGenerator.writeEndObject();
-    }
-
-    private void serializeTypeInformationRawType(
-            TypeInformationRawType<?> rawType, JsonGenerator jsonGenerator) throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, rawType.isNullable());
-        jsonGenerator.writeStringField(
-                FIELD_NAME_TYPE_INFO,
-                EncodingUtils.encodeObjectToString(rawType.getTypeInformation()));
-        jsonGenerator.writeEndObject();
-    }
-
-    private void serializeStructuredType(StructuredType structuredType, JsonGenerator jsonGenerator)
+    private static void serializeStructuredType(
+            StructuredType structuredType,
+            JsonGenerator jsonGenerator,
+            boolean serializeCatalogObjects)
             throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(
-                FIELD_NAME_TYPE_NAME, LogicalTypeRoot.STRUCTURED_TYPE.name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, structuredType.isNullable());
         if (structuredType.getObjectIdentifier().isPresent()) {
             jsonGenerator.writeObjectField(
-                    FIELD_NAME_IDENTIFIER, structuredType.getObjectIdentifier().get());
+                    FIELD_NAME_OBJECT_IDENTIFIER, structuredType.getObjectIdentifier().get());
         }
-        if (structuredType.getImplementationClass().isPresent()) {
+        if (structuredType.getDescription().isPresent()) {
             jsonGenerator.writeStringField(
-                    FIELD_NAME_IMPLEMENTATION_CLASS,
-                    structuredType.getImplementationClass().get().getName());
+                    FIELD_NAME_DESCRIPTION, structuredType.getDescription().get());
+        }
+        if (structuredType.getImplementationClass().isPresent()) {
+            jsonGenerator.writeObjectField(
+                    FIELD_NAME_IMPLEMENTATION_CLASS, structuredType.getImplementationClass().get());
         }
         jsonGenerator.writeFieldName(FIELD_NAME_ATTRIBUTES);
         jsonGenerator.writeStartArray();
-        for (StructuredType.StructuredAttribute attribute : structuredType.getAttributes()) {
+        for (StructuredAttribute attribute : structuredType.getAttributes()) {
             jsonGenerator.writeStartObject();
-            jsonGenerator.writeStringField(FIELD_NAME_NAME, attribute.getName());
-            jsonGenerator.writeObjectField(FIELD_NAME_LOGICAL_TYPE, attribute.getType());
+            jsonGenerator.writeStringField(FIELD_NAME_ATTRIBUTE_NAME, attribute.getName());
+            jsonGenerator.writeFieldName(FIELD_NAME_ATTRIBUTE_TYPE);
+            serializeInternal(attribute.getType(), jsonGenerator, serializeCatalogObjects);
             if (attribute.getDescription().isPresent()) {
                 jsonGenerator.writeStringField(
-                        FIELD_NAME_DESCRIPTION, attribute.getDescription().get());
+                        FIELD_NAME_ATTRIBUTE_DESCRIPTION, attribute.getDescription().get());
             }
             jsonGenerator.writeEndObject();
         }
         jsonGenerator.writeEndArray();
-        jsonGenerator.writeBooleanField(FIELD_NAME_FINAL, structuredType.isFinal());
-        jsonGenerator.writeBooleanField(FIELD_NAME_INSTANTIABLE, structuredType.isInstantiable());
-        jsonGenerator.writeStringField(
-                FIELD_NAME_COMPARISON, structuredType.getComparison().name());
-        if (structuredType.getSuperType().isPresent()) {
-            jsonGenerator.writeObjectField(
-                    FIELD_NAME_SUPPER_TYPE, structuredType.getSuperType().get());
+        if (!structuredType.isFinal()) {
+            jsonGenerator.writeBooleanField(FIELD_NAME_FINAL, false);
         }
-        if (structuredType.getDescription().isPresent()) {
+        if (!structuredType.isInstantiable()) {
+            jsonGenerator.writeBooleanField(FIELD_NAME_INSTANTIABLE, false);
+        }
+        if (structuredType.getComparison() != StructuredComparison.NONE) {
             jsonGenerator.writeStringField(
-                    FIELD_NAME_DESCRIPTION, structuredType.getDescription().get());
+                    FIELD_NAME_COMPARISON, structuredType.getComparison().name());
+        }
+        if (structuredType.getSuperType().isPresent()) {
+            jsonGenerator.writeObjectField(
+                    FIELD_NAME_SUPER_TYPE, structuredType.getSuperType().get());
         }
-        jsonGenerator.writeEndObject();
     }
 
-    private void serializeDistinctType(DistinctType distinctType, JsonGenerator jsonGenerator)
+    private static void serializeSpecializedRaw(RawType<?> rawType, JsonGenerator jsonGenerator)
             throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, LogicalTypeRoot.DISTINCT_TYPE.name());
-        Preconditions.checkArgument(distinctType.getObjectIdentifier().isPresent());
-        jsonGenerator.writeObjectField(
-                FIELD_NAME_IDENTIFIER, distinctType.getObjectIdentifier().get());
-        jsonGenerator.writeObjectField(FIELD_NAME_SOURCE_TYPE, distinctType.getSourceType());
-        if (distinctType.getDescription().isPresent()) {
+        jsonGenerator.writeStringField(FIELD_NAME_CLASS, rawType.getOriginatingClass().getName());
+        final TypeSerializer<?> serializer = rawType.getTypeSerializer();
+        if (serializer.equals(NullSerializer.INSTANCE)) {
             jsonGenerator.writeStringField(
-                    FIELD_NAME_DESCRIPTION, distinctType.getDescription().get());
+                    FIELD_NAME_SPECIAL_SERIALIZER, FIELD_VALUE_EXTERNAL_SERIALIZER_NULL);
+        } else if (serializer instanceof ExternalSerializer) {
+            final ExternalSerializer<?, ?> externalSerializer =
+                    (ExternalSerializer<?, ?>) rawType.getTypeSerializer();
+            if (externalSerializer.isInternalInput()) {
+                throw new TableException(
+                        "Asymmetric external serializers are currently not supported. "
+                                + "The input must not be internal if the output is external.");
+            }
+            jsonGenerator.writeObjectField(
+                    FIELD_NAME_EXTERNAL_DATA_TYPE, externalSerializer.getDataType());
+        } else {
+            throw new TableException("Unsupported special case for RAW type.");
         }
-        jsonGenerator.writeEndObject();
     }
 
-    private void serializeTimestampType(TimestampType timestampType, JsonGenerator jsonGenerator)
-            throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, timestampType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, timestampType.isNullable());
-        jsonGenerator.writeNumberField(FIELD_NAME_PRECISION, timestampType.getPrecision());
-        jsonGenerator.writeObjectField(FIELD_NAME_TIMESTAMP_KIND, timestampType.getKind());
-        jsonGenerator.writeEndObject();
-    }
+    // --------------------------------------------------------------------------------------------
+    // Compact Serialization
+    // --------------------------------------------------------------------------------------------
 
-    private void serializeZonedTimestampType(
-            ZonedTimestampType timestampType, JsonGenerator jsonGenerator) throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, timestampType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, timestampType.isNullable());
-        jsonGenerator.writeNumberField(FIELD_NAME_PRECISION, timestampType.getPrecision());
-        jsonGenerator.writeObjectField(FIELD_NAME_TIMESTAMP_KIND, timestampType.getKind());
-        jsonGenerator.writeEndObject();
+    private static boolean supportsCompactSerialization(
+            LogicalType logicalType, boolean serializeCatalogObjects) {
+        return logicalType.accept(new CompactSerializationChecker(serializeCatalogObjects));
     }
 
-    private void serializeLocalZonedTimestampType(
-            LocalZonedTimestampType timestampType, JsonGenerator jsonGenerator) throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, timestampType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, timestampType.isNullable());
-        jsonGenerator.writeNumberField(FIELD_NAME_PRECISION, timestampType.getPrecision());
-        jsonGenerator.writeObjectField(FIELD_NAME_TIMESTAMP_KIND, timestampType.getKind());
-        jsonGenerator.writeEndObject();
+    private static void serializeTypeWithCompactSerialization(
+            LogicalType logicalType, JsonGenerator jsonGenerator) throws IOException {
+        final String compactString = logicalType.asSerializableString();
+        jsonGenerator.writeString(compactString);
     }
 
-    @SuppressWarnings("rawtypes")
-    private void serializeRawType(
-            RawType<?> rawType, JsonGenerator jsonGenerator, SerializerProvider serializerProvider)
-            throws IOException {
-        TypeSerializer<?> typeSer = rawType.getTypeSerializer();
-        if (typeSer instanceof ExternalSerializer) {
-            ExternalSerializer externalSer = (ExternalSerializer) typeSer;
-            // Currently, ExternalSerializer with `isInternalInput=false` will be serialized,
-            // Once `isInternalInput=true` needs to be serialized, we can add individual field in
-            // the json to support it, and the new json plan is compatible with the previous one.
-            if (externalSer.isInternalInput()) {
-                throw new TableException(
-                        "ExternalSerializer with `isInternalInput=true` is not supported.");
-            }
-            DataType dataType = externalSer.getDataType();
-            boolean isMapView = DataViewUtils.isMapViewDataType(dataType);
-            boolean isListView = DataViewUtils.isListViewDataType(dataType);
-            if (isMapView || isListView) {
-                jsonGenerator.writeStartObject();
-                jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, LogicalTypeRoot.RAW.name());
-                jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, rawType.isNullable());
-                if (isMapView) {
-                    jsonGenerator.writeStringField(
-                            FIELD_NAME_DATA_VIEW_CLASS, MapView.class.getName());
-                    KeyValueDataType keyValueDataType =
-                            DataViewUtils.extractKeyValueDataTypeForMapView(dataType);
-                    serializeDataTypeForDataView(
-                            FIELD_NAME_KEY_TYPE,
-                            keyValueDataType.getKeyDataType(),
-                            jsonGenerator,
-                            serializerProvider);
-                    serializeDataTypeForDataView(
-                            FIELD_NAME_VALUE_TYPE,
-                            keyValueDataType.getValueDataType(),
-                            jsonGenerator,
-                            serializerProvider);
-                } else {
-                    jsonGenerator.writeStringField(
-                            FIELD_NAME_DATA_VIEW_CLASS, ListView.class.getName());
-                    DataType elementType =
-                            DataViewUtils.extractElementDataTypeForListView(dataType);
-                    serializeDataTypeForDataView(
-                            FIELD_NAME_ELEMENT_TYPE,
-                            elementType,
-                            jsonGenerator,
-                            serializerProvider);
-                }
-                jsonGenerator.writeEndObject();
-                return;
-            }
+    /**
+     * Checks whether the given type can be serialized as a compact string created from {@link
+     * LogicalType#asSerializableString()}.
+     */
+    private static class CompactSerializationChecker extends LogicalTypeDefaultVisitor<Boolean> {
+
+        private final boolean serializeCatalogObjects;
+
+        CompactSerializationChecker(boolean serializeCatalogObjects) {
+            this.serializeCatalogObjects = serializeCatalogObjects;
         }
 
-        jsonGenerator.writeObject(rawType.asSerializableString());
-    }
+        @Override
+        public Boolean visit(CharType charType) {
+            return charType.getLength() > 0;

Review comment:
       Isn't this always true because CharType of length 0 is invalid?




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] slinkydeveloper commented on a change in pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
slinkydeveloper commented on a change in pull request #18274:
URL: https://github.com/apache/flink/pull/18274#discussion_r780265171



##########
File path: flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/utils/LogicalTypeDataTypeConverter.java
##########
@@ -201,9 +200,29 @@ public DataType visit(RowType rowType) {
 
         @Override
         public DataType visit(DistinctType distinctType) {
-            return new FieldsDataType(
-                    distinctType,
-                    Collections.singletonList(distinctType.getSourceType().accept(this)));
+            final DataType sourceDataType = distinctType.getSourceType().accept(this);
+            if (sourceDataType instanceof AtomicDataType) {
+                return new AtomicDataType(distinctType, sourceDataType.getConversionClass());
+            } else if (sourceDataType instanceof CollectionDataType) {
+                final CollectionDataType collectionDataType = (CollectionDataType) sourceDataType;
+                return new CollectionDataType(
+                        distinctType,
+                        collectionDataType.getConversionClass(),
+                        collectionDataType.getElementDataType());
+            } else if (sourceDataType instanceof KeyValueDataType) {
+                final KeyValueDataType keyValueDataType = (KeyValueDataType) sourceDataType;
+                return new KeyValueDataType(
+                        distinctType,
+                        keyValueDataType.getConversionClass(),
+                        keyValueDataType.getKeyDataType(),
+                        keyValueDataType.getValueDataType());
+            } else if (sourceDataType instanceof FieldsDataType) {
+                return new FieldsDataType(
+                        distinctType,
+                        sourceDataType.getConversionClass(),
+                        sourceDataType.getChildren());
+            }
+            throw new IllegalStateException("Unexpected data type instance.");

Review comment:
       Ok, that's a detail I didn't knew about our type system, makes sense




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] twalthr commented on a change in pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
twalthr commented on a change in pull request #18274:
URL: https://github.com/apache/flink/pull/18274#discussion_r780140261



##########
File path: flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/utils/LogicalTypeDataTypeConverter.java
##########
@@ -201,9 +200,29 @@ public DataType visit(RowType rowType) {
 
         @Override
         public DataType visit(DistinctType distinctType) {
-            return new FieldsDataType(
-                    distinctType,
-                    Collections.singletonList(distinctType.getSourceType().accept(this)));
+            final DataType sourceDataType = distinctType.getSourceType().accept(this);
+            if (sourceDataType instanceof AtomicDataType) {
+                return new AtomicDataType(distinctType, sourceDataType.getConversionClass());
+            } else if (sourceDataType instanceof CollectionDataType) {
+                final CollectionDataType collectionDataType = (CollectionDataType) sourceDataType;
+                return new CollectionDataType(
+                        distinctType,
+                        collectionDataType.getConversionClass(),
+                        collectionDataType.getElementDataType());
+            } else if (sourceDataType instanceof KeyValueDataType) {
+                final KeyValueDataType keyValueDataType = (KeyValueDataType) sourceDataType;
+                return new KeyValueDataType(
+                        distinctType,
+                        keyValueDataType.getConversionClass(),
+                        keyValueDataType.getKeyDataType(),
+                        keyValueDataType.getValueDataType());
+            } else if (sourceDataType instanceof FieldsDataType) {
+                return new FieldsDataType(
+                        distinctType,
+                        sourceDataType.getConversionClass(),
+                        sourceDataType.getChildren());
+            }
+            throw new IllegalStateException("Unexpected data type instance.");

Review comment:
       The outer data type needs to preserve the original `distinctType` LogicalType. However, distinct types should behave similar as the source type when it comes to accessing elements and fields. Therefore, they also need to have a similar DataType wrapper class such as `FieldsDataType` or `KeyValueDataType`.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18274:
URL: https://github.com/apache/flink/pull/18274#issuecomment-1005845834


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=28985",
       "triggerID" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29041",
       "triggerID" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29092",
       "triggerID" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 8b30dbb1acd4bffe0c4c5d25b669705deb19463e Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29041) 
   * 9f800b453598a8cbf015583929e5fabda9fedbf6 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29092) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18274:
URL: https://github.com/apache/flink/pull/18274#issuecomment-1005845834


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=28985",
       "triggerID" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29041",
       "triggerID" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29092",
       "triggerID" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "triggerType" : "PUSH"
     }, {
       "hash" : "718ea97838ff10191544cc7460d3c18380b0e119",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29112",
       "triggerID" : "718ea97838ff10191544cc7460d3c18380b0e119",
       "triggerType" : "PUSH"
     }, {
       "hash" : "c445781fa99453f887f9d91978e5bb9ca9d1f91a",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "c445781fa99453f887f9d91978e5bb9ca9d1f91a",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 9f800b453598a8cbf015583929e5fabda9fedbf6 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29092) 
   * 718ea97838ff10191544cc7460d3c18380b0e119 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29112) 
   * c445781fa99453f887f9d91978e5bb9ca9d1f91a UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] godfreyhe commented on a change in pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
godfreyhe commented on a change in pull request #18274:
URL: https://github.com/apache/flink/pull/18274#discussion_r781013103



##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/DataTypeJsonDeserializer.java
##########
@@ -0,0 +1,190 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.annotation.Internal;
+import org.apache.flink.table.api.DataTypes;
+import org.apache.flink.table.api.TableException;
+import org.apache.flink.table.types.CollectionDataType;
+import org.apache.flink.table.types.DataType;
+import org.apache.flink.table.types.FieldsDataType;
+import org.apache.flink.table.types.KeyValueDataType;
+import org.apache.flink.table.types.extraction.ExtractionUtils;
+import org.apache.flink.table.types.logical.DistinctType;
+import org.apache.flink.table.types.logical.LogicalType;
+import org.apache.flink.table.types.logical.MapType;
+import org.apache.flink.table.types.logical.utils.LogicalTypeChecks;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonParser;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.JsonNode;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.StdDeserializer;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ArrayNode;
+
+import javax.annotation.Nullable;
+
+import java.io.IOException;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.stream.Collectors;
+import java.util.stream.IntStream;
+
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_CONVERSION_CLASS;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_ELEMENT_CLASS;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_FIELDS;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_FIELD_NAME;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_KEY_CLASS;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_TYPE;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_VALUE_CLASS;
+
+/**
+ * JSON deserializer for {@link DataType}.
+ *
+ * @see DataTypeJsonSerializer for the reverse operation
+ */
+@Internal
+public class DataTypeJsonDeserializer extends StdDeserializer<DataType> {
+
+    public DataTypeJsonDeserializer() {
+        super(DataType.class);
+    }
+
+    @Override
+    public DataType deserialize(JsonParser jsonParser, DeserializationContext ctx)
+            throws IOException {
+        final JsonNode dataTypeNode = jsonParser.readValueAsTree();
+        final SerdeContext serdeContext = SerdeContext.get(ctx);
+        return deserialize(dataTypeNode, serdeContext);
+    }
+
+    public static DataType deserialize(JsonNode dataTypeNode, SerdeContext serdeContext) {
+        if (dataTypeNode.isTextual()) {
+            return deserializeWithInternalClass(dataTypeNode, serdeContext);
+        } else {
+            return deserializeWithExternalClass(dataTypeNode, serdeContext);
+        }
+    }
+
+    private static DataType deserializeWithInternalClass(
+            JsonNode logicalTypeNode, SerdeContext serdeContext) {
+        final LogicalType logicalType =
+                LogicalTypeJsonDeserializer.deserialize(logicalTypeNode, serdeContext);
+        return DataTypes.of(logicalType).toInternal();
+    }
+
+    private static DataType deserializeWithExternalClass(
+            JsonNode dataTypeNode, SerdeContext serdeContext) {
+        final LogicalType logicalType =
+                LogicalTypeJsonDeserializer.deserialize(
+                        dataTypeNode.get(FIELD_NAME_TYPE), serdeContext);
+        return deserializeClass(logicalType, dataTypeNode, serdeContext);
+    }
+
+    private static DataType deserializeClass(
+            LogicalType logicalType, @Nullable JsonNode classNode, SerdeContext serdeContext) {

Review comment:
       nit: rename `classNode` to `parentNode` ? 

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/DataTypeJsonSerializer.java
##########
@@ -0,0 +1,172 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.annotation.Internal;
+import org.apache.flink.table.api.DataTypes.Field;
+import org.apache.flink.table.types.CollectionDataType;
+import org.apache.flink.table.types.DataType;
+import org.apache.flink.table.types.KeyValueDataType;
+import org.apache.flink.table.types.logical.LogicalType;
+import org.apache.flink.table.types.utils.DataTypeUtils;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonGenerator;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.SerializerProvider;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ser.std.StdSerializer;
+
+import java.io.IOException;
+import java.util.List;
+import java.util.stream.Collectors;
+
+/**
+ * JSON serializer for {@link LogicalType}.

Review comment:
       `JSON serializer for {@link DataType}.`




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] twalthr commented on pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
twalthr commented on pull request #18274:
URL: https://github.com/apache/flink/pull/18274#issuecomment-1009016694


   Thanks everyone. I addressed all comments. I will merge this once the build is green.


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] twalthr commented on a change in pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
twalthr commented on a change in pull request #18274:
URL: https://github.com/apache/flink/pull/18274#discussion_r780165059



##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/LogicalTypeJsonSerializer.java
##########
@@ -109,349 +123,358 @@ public void serialize(
             JsonGenerator jsonGenerator,
             SerializerProvider serializerProvider)
             throws IOException {
-        if (logicalType instanceof CharType) {
-            // Zero-length character strings have no serializable string representation.
-            serializeRowType((CharType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof VarCharType) {
-            // Zero-length character strings have no serializable string representation.
-            serializeVarCharType((VarCharType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof BinaryType) {
-            // Zero-length binary strings have no serializable string representation.
-            serializeBinaryType((BinaryType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof VarBinaryType) {
-            // Zero-length binary strings have no serializable string representation.
-            serializeVarBinaryType((VarBinaryType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof SymbolType) {
-            // SymbolType does not support `asSerializableString`
-            serializeSymbolType((SymbolType<?>) logicalType, jsonGenerator);
-        } else if (logicalType instanceof TypeInformationRawType) {
-            // TypeInformationRawType does not support `asSerializableString`
-            serializeTypeInformationRawType((TypeInformationRawType<?>) logicalType, jsonGenerator);
-        } else if (logicalType instanceof StructuredType) {
-            //  StructuredType does not full support `asSerializableString`
-            serializeStructuredType((StructuredType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof DistinctType) {
-            //  DistinctType does not full support `asSerializableString`
-            serializeDistinctType((DistinctType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof TimestampType) {
-            // TimestampType does not consider `TimestampKind`
-            serializeTimestampType((TimestampType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof ZonedTimestampType) {
-            // ZonedTimestampType does not consider `TimestampKind`
-            serializeZonedTimestampType((ZonedTimestampType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof LocalZonedTimestampType) {
-            // LocalZonedTimestampType does not consider `TimestampKind`
-            serializeLocalZonedTimestampType((LocalZonedTimestampType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof RowType) {
-            serializeRowType((RowType) logicalType, jsonGenerator, serializerProvider);
-        } else if (logicalType instanceof MapType) {
-            serializeMapType((MapType) logicalType, jsonGenerator, serializerProvider);
-        } else if (logicalType instanceof ArrayType) {
-            serializeArrayType((ArrayType) logicalType, jsonGenerator, serializerProvider);
-        } else if (logicalType instanceof MultisetType) {
-            serializeMultisetType((MultisetType) logicalType, jsonGenerator, serializerProvider);
-        } else if (logicalType instanceof RawType) {
-            serializeRawType((RawType<?>) logicalType, jsonGenerator, serializerProvider);
-        } else if (logicalType instanceof UnresolvedUserDefinedType) {
-            throw new TableException(
-                    "Can not serialize an UnresolvedUserDefinedType instance. \n"
-                            + "It needs to be resolved into a proper user-defined type.\"");
-        } else {
-            jsonGenerator.writeObject(logicalType.asSerializableString());
-        }
+        final ReadableConfig config = SerdeContext.from(serializerProvider).getConfiguration();
+        final boolean serializeCatalogObjects =
+                !config.get(TableConfigOptions.PLAN_COMPILE_CATALOG_OBJECTS)
+                        .equals(CatalogPlanCompilation.IDENTIFIER);
+        serializeInternal(logicalType, jsonGenerator, serializeCatalogObjects);
     }
 
-    private void serializeRowType(
-            RowType rowType, JsonGenerator jsonGenerator, SerializerProvider serializerProvider)
+    private static void serializeInternal(
+            LogicalType logicalType, JsonGenerator jsonGenerator, boolean serializeCatalogObjects)
             throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, rowType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, rowType.isNullable());
-        List<RowType.RowField> fields = rowType.getFields();
-        jsonGenerator.writeArrayFieldStart(FIELD_NAME_FIELDS);
-        for (RowType.RowField rowField : fields) {
-            jsonGenerator.writeStartObject();
-            jsonGenerator.writeFieldName(rowField.getName());
-            serialize(rowField.getType(), jsonGenerator, serializerProvider);
-            if (rowField.getDescription().isPresent()) {
-                jsonGenerator.writeStringField(
-                        FIELD_NAME_DESCRIPTION, rowField.getDescription().get());
-            }
-            jsonGenerator.writeEndObject();
+        if (supportsCompactSerialization(logicalType, serializeCatalogObjects)) {
+            serializeTypeWithCompactSerialization(logicalType, jsonGenerator);
+        } else {
+            // fallback to generic serialization that might still use compact serialization for
+            // individual fields
+            serializeTypeWithGenericSerialization(
+                    logicalType, jsonGenerator, serializeCatalogObjects);
         }
-        jsonGenerator.writeEndArray();
-        jsonGenerator.writeEndObject();
     }
 
-    private void serializeMapType(
-            MapType mapType, JsonGenerator jsonGenerator, SerializerProvider serializerProvider)
+    // --------------------------------------------------------------------------------------------
+    // Generic Serialization
+    // --------------------------------------------------------------------------------------------
+
+    private static void serializeTypeWithGenericSerialization(
+            LogicalType logicalType, JsonGenerator jsonGenerator, boolean serializeCatalogObjects)
             throws IOException {
         jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, mapType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, mapType.isNullable());
-        jsonGenerator.writeFieldName(FIELD_NAME_KEY_TYPE);
-        serialize(mapType.getKeyType(), jsonGenerator, serializerProvider);
-        jsonGenerator.writeFieldName(FIELD_NAME_VALUE_TYPE);
-        serialize(mapType.getValueType(), jsonGenerator, serializerProvider);
+
+        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, logicalType.getTypeRoot().name());
+        if (!logicalType.isNullable()) {
+            jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, false);
+        }
+
+        switch (logicalType.getTypeRoot()) {
+            case CHAR:
+            case VARCHAR:
+            case BINARY:
+            case VARBINARY:
+                serializeZeroLengthString(jsonGenerator);
+                break;
+            case TIMESTAMP_WITHOUT_TIME_ZONE:
+                final TimestampType timestampType = (TimestampType) logicalType;
+                serializeTimestamp(
+                        timestampType.getPrecision(), timestampType.getKind(), jsonGenerator);
+                break;
+            case TIMESTAMP_WITH_TIME_ZONE:
+                final ZonedTimestampType zonedTimestampType = (ZonedTimestampType) logicalType;
+                serializeTimestamp(
+                        zonedTimestampType.getPrecision(),
+                        zonedTimestampType.getKind(),
+                        jsonGenerator);
+                break;
+            case TIMESTAMP_WITH_LOCAL_TIME_ZONE:
+                final LocalZonedTimestampType localZonedTimestampType =
+                        (LocalZonedTimestampType) logicalType;
+                serializeTimestamp(
+                        localZonedTimestampType.getPrecision(),
+                        localZonedTimestampType.getKind(),
+                        jsonGenerator);
+                break;
+            case ARRAY:
+                serializeCollection(
+                        ((ArrayType) logicalType).getElementType(),
+                        jsonGenerator,
+                        serializeCatalogObjects);
+                break;
+            case MULTISET:
+                serializeCollection(
+                        ((MultisetType) logicalType).getElementType(),
+                        jsonGenerator,
+                        serializeCatalogObjects);
+                break;
+            case MAP:
+                serializeMap((MapType) logicalType, jsonGenerator, serializeCatalogObjects);
+                break;
+            case ROW:
+                serializeRow((RowType) logicalType, jsonGenerator, serializeCatalogObjects);
+                break;
+            case DISTINCT_TYPE:
+                serializeDistinctType(
+                        (DistinctType) logicalType, jsonGenerator, serializeCatalogObjects);
+                break;
+            case STRUCTURED_TYPE:
+                serializeStructuredType(
+                        (StructuredType) logicalType, jsonGenerator, serializeCatalogObjects);
+                break;
+            case SYMBOL:
+                // type root is enough
+                break;
+            case RAW:
+                if (logicalType instanceof RawType) {
+                    serializeSpecializedRaw((RawType<?>) logicalType, jsonGenerator);
+                    break;
+                }
+                // fall through
+            default:
+                throw new ValidationException(
+                        String.format(
+                                "Unable to serialize logical type '%s'. Please check the documentation for supported types.",
+                                logicalType.asSummaryString()));
+        }
+
         jsonGenerator.writeEndObject();
     }
 
-    private void serializeArrayType(
-            ArrayType arrayType, JsonGenerator jsonGenerator, SerializerProvider serializerProvider)
-            throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, arrayType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, arrayType.isNullable());
-        jsonGenerator.writeFieldName(FIELD_NAME_ELEMENT_TYPE);
-        serialize(arrayType.getElementType(), jsonGenerator, serializerProvider);
-        jsonGenerator.writeEndObject();
+    private static void serializeZeroLengthString(JsonGenerator jsonGenerator) throws IOException {
+        jsonGenerator.writeNumberField(FIELD_NAME_LENGTH, 0);
     }
 
-    private void serializeMultisetType(
-            MultisetType multisetType,
-            JsonGenerator jsonGenerator,
-            SerializerProvider serializerProvider)
-            throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, multisetType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, multisetType.isNullable());
-        jsonGenerator.writeFieldName(FIELD_NAME_ELEMENT_TYPE);
-        serialize(multisetType.getElementType(), jsonGenerator, serializerProvider);
-        jsonGenerator.writeEndObject();
+    private static void serializeTimestamp(
+            int precision, TimestampKind kind, JsonGenerator jsonGenerator) throws IOException {
+        jsonGenerator.writeNumberField(FIELD_NAME_PRECISION, precision);
+        jsonGenerator.writeObjectField(FIELD_NAME_TIMESTAMP_KIND, kind);
     }
 
-    private void serializeRowType(CharType charType, JsonGenerator jsonGenerator)
+    private static void serializeCollection(
+            LogicalType elementType, JsonGenerator jsonGenerator, boolean serializeCatalogObjects)
             throws IOException {
-        // Zero-length character strings have no serializable string representation.
-        if (charType.getLength() == CharType.EMPTY_LITERAL_LENGTH) {
-            jsonGenerator.writeStartObject();
-            jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, charType.getTypeRoot().name());
-            jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, charType.isNullable());
-            jsonGenerator.writeNumberField(FIELD_NAME_LENGTH, 0);
-            jsonGenerator.writeEndObject();
-        } else {
-            jsonGenerator.writeObject(charType.asSerializableString());
-        }
+        jsonGenerator.writeFieldName(FIELD_NAME_ELEMENT_TYPE);
+        serializeInternal(elementType, jsonGenerator, serializeCatalogObjects);
     }
 
-    private void serializeVarCharType(VarCharType varCharType, JsonGenerator jsonGenerator)
+    private static void serializeMap(
+            MapType mapType, JsonGenerator jsonGenerator, boolean serializeCatalogObjects)
             throws IOException {
-        // Zero-length character strings have no serializable string representation.
-        if (varCharType.getLength() == VarCharType.EMPTY_LITERAL_LENGTH) {
-            jsonGenerator.writeStartObject();
-            jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, varCharType.getTypeRoot().name());
-            jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, varCharType.isNullable());
-            jsonGenerator.writeNumberField(FIELD_NAME_LENGTH, 0);
-            jsonGenerator.writeEndObject();
-        } else {
-            jsonGenerator.writeObject(varCharType.asSerializableString());
-        }
+        jsonGenerator.writeFieldName(FIELD_NAME_KEY_TYPE);
+        serializeInternal(mapType.getKeyType(), jsonGenerator, serializeCatalogObjects);
+        jsonGenerator.writeFieldName(FIELD_NAME_VALUE_TYPE);
+        serializeInternal(mapType.getValueType(), jsonGenerator, serializeCatalogObjects);
     }
 
-    private void serializeBinaryType(BinaryType binaryType, JsonGenerator jsonGenerator)
+    private static void serializeRow(
+            RowType rowType, JsonGenerator jsonGenerator, boolean serializeCatalogObjects)
             throws IOException {
-        // Zero-length binary strings have no serializable string representation.
-        if (binaryType.getLength() == BinaryType.EMPTY_LITERAL_LENGTH) {
+        jsonGenerator.writeArrayFieldStart(FIELD_NAME_FIELDS);
+        for (RowType.RowField rowField : rowType.getFields()) {
             jsonGenerator.writeStartObject();
-            jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, binaryType.getTypeRoot().name());
-            jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, binaryType.isNullable());
-            jsonGenerator.writeNumberField(FIELD_NAME_LENGTH, 0);
+            jsonGenerator.writeStringField(FIELD_NAME_FIELD_NAME, rowField.getName());
+            jsonGenerator.writeFieldName(FIELD_NAME_FIELD_TYPE);
+            serializeInternal(rowField.getType(), jsonGenerator, serializeCatalogObjects);
+            if (rowField.getDescription().isPresent()) {
+                jsonGenerator.writeStringField(
+                        FIELD_NAME_FIELD_DESCRIPTION, rowField.getDescription().get());
+            }
             jsonGenerator.writeEndObject();
-        } else {
-            jsonGenerator.writeObject(binaryType.asSerializableString());
         }
+        jsonGenerator.writeEndArray();
     }
 
-    private void serializeVarBinaryType(VarBinaryType varBinaryType, JsonGenerator jsonGenerator)
+    private static void serializeDistinctType(
+            DistinctType distinctType, JsonGenerator jsonGenerator, boolean serializeCatalogObjects)
             throws IOException {
-        // Zero-length binary strings have no serializable string representation.
-        if (varBinaryType.getLength() == VarBinaryType.EMPTY_LITERAL_LENGTH) {
-            jsonGenerator.writeStartObject();
+        jsonGenerator.writeObjectField(
+                FIELD_NAME_OBJECT_IDENTIFIER,
+                distinctType.getObjectIdentifier().orElseThrow(IllegalStateException::new));
+        if (distinctType.getDescription().isPresent()) {
             jsonGenerator.writeStringField(
-                    FIELD_NAME_TYPE_NAME, varBinaryType.getTypeRoot().name());
-            jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, varBinaryType.isNullable());
-            jsonGenerator.writeNumberField(FIELD_NAME_LENGTH, 0);
-            jsonGenerator.writeEndObject();
-        } else {
-            jsonGenerator.writeObject(varBinaryType.asSerializableString());
+                    FIELD_NAME_FIELD_DESCRIPTION, distinctType.getDescription().get());
         }
+        jsonGenerator.writeFieldName(FIELD_NAME_SOURCE_TYPE);
+        serializeInternal(distinctType.getSourceType(), jsonGenerator, serializeCatalogObjects);
     }
 
-    private void serializeSymbolType(SymbolType<?> symbolType, JsonGenerator jsonGenerator)
-            throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, symbolType.isNullable());
-        jsonGenerator.writeStringField(
-                FIELD_NAME_SYMBOL_CLASS, symbolType.getDefaultConversion().getName());
-        jsonGenerator.writeEndObject();
-    }
-
-    private void serializeTypeInformationRawType(
-            TypeInformationRawType<?> rawType, JsonGenerator jsonGenerator) throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, rawType.isNullable());
-        jsonGenerator.writeStringField(
-                FIELD_NAME_TYPE_INFO,
-                EncodingUtils.encodeObjectToString(rawType.getTypeInformation()));
-        jsonGenerator.writeEndObject();
-    }
-
-    private void serializeStructuredType(StructuredType structuredType, JsonGenerator jsonGenerator)
+    private static void serializeStructuredType(
+            StructuredType structuredType,
+            JsonGenerator jsonGenerator,
+            boolean serializeCatalogObjects)
             throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(
-                FIELD_NAME_TYPE_NAME, LogicalTypeRoot.STRUCTURED_TYPE.name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, structuredType.isNullable());
         if (structuredType.getObjectIdentifier().isPresent()) {
             jsonGenerator.writeObjectField(
-                    FIELD_NAME_IDENTIFIER, structuredType.getObjectIdentifier().get());
+                    FIELD_NAME_OBJECT_IDENTIFIER, structuredType.getObjectIdentifier().get());
         }
-        if (structuredType.getImplementationClass().isPresent()) {
+        if (structuredType.getDescription().isPresent()) {
             jsonGenerator.writeStringField(
-                    FIELD_NAME_IMPLEMENTATION_CLASS,
-                    structuredType.getImplementationClass().get().getName());
+                    FIELD_NAME_DESCRIPTION, structuredType.getDescription().get());
+        }
+        if (structuredType.getImplementationClass().isPresent()) {
+            jsonGenerator.writeObjectField(
+                    FIELD_NAME_IMPLEMENTATION_CLASS, structuredType.getImplementationClass().get());
         }
         jsonGenerator.writeFieldName(FIELD_NAME_ATTRIBUTES);
         jsonGenerator.writeStartArray();
-        for (StructuredType.StructuredAttribute attribute : structuredType.getAttributes()) {
+        for (StructuredAttribute attribute : structuredType.getAttributes()) {
             jsonGenerator.writeStartObject();
-            jsonGenerator.writeStringField(FIELD_NAME_NAME, attribute.getName());
-            jsonGenerator.writeObjectField(FIELD_NAME_LOGICAL_TYPE, attribute.getType());
+            jsonGenerator.writeStringField(FIELD_NAME_ATTRIBUTE_NAME, attribute.getName());
+            jsonGenerator.writeFieldName(FIELD_NAME_ATTRIBUTE_TYPE);
+            serializeInternal(attribute.getType(), jsonGenerator, serializeCatalogObjects);
             if (attribute.getDescription().isPresent()) {
                 jsonGenerator.writeStringField(
-                        FIELD_NAME_DESCRIPTION, attribute.getDescription().get());
+                        FIELD_NAME_ATTRIBUTE_DESCRIPTION, attribute.getDescription().get());
             }
             jsonGenerator.writeEndObject();
         }
         jsonGenerator.writeEndArray();
-        jsonGenerator.writeBooleanField(FIELD_NAME_FINAL, structuredType.isFinal());
-        jsonGenerator.writeBooleanField(FIELD_NAME_INSTANTIABLE, structuredType.isInstantiable());
-        jsonGenerator.writeStringField(
-                FIELD_NAME_COMPARISON, structuredType.getComparison().name());
-        if (structuredType.getSuperType().isPresent()) {
-            jsonGenerator.writeObjectField(
-                    FIELD_NAME_SUPPER_TYPE, structuredType.getSuperType().get());
+        if (!structuredType.isFinal()) {
+            jsonGenerator.writeBooleanField(FIELD_NAME_FINAL, false);
         }
-        if (structuredType.getDescription().isPresent()) {
+        if (!structuredType.isInstantiable()) {
+            jsonGenerator.writeBooleanField(FIELD_NAME_INSTANTIABLE, false);
+        }
+        if (structuredType.getComparison() != StructuredComparison.NONE) {
             jsonGenerator.writeStringField(
-                    FIELD_NAME_DESCRIPTION, structuredType.getDescription().get());
+                    FIELD_NAME_COMPARISON, structuredType.getComparison().name());
+        }
+        if (structuredType.getSuperType().isPresent()) {
+            jsonGenerator.writeObjectField(
+                    FIELD_NAME_SUPER_TYPE, structuredType.getSuperType().get());
         }
-        jsonGenerator.writeEndObject();
     }
 
-    private void serializeDistinctType(DistinctType distinctType, JsonGenerator jsonGenerator)
+    private static void serializeSpecializedRaw(RawType<?> rawType, JsonGenerator jsonGenerator)
             throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, LogicalTypeRoot.DISTINCT_TYPE.name());
-        Preconditions.checkArgument(distinctType.getObjectIdentifier().isPresent());
-        jsonGenerator.writeObjectField(
-                FIELD_NAME_IDENTIFIER, distinctType.getObjectIdentifier().get());
-        jsonGenerator.writeObjectField(FIELD_NAME_SOURCE_TYPE, distinctType.getSourceType());
-        if (distinctType.getDescription().isPresent()) {
+        jsonGenerator.writeStringField(FIELD_NAME_CLASS, rawType.getOriginatingClass().getName());
+        final TypeSerializer<?> serializer = rawType.getTypeSerializer();
+        if (serializer.equals(NullSerializer.INSTANCE)) {
             jsonGenerator.writeStringField(
-                    FIELD_NAME_DESCRIPTION, distinctType.getDescription().get());
+                    FIELD_NAME_SPECIAL_SERIALIZER, FIELD_VALUE_EXTERNAL_SERIALIZER_NULL);
+        } else if (serializer instanceof ExternalSerializer) {
+            final ExternalSerializer<?, ?> externalSerializer =
+                    (ExternalSerializer<?, ?>) rawType.getTypeSerializer();
+            if (externalSerializer.isInternalInput()) {
+                throw new TableException(
+                        "Asymmetric external serializers are currently not supported. "
+                                + "The input must not be internal if the output is external.");
+            }
+            jsonGenerator.writeObjectField(
+                    FIELD_NAME_EXTERNAL_DATA_TYPE, externalSerializer.getDataType());
+        } else {
+            throw new TableException("Unsupported special case for RAW type.");
         }
-        jsonGenerator.writeEndObject();
     }
 
-    private void serializeTimestampType(TimestampType timestampType, JsonGenerator jsonGenerator)
-            throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, timestampType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, timestampType.isNullable());
-        jsonGenerator.writeNumberField(FIELD_NAME_PRECISION, timestampType.getPrecision());
-        jsonGenerator.writeObjectField(FIELD_NAME_TIMESTAMP_KIND, timestampType.getKind());
-        jsonGenerator.writeEndObject();
-    }
+    // --------------------------------------------------------------------------------------------
+    // Compact Serialization
+    // --------------------------------------------------------------------------------------------
 
-    private void serializeZonedTimestampType(
-            ZonedTimestampType timestampType, JsonGenerator jsonGenerator) throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, timestampType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, timestampType.isNullable());
-        jsonGenerator.writeNumberField(FIELD_NAME_PRECISION, timestampType.getPrecision());
-        jsonGenerator.writeObjectField(FIELD_NAME_TIMESTAMP_KIND, timestampType.getKind());
-        jsonGenerator.writeEndObject();
+    private static boolean supportsCompactSerialization(
+            LogicalType logicalType, boolean serializeCatalogObjects) {
+        return logicalType.accept(new CompactSerializationChecker(serializeCatalogObjects));
     }
 
-    private void serializeLocalZonedTimestampType(
-            LocalZonedTimestampType timestampType, JsonGenerator jsonGenerator) throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, timestampType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, timestampType.isNullable());
-        jsonGenerator.writeNumberField(FIELD_NAME_PRECISION, timestampType.getPrecision());
-        jsonGenerator.writeObjectField(FIELD_NAME_TIMESTAMP_KIND, timestampType.getKind());
-        jsonGenerator.writeEndObject();
+    private static void serializeTypeWithCompactSerialization(
+            LogicalType logicalType, JsonGenerator jsonGenerator) throws IOException {
+        final String compactString = logicalType.asSerializableString();
+        jsonGenerator.writeString(compactString);
     }
 
-    @SuppressWarnings("rawtypes")
-    private void serializeRawType(
-            RawType<?> rawType, JsonGenerator jsonGenerator, SerializerProvider serializerProvider)
-            throws IOException {
-        TypeSerializer<?> typeSer = rawType.getTypeSerializer();
-        if (typeSer instanceof ExternalSerializer) {
-            ExternalSerializer externalSer = (ExternalSerializer) typeSer;
-            // Currently, ExternalSerializer with `isInternalInput=false` will be serialized,
-            // Once `isInternalInput=true` needs to be serialized, we can add individual field in
-            // the json to support it, and the new json plan is compatible with the previous one.
-            if (externalSer.isInternalInput()) {
-                throw new TableException(
-                        "ExternalSerializer with `isInternalInput=true` is not supported.");
-            }
-            DataType dataType = externalSer.getDataType();
-            boolean isMapView = DataViewUtils.isMapViewDataType(dataType);
-            boolean isListView = DataViewUtils.isListViewDataType(dataType);
-            if (isMapView || isListView) {
-                jsonGenerator.writeStartObject();
-                jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, LogicalTypeRoot.RAW.name());
-                jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, rawType.isNullable());
-                if (isMapView) {
-                    jsonGenerator.writeStringField(
-                            FIELD_NAME_DATA_VIEW_CLASS, MapView.class.getName());
-                    KeyValueDataType keyValueDataType =
-                            DataViewUtils.extractKeyValueDataTypeForMapView(dataType);
-                    serializeDataTypeForDataView(
-                            FIELD_NAME_KEY_TYPE,
-                            keyValueDataType.getKeyDataType(),
-                            jsonGenerator,
-                            serializerProvider);
-                    serializeDataTypeForDataView(
-                            FIELD_NAME_VALUE_TYPE,
-                            keyValueDataType.getValueDataType(),
-                            jsonGenerator,
-                            serializerProvider);
-                } else {
-                    jsonGenerator.writeStringField(
-                            FIELD_NAME_DATA_VIEW_CLASS, ListView.class.getName());
-                    DataType elementType =
-                            DataViewUtils.extractElementDataTypeForListView(dataType);
-                    serializeDataTypeForDataView(
-                            FIELD_NAME_ELEMENT_TYPE,
-                            elementType,
-                            jsonGenerator,
-                            serializerProvider);
-                }
-                jsonGenerator.writeEndObject();
-                return;
-            }
+    /**
+     * Checks whether the given type can be serialized as a compact string created from {@link
+     * LogicalType#asSerializableString()}.
+     */
+    private static class CompactSerializationChecker extends LogicalTypeDefaultVisitor<Boolean> {
+
+        private final boolean serializeCatalogObjects;
+
+        CompactSerializationChecker(boolean serializeCatalogObjects) {
+            this.serializeCatalogObjects = serializeCatalogObjects;
         }
 
-        jsonGenerator.writeObject(rawType.asSerializableString());
-    }
+        @Override
+        public Boolean visit(CharType charType) {
+            return charType.getLength() > 0;
+        }
 
-    private void serializeDataTypeForDataView(
-            String key,
-            DataType dataType,
-            JsonGenerator jsonGenerator,
-            SerializerProvider serializerProvider)
-            throws IOException {
-        jsonGenerator.writeFieldName(key);
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeBooleanField(
-                FIELD_NAME_IS_INTERNAL_TYPE, DataTypeUtils.isInternal(dataType));
-        jsonGenerator.writeFieldName(FIELD_NAME_TYPE_NAME);
-        LogicalType logicalType = LogicalTypeDataTypeConverter.toLogicalType(dataType);
-        serialize(logicalType, jsonGenerator, serializerProvider);
-        jsonGenerator.writeEndObject();
+        @Override
+        public Boolean visit(VarCharType varCharType) {
+            return varCharType.getLength() > 0;
+        }
+
+        @Override
+        public Boolean visit(BinaryType binaryType) {
+            return binaryType.getLength() > 0;
+        }
+
+        @Override
+        public Boolean visit(VarBinaryType varBinaryType) {
+            return varBinaryType.getLength() > 0;
+        }
+
+        @Override
+        public Boolean visit(TimestampType timestampType) {
+            return timestampType.getKind() == TimestampKind.REGULAR;
+        }
+
+        @Override
+        public Boolean visit(ZonedTimestampType zonedTimestampType) {
+            return zonedTimestampType.getKind() == TimestampKind.REGULAR;
+        }
+
+        @Override
+        public Boolean visit(LocalZonedTimestampType localZonedTimestampType) {
+            return localZonedTimestampType.getKind() == TimestampKind.REGULAR;
+        }
+
+        @Override
+        public Boolean visit(DistinctType distinctType) {
+            // catalog-based distinct types are always string serializable,
+            // however, depending on the configuration, we serialize the entire type
+            return !serializeCatalogObjects;
+        }
+
+        @Override
+        public Boolean visit(StructuredType structuredType) {
+            // catalog-based structured types are always string serializable,
+            // however, depending on the configuration, we serialize the entire type
+            return structuredType.getObjectIdentifier().isPresent() && !serializeCatalogObjects;

Review comment:
       Not sure if I understand this question correctly, But this is exactly what we do. The children will have a compact representation. However, there is no string representation defined in the SQL standard like `ROW`. We would need to make up syntax that should actually be defined by a type DDL.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] twalthr commented on a change in pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
twalthr commented on a change in pull request #18274:
URL: https://github.com/apache/flink/pull/18274#discussion_r780327690



##########
File path: flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/serde/LogicalTypeJsonSerdeTest.java
##########
@@ -249,28 +331,44 @@ public void testLogicalTypeSerde() throws IOException {
                                         ObjectIdentifier.of("cat", "db", "distinctType"),
                                         new VarCharType(false, 5))
                                 .build(),
+                        // custom RawType
+                        new RawType<>(Integer.class, IntSerializer.INSTANCE),

Review comment:
       It doesn't really make a difference. But I replaced it with `LocalDateTime`.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18274:
URL: https://github.com/apache/flink/pull/18274#issuecomment-1005845834


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=28985",
       "triggerID" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * dd2c1d149708b916cb05bd2b0580015ae2e1f889 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=28985) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18274:
URL: https://github.com/apache/flink/pull/18274#issuecomment-1005845834


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=28985",
       "triggerID" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * dd2c1d149708b916cb05bd2b0580015ae2e1f889 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=28985) 
   * 8b30dbb1acd4bffe0c4c5d25b669705deb19463e UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18274:
URL: https://github.com/apache/flink/pull/18274#issuecomment-1005845834


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=28985",
       "triggerID" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * dd2c1d149708b916cb05bd2b0580015ae2e1f889 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=28985) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18274:
URL: https://github.com/apache/flink/pull/18274#issuecomment-1005845834


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=28985",
       "triggerID" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29041",
       "triggerID" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29092",
       "triggerID" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "triggerType" : "PUSH"
     }, {
       "hash" : "718ea97838ff10191544cc7460d3c18380b0e119",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "718ea97838ff10191544cc7460d3c18380b0e119",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 9f800b453598a8cbf015583929e5fabda9fedbf6 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29092) 
   * 718ea97838ff10191544cc7460d3c18380b0e119 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18274:
URL: https://github.com/apache/flink/pull/18274#issuecomment-1005845834


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=28985",
       "triggerID" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29041",
       "triggerID" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29092",
       "triggerID" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "triggerType" : "PUSH"
     }, {
       "hash" : "718ea97838ff10191544cc7460d3c18380b0e119",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29112",
       "triggerID" : "718ea97838ff10191544cc7460d3c18380b0e119",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 718ea97838ff10191544cc7460d3c18380b0e119 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29112) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] slinkydeveloper commented on a change in pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
slinkydeveloper commented on a change in pull request #18274:
URL: https://github.com/apache/flink/pull/18274#discussion_r780265171



##########
File path: flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/utils/LogicalTypeDataTypeConverter.java
##########
@@ -201,9 +200,29 @@ public DataType visit(RowType rowType) {
 
         @Override
         public DataType visit(DistinctType distinctType) {
-            return new FieldsDataType(
-                    distinctType,
-                    Collections.singletonList(distinctType.getSourceType().accept(this)));
+            final DataType sourceDataType = distinctType.getSourceType().accept(this);
+            if (sourceDataType instanceof AtomicDataType) {
+                return new AtomicDataType(distinctType, sourceDataType.getConversionClass());
+            } else if (sourceDataType instanceof CollectionDataType) {
+                final CollectionDataType collectionDataType = (CollectionDataType) sourceDataType;
+                return new CollectionDataType(
+                        distinctType,
+                        collectionDataType.getConversionClass(),
+                        collectionDataType.getElementDataType());
+            } else if (sourceDataType instanceof KeyValueDataType) {
+                final KeyValueDataType keyValueDataType = (KeyValueDataType) sourceDataType;
+                return new KeyValueDataType(
+                        distinctType,
+                        keyValueDataType.getConversionClass(),
+                        keyValueDataType.getKeyDataType(),
+                        keyValueDataType.getValueDataType());
+            } else if (sourceDataType instanceof FieldsDataType) {
+                return new FieldsDataType(
+                        distinctType,
+                        sourceDataType.getConversionClass(),
+                        sourceDataType.getChildren());
+            }
+            throw new IllegalStateException("Unexpected data type instance.");

Review comment:
       Ok, that's a detail I didn't knew about our type system, makes sense

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/FlinkSerializationProvider.java
##########
@@ -0,0 +1,54 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.SerializationConfig;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.SerializerProvider;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ser.DefaultSerializerProvider;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ser.SerializerFactory;
+
+/** {@link SerializerProvider} that offers a Flink-specific {@link SerdeContext}. */
+class FlinkSerializationProvider extends DefaultSerializerProvider {

Review comment:
       You don't need this one anymore

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/DataTypeJsonDeserializer.java
##########
@@ -0,0 +1,190 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.annotation.Internal;
+import org.apache.flink.table.api.DataTypes;
+import org.apache.flink.table.api.TableException;
+import org.apache.flink.table.types.CollectionDataType;
+import org.apache.flink.table.types.DataType;
+import org.apache.flink.table.types.FieldsDataType;
+import org.apache.flink.table.types.KeyValueDataType;
+import org.apache.flink.table.types.extraction.ExtractionUtils;
+import org.apache.flink.table.types.logical.DistinctType;
+import org.apache.flink.table.types.logical.LogicalType;
+import org.apache.flink.table.types.logical.MapType;
+import org.apache.flink.table.types.logical.utils.LogicalTypeChecks;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonParser;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.JsonNode;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.StdDeserializer;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.node.ArrayNode;
+
+import javax.annotation.Nullable;
+
+import java.io.IOException;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import java.util.stream.Collectors;
+import java.util.stream.IntStream;
+
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_CONVERSION_CLASS;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_ELEMENT_CLASS;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_FIELDS;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_FIELD_NAME;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_KEY_CLASS;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_TYPE;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerializer.FIELD_NAME_VALUE_CLASS;
+
+/**
+ * JSON deserializer for {@link DataType}.
+ *
+ * @see DataTypeJsonSerializer for the reverse operation
+ */
+@Internal
+public class DataTypeJsonDeserializer extends StdDeserializer<DataType> {
+
+    public DataTypeJsonDeserializer() {
+        super(DataType.class);
+    }
+
+    @Override
+    public DataType deserialize(JsonParser jsonParser, DeserializationContext ctx)
+            throws IOException {
+        final JsonNode dataTypeNode = jsonParser.readValueAsTree();
+        final SerdeContext serdeContext = SerdeContext.get(ctx);
+        return deserialize(dataTypeNode, serdeContext);
+    }
+
+    public static DataType deserialize(JsonNode dataTypeNode, SerdeContext serdeContext) {
+        if (dataTypeNode.isTextual()) {
+            return deserializeWithInternalClass(dataTypeNode, serdeContext);
+        } else {
+            return deserializeWithExternalClass(dataTypeNode, serdeContext);
+        }
+    }
+
+    private static DataType deserializeWithInternalClass(
+            JsonNode logicalTypeNode, SerdeContext serdeContext) {
+        final LogicalType logicalType =
+                LogicalTypeJsonDeserializer.deserialize(logicalTypeNode, serdeContext);
+        return DataTypes.of(logicalType).toInternal();
+    }
+
+    private static DataType deserializeWithExternalClass(
+            JsonNode dataTypeNode, SerdeContext serdeContext) {
+        final LogicalType logicalType =
+                LogicalTypeJsonDeserializer.deserialize(
+                        dataTypeNode.get(FIELD_NAME_TYPE), serdeContext);
+        return deserializeClass(logicalType, dataTypeNode, serdeContext);
+    }
+
+    private static DataType deserializeClass(
+            LogicalType logicalType, @Nullable JsonNode classNode, SerdeContext serdeContext) {
+        if (classNode == null) {
+            return DataTypes.of(logicalType).toInternal();
+        }
+
+        final DataType dataType;
+        switch (logicalType.getTypeRoot()) {
+            case ARRAY:
+            case MULTISET:
+                final DataType elementDataType =
+                        deserializeClass(
+                                logicalType.getChildren().get(0),
+                                classNode.get(FIELD_NAME_ELEMENT_CLASS),
+                                serdeContext);
+                dataType = new CollectionDataType(logicalType, elementDataType);
+                break;
+
+            case MAP:
+                final MapType mapType = (MapType) logicalType;
+                final DataType keyDataType =
+                        deserializeClass(
+                                mapType.getKeyType(),
+                                classNode.get(FIELD_NAME_KEY_CLASS),
+                                serdeContext);
+                final DataType valueDataType =
+                        deserializeClass(
+                                mapType.getValueType(),
+                                classNode.get(FIELD_NAME_VALUE_CLASS),
+                                serdeContext);
+                dataType = new KeyValueDataType(mapType, keyDataType, valueDataType);
+                break;
+
+            case ROW:
+            case STRUCTURED_TYPE:
+                final List<String> fieldNames = LogicalTypeChecks.getFieldNames(logicalType);
+                final List<LogicalType> fieldTypes = LogicalTypeChecks.getFieldTypes(logicalType);
+
+                final ArrayNode fieldNodes = (ArrayNode) classNode.get(FIELD_NAME_FIELDS);
+                final Map<String, JsonNode> fieldNodesByName = new HashMap<>();
+                if (fieldNodes != null) {
+                    fieldNodes.forEach(
+                            fieldNode ->
+                                    fieldNodesByName.put(
+                                            fieldNode.get(FIELD_NAME_FIELD_NAME).asText(),
+                                            fieldNode));
+                }
+
+                final List<DataType> fieldDataTypes =
+                        IntStream.range(0, fieldNames.size())
+                                .mapToObj(
+                                        i -> {
+                                            final String fieldName = fieldNames.get(i);
+                                            final LogicalType fieldType = fieldTypes.get(i);
+                                            return deserializeClass(
+                                                    fieldType,
+                                                    fieldNodesByName.get(fieldName),
+                                                    serdeContext);
+                                        })
+                                .collect(Collectors.toList());
+
+                dataType = new FieldsDataType(logicalType, fieldDataTypes);
+                break;
+
+            case DISTINCT_TYPE:
+                final DistinctType distinctType = (DistinctType) logicalType;
+                dataType = deserializeClass(distinctType.getSourceType(), classNode, serdeContext);
+                break;
+
+            default:
+                dataType = DataTypes.of(logicalType).toInternal();
+        }
+
+        final Class<?> conversionClass =
+                loadClass(
+                        classNode.get(FIELD_NAME_CONVERSION_CLASS).asText(),
+                        serdeContext,
+                        String.format("conversion class of data type '%s'", dataType));
+        return dataType.bridgedTo(conversionClass);
+    }
+
+    private static Class<?> loadClass(
+            String className, SerdeContext serdeContext, String explanation) {
+        try {
+            return ExtractionUtils.classForName(className, true, serdeContext.getClassLoader());
+        } catch (ClassNotFoundException e) {
+            throw new TableException(
+                    String.format("Could not load class '%s' for %s.", className, explanation));

Review comment:
       Can you propagate `e` in the exception cause?

##########
File path: flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/serde/LogicalTypeJsonSerdeTest.java
##########
@@ -249,28 +331,44 @@ public void testLogicalTypeSerde() throws IOException {
                                         ObjectIdentifier.of("cat", "db", "distinctType"),
                                         new VarCharType(false, 5))
                                 .build(),
+                        // custom RawType
+                        new RawType<>(Integer.class, IntSerializer.INSTANCE),

Review comment:
       Can you add a test case here for a "real" raw type? For example take `RAW(LocalDateTime.class, LocalDateTimeSerializer.INSTANCE)` (I use it in `CastRulesTest`)

##########
File path: flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/serde/LogicalTypeJsonSerdeTest.java
##########
@@ -60,68 +58,142 @@
 import org.apache.flink.table.types.logical.TimestampKind;
 import org.apache.flink.table.types.logical.TimestampType;
 import org.apache.flink.table.types.logical.TinyIntType;
-import org.apache.flink.table.types.logical.TypeInformationRawType;
 import org.apache.flink.table.types.logical.VarBinaryType;
 import org.apache.flink.table.types.logical.VarCharType;
 import org.apache.flink.table.types.logical.YearMonthIntervalType;
 import org.apache.flink.table.types.logical.ZonedTimestampType;
-import org.apache.flink.table.types.utils.DataTypeUtils;
+import org.apache.flink.table.types.utils.DataTypeFactoryMock;
+import org.apache.flink.table.utils.CatalogManagerMocks;
+import org.apache.flink.types.Row;
 
-import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonGenerator;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonProcessingException;
 import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectReader;
 import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectWriter;
 
 import org.junit.Test;
-import org.junit.runner.RunWith;
-import org.junit.runners.Parameterized;
+import org.junit.jupiter.params.ParameterizedTest;
+import org.junit.jupiter.params.provider.MethodSource;
+import org.junit.runners.Parameterized.Parameters;
 
 import java.io.IOException;
-import java.io.StringWriter;
 import java.util.ArrayList;
 import java.util.Arrays;
 import java.util.Collections;
 import java.util.List;
+import java.util.Optional;
 
-import static org.apache.flink.table.types.utils.LogicalTypeDataTypeConverter.toDataType;
-import static org.apache.flink.table.types.utils.LogicalTypeDataTypeConverter.toLogicalType;
-import static org.junit.Assert.assertEquals;
+import static org.apache.flink.core.testutils.FlinkAssertions.anyCauseMatches;
+import static org.apache.flink.table.api.config.TableConfigOptions.CatalogPlanCompilation.ALL;
+import static org.apache.flink.table.api.config.TableConfigOptions.CatalogPlanCompilation.IDENTIFIER;
+import static org.apache.flink.table.planner.plan.nodes.exec.serde.DataTypeJsonSerdeTest.configuredSerdeContext;
+import static org.apache.flink.table.utils.CatalogManagerMocks.preparedCatalogManager;
+import static org.assertj.core.api.Assertions.assertThat;
+import static org.assertj.core.api.Assertions.assertThatThrownBy;
 
 /** Tests for {@link LogicalType} serialization and deserialization. */
-@RunWith(Parameterized.class)
-public class LogicalTypeSerdeTest {
+public class LogicalTypeJsonSerdeTest {
 
-    @Parameterized.Parameter public LogicalType logicalType;
+    @ParameterizedTest
+    @MethodSource("testLogicalTypeSerde")
+    public void testLogicalTypeSerde(LogicalType logicalType) throws IOException {
+        final SerdeContext serdeContext = configuredSerdeContext();
+
+        final String json = toJson(serdeContext, logicalType);
+        final LogicalType actual = toLogicalType(serdeContext, json);
+
+        assertThat(actual).isEqualTo(logicalType);
+    }
 
     @Test
-    public void testLogicalTypeSerde() throws IOException {
-        SerdeContext serdeCtx =
-                new SerdeContext(
-                        new FlinkContextImpl(
-                                false,
-                                TableConfig.getDefault(),
-                                new ModuleManager(),
-                                null,
-                                null,
-                                null),
-                        Thread.currentThread().getContextClassLoader(),
-                        FlinkTypeFactory.INSTANCE(),
-                        FlinkSqlOperatorTable.instance());
-        ObjectReader objectReader = JsonSerdeUtil.createObjectReader(serdeCtx);
-        ObjectWriter objectWriter = JsonSerdeUtil.createObjectWriter(serdeCtx);
-
-        StringWriter writer = new StringWriter(100);
-        try (JsonGenerator gen = objectWriter.getFactory().createGenerator(writer)) {
-            gen.writeObject(logicalType);
-        }
-        String json = writer.toString();
-        LogicalType actual = objectReader.readValue(json, LogicalType.class);
-        assertEquals(logicalType, actual);
-        assertEquals(logicalType.asSummaryString(), actual.asSummaryString());
+    public void testIdentifierSerde() {
+        final DataTypeFactoryMock dataTypeFactoryMock = new DataTypeFactoryMock();
+        final TableConfig tableConfig = TableConfig.getDefault();
+        final Configuration config = tableConfig.getConfiguration();
+        final CatalogManager catalogManager =
+                preparedCatalogManager().dataTypeFactory(dataTypeFactoryMock).build();
+        final SerdeContext serdeContext = configuredSerdeContext(catalogManager, tableConfig);
+
+        // minimal plan content
+        config.set(TableConfigOptions.PLAN_COMPILE_CATALOG_OBJECTS, IDENTIFIER);
+        final String minimalJson = toJson(serdeContext, STRUCTURED_TYPE);
+        assertThat(minimalJson).isEqualTo("\"`default_catalog`.`default_database`.`MyType`\"");
+
+        // catalog lookup with miss
+        config.set(
+                TableConfigOptions.PLAN_RESTORE_CATALOG_OBJECTS,
+                TableConfigOptions.CatalogPlanRestore.IDENTIFIER);
+        dataTypeFactoryMock.logicalType = Optional.empty();
+        assertThatThrownBy(() -> toLogicalType(serdeContext, minimalJson))
+                .satisfies(anyCauseMatches(ValidationException.class, "No type found."));
+
+        // catalog lookup
+        config.set(
+                TableConfigOptions.PLAN_RESTORE_CATALOG_OBJECTS,
+                TableConfigOptions.CatalogPlanRestore.IDENTIFIER);
+        dataTypeFactoryMock.logicalType = Optional.of(STRUCTURED_TYPE);
+        assertThat(toLogicalType(serdeContext, minimalJson)).isEqualTo(STRUCTURED_TYPE);
+
+        // maximum plan content
+        config.set(TableConfigOptions.PLAN_COMPILE_CATALOG_OBJECTS, ALL);
+        final String maximumJson = toJson(serdeContext, STRUCTURED_TYPE);
+        assertThat(maximumJson)
+                .isEqualTo(
+                        "{\"type\":\"STRUCTURED_TYPE\","
+                                + "\"objectIdentifier\":"
+                                + "{\"catalogName\":\"default_catalog\","
+                                + "\"databaseName\":\"default_database\","
+                                + "\"tableName\":\"MyType\"},"
+                                + "\"description\":\"My original type.\","
+                                + "\"attributes\":[]}");

Review comment:
       Let's avoid these tests, but instead execute equalities on `ObjectNode`




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18274:
URL: https://github.com/apache/flink/pull/18274#issuecomment-1005845834


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=28985",
       "triggerID" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29041",
       "triggerID" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * dd2c1d149708b916cb05bd2b0580015ae2e1f889 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=28985) 
   * 8b30dbb1acd4bffe0c4c5d25b669705deb19463e Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29041) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18274:
URL: https://github.com/apache/flink/pull/18274#issuecomment-1005845834


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=28985",
       "triggerID" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29041",
       "triggerID" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29092",
       "triggerID" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "triggerType" : "PUSH"
     }, {
       "hash" : "718ea97838ff10191544cc7460d3c18380b0e119",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29112",
       "triggerID" : "718ea97838ff10191544cc7460d3c18380b0e119",
       "triggerType" : "PUSH"
     }, {
       "hash" : "c445781fa99453f887f9d91978e5bb9ca9d1f91a",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "c445781fa99453f887f9d91978e5bb9ca9d1f91a",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 718ea97838ff10191544cc7460d3c18380b0e119 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29112) 
   * c445781fa99453f887f9d91978e5bb9ca9d1f91a UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] twalthr commented on a change in pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
twalthr commented on a change in pull request #18274:
URL: https://github.com/apache/flink/pull/18274#discussion_r780140261



##########
File path: flink-table/flink-table-common/src/main/java/org/apache/flink/table/types/utils/LogicalTypeDataTypeConverter.java
##########
@@ -201,9 +200,29 @@ public DataType visit(RowType rowType) {
 
         @Override
         public DataType visit(DistinctType distinctType) {
-            return new FieldsDataType(
-                    distinctType,
-                    Collections.singletonList(distinctType.getSourceType().accept(this)));
+            final DataType sourceDataType = distinctType.getSourceType().accept(this);
+            if (sourceDataType instanceof AtomicDataType) {
+                return new AtomicDataType(distinctType, sourceDataType.getConversionClass());
+            } else if (sourceDataType instanceof CollectionDataType) {
+                final CollectionDataType collectionDataType = (CollectionDataType) sourceDataType;
+                return new CollectionDataType(
+                        distinctType,
+                        collectionDataType.getConversionClass(),
+                        collectionDataType.getElementDataType());
+            } else if (sourceDataType instanceof KeyValueDataType) {
+                final KeyValueDataType keyValueDataType = (KeyValueDataType) sourceDataType;
+                return new KeyValueDataType(
+                        distinctType,
+                        keyValueDataType.getConversionClass(),
+                        keyValueDataType.getKeyDataType(),
+                        keyValueDataType.getValueDataType());
+            } else if (sourceDataType instanceof FieldsDataType) {
+                return new FieldsDataType(
+                        distinctType,
+                        sourceDataType.getConversionClass(),
+                        sourceDataType.getChildren());
+            }
+            throw new IllegalStateException("Unexpected data type instance.");

Review comment:
       The outer data type needs to preserve the original `distinctType` LogicalType. However, distinct types should behave similar as the source type when it comes to accessing elements and fields. Therefore, they also need to have a similar DataType wrapper class such as `FieldsDataType` or `KeyValueDataType`.

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/LogicalTypeJsonSerializer.java
##########
@@ -109,349 +123,358 @@ public void serialize(
             JsonGenerator jsonGenerator,
             SerializerProvider serializerProvider)
             throws IOException {
-        if (logicalType instanceof CharType) {
-            // Zero-length character strings have no serializable string representation.
-            serializeRowType((CharType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof VarCharType) {
-            // Zero-length character strings have no serializable string representation.
-            serializeVarCharType((VarCharType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof BinaryType) {
-            // Zero-length binary strings have no serializable string representation.
-            serializeBinaryType((BinaryType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof VarBinaryType) {
-            // Zero-length binary strings have no serializable string representation.
-            serializeVarBinaryType((VarBinaryType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof SymbolType) {
-            // SymbolType does not support `asSerializableString`
-            serializeSymbolType((SymbolType<?>) logicalType, jsonGenerator);
-        } else if (logicalType instanceof TypeInformationRawType) {
-            // TypeInformationRawType does not support `asSerializableString`
-            serializeTypeInformationRawType((TypeInformationRawType<?>) logicalType, jsonGenerator);
-        } else if (logicalType instanceof StructuredType) {
-            //  StructuredType does not full support `asSerializableString`
-            serializeStructuredType((StructuredType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof DistinctType) {
-            //  DistinctType does not full support `asSerializableString`
-            serializeDistinctType((DistinctType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof TimestampType) {
-            // TimestampType does not consider `TimestampKind`
-            serializeTimestampType((TimestampType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof ZonedTimestampType) {
-            // ZonedTimestampType does not consider `TimestampKind`
-            serializeZonedTimestampType((ZonedTimestampType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof LocalZonedTimestampType) {
-            // LocalZonedTimestampType does not consider `TimestampKind`
-            serializeLocalZonedTimestampType((LocalZonedTimestampType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof RowType) {
-            serializeRowType((RowType) logicalType, jsonGenerator, serializerProvider);
-        } else if (logicalType instanceof MapType) {
-            serializeMapType((MapType) logicalType, jsonGenerator, serializerProvider);
-        } else if (logicalType instanceof ArrayType) {
-            serializeArrayType((ArrayType) logicalType, jsonGenerator, serializerProvider);
-        } else if (logicalType instanceof MultisetType) {
-            serializeMultisetType((MultisetType) logicalType, jsonGenerator, serializerProvider);
-        } else if (logicalType instanceof RawType) {
-            serializeRawType((RawType<?>) logicalType, jsonGenerator, serializerProvider);
-        } else if (logicalType instanceof UnresolvedUserDefinedType) {
-            throw new TableException(
-                    "Can not serialize an UnresolvedUserDefinedType instance. \n"
-                            + "It needs to be resolved into a proper user-defined type.\"");
-        } else {
-            jsonGenerator.writeObject(logicalType.asSerializableString());
-        }
+        final ReadableConfig config = SerdeContext.from(serializerProvider).getConfiguration();
+        final boolean serializeCatalogObjects =
+                !config.get(TableConfigOptions.PLAN_COMPILE_CATALOG_OBJECTS)
+                        .equals(CatalogPlanCompilation.IDENTIFIER);
+        serializeInternal(logicalType, jsonGenerator, serializeCatalogObjects);
     }
 
-    private void serializeRowType(
-            RowType rowType, JsonGenerator jsonGenerator, SerializerProvider serializerProvider)
+    private static void serializeInternal(
+            LogicalType logicalType, JsonGenerator jsonGenerator, boolean serializeCatalogObjects)
             throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, rowType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, rowType.isNullable());
-        List<RowType.RowField> fields = rowType.getFields();
-        jsonGenerator.writeArrayFieldStart(FIELD_NAME_FIELDS);
-        for (RowType.RowField rowField : fields) {
-            jsonGenerator.writeStartObject();
-            jsonGenerator.writeFieldName(rowField.getName());
-            serialize(rowField.getType(), jsonGenerator, serializerProvider);
-            if (rowField.getDescription().isPresent()) {
-                jsonGenerator.writeStringField(
-                        FIELD_NAME_DESCRIPTION, rowField.getDescription().get());
-            }
-            jsonGenerator.writeEndObject();
+        if (supportsCompactSerialization(logicalType, serializeCatalogObjects)) {
+            serializeTypeWithCompactSerialization(logicalType, jsonGenerator);
+        } else {
+            // fallback to generic serialization that might still use compact serialization for
+            // individual fields
+            serializeTypeWithGenericSerialization(
+                    logicalType, jsonGenerator, serializeCatalogObjects);
         }
-        jsonGenerator.writeEndArray();
-        jsonGenerator.writeEndObject();
     }
 
-    private void serializeMapType(
-            MapType mapType, JsonGenerator jsonGenerator, SerializerProvider serializerProvider)
+    // --------------------------------------------------------------------------------------------
+    // Generic Serialization
+    // --------------------------------------------------------------------------------------------
+
+    private static void serializeTypeWithGenericSerialization(
+            LogicalType logicalType, JsonGenerator jsonGenerator, boolean serializeCatalogObjects)
             throws IOException {
         jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, mapType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, mapType.isNullable());
-        jsonGenerator.writeFieldName(FIELD_NAME_KEY_TYPE);
-        serialize(mapType.getKeyType(), jsonGenerator, serializerProvider);
-        jsonGenerator.writeFieldName(FIELD_NAME_VALUE_TYPE);
-        serialize(mapType.getValueType(), jsonGenerator, serializerProvider);
+
+        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, logicalType.getTypeRoot().name());
+        if (!logicalType.isNullable()) {
+            jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, false);
+        }
+
+        switch (logicalType.getTypeRoot()) {
+            case CHAR:
+            case VARCHAR:
+            case BINARY:
+            case VARBINARY:
+                serializeZeroLengthString(jsonGenerator);
+                break;
+            case TIMESTAMP_WITHOUT_TIME_ZONE:
+                final TimestampType timestampType = (TimestampType) logicalType;
+                serializeTimestamp(
+                        timestampType.getPrecision(), timestampType.getKind(), jsonGenerator);
+                break;
+            case TIMESTAMP_WITH_TIME_ZONE:
+                final ZonedTimestampType zonedTimestampType = (ZonedTimestampType) logicalType;
+                serializeTimestamp(
+                        zonedTimestampType.getPrecision(),
+                        zonedTimestampType.getKind(),
+                        jsonGenerator);
+                break;
+            case TIMESTAMP_WITH_LOCAL_TIME_ZONE:
+                final LocalZonedTimestampType localZonedTimestampType =
+                        (LocalZonedTimestampType) logicalType;
+                serializeTimestamp(
+                        localZonedTimestampType.getPrecision(),
+                        localZonedTimestampType.getKind(),
+                        jsonGenerator);
+                break;
+            case ARRAY:
+                serializeCollection(
+                        ((ArrayType) logicalType).getElementType(),
+                        jsonGenerator,
+                        serializeCatalogObjects);
+                break;
+            case MULTISET:
+                serializeCollection(
+                        ((MultisetType) logicalType).getElementType(),
+                        jsonGenerator,
+                        serializeCatalogObjects);
+                break;
+            case MAP:
+                serializeMap((MapType) logicalType, jsonGenerator, serializeCatalogObjects);
+                break;
+            case ROW:
+                serializeRow((RowType) logicalType, jsonGenerator, serializeCatalogObjects);
+                break;
+            case DISTINCT_TYPE:
+                serializeDistinctType(
+                        (DistinctType) logicalType, jsonGenerator, serializeCatalogObjects);
+                break;
+            case STRUCTURED_TYPE:
+                serializeStructuredType(
+                        (StructuredType) logicalType, jsonGenerator, serializeCatalogObjects);
+                break;
+            case SYMBOL:
+                // type root is enough
+                break;
+            case RAW:
+                if (logicalType instanceof RawType) {
+                    serializeSpecializedRaw((RawType<?>) logicalType, jsonGenerator);
+                    break;
+                }
+                // fall through
+            default:
+                throw new ValidationException(
+                        String.format(
+                                "Unable to serialize logical type '%s'. Please check the documentation for supported types.",
+                                logicalType.asSummaryString()));
+        }
+
         jsonGenerator.writeEndObject();
     }
 
-    private void serializeArrayType(
-            ArrayType arrayType, JsonGenerator jsonGenerator, SerializerProvider serializerProvider)
-            throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, arrayType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, arrayType.isNullable());
-        jsonGenerator.writeFieldName(FIELD_NAME_ELEMENT_TYPE);
-        serialize(arrayType.getElementType(), jsonGenerator, serializerProvider);
-        jsonGenerator.writeEndObject();
+    private static void serializeZeroLengthString(JsonGenerator jsonGenerator) throws IOException {
+        jsonGenerator.writeNumberField(FIELD_NAME_LENGTH, 0);
     }
 
-    private void serializeMultisetType(
-            MultisetType multisetType,
-            JsonGenerator jsonGenerator,
-            SerializerProvider serializerProvider)
-            throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, multisetType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, multisetType.isNullable());
-        jsonGenerator.writeFieldName(FIELD_NAME_ELEMENT_TYPE);
-        serialize(multisetType.getElementType(), jsonGenerator, serializerProvider);
-        jsonGenerator.writeEndObject();
+    private static void serializeTimestamp(
+            int precision, TimestampKind kind, JsonGenerator jsonGenerator) throws IOException {
+        jsonGenerator.writeNumberField(FIELD_NAME_PRECISION, precision);
+        jsonGenerator.writeObjectField(FIELD_NAME_TIMESTAMP_KIND, kind);
     }
 
-    private void serializeRowType(CharType charType, JsonGenerator jsonGenerator)
+    private static void serializeCollection(
+            LogicalType elementType, JsonGenerator jsonGenerator, boolean serializeCatalogObjects)
             throws IOException {
-        // Zero-length character strings have no serializable string representation.
-        if (charType.getLength() == CharType.EMPTY_LITERAL_LENGTH) {
-            jsonGenerator.writeStartObject();
-            jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, charType.getTypeRoot().name());
-            jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, charType.isNullable());
-            jsonGenerator.writeNumberField(FIELD_NAME_LENGTH, 0);
-            jsonGenerator.writeEndObject();
-        } else {
-            jsonGenerator.writeObject(charType.asSerializableString());
-        }
+        jsonGenerator.writeFieldName(FIELD_NAME_ELEMENT_TYPE);
+        serializeInternal(elementType, jsonGenerator, serializeCatalogObjects);
     }
 
-    private void serializeVarCharType(VarCharType varCharType, JsonGenerator jsonGenerator)
+    private static void serializeMap(
+            MapType mapType, JsonGenerator jsonGenerator, boolean serializeCatalogObjects)
             throws IOException {
-        // Zero-length character strings have no serializable string representation.
-        if (varCharType.getLength() == VarCharType.EMPTY_LITERAL_LENGTH) {
-            jsonGenerator.writeStartObject();
-            jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, varCharType.getTypeRoot().name());
-            jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, varCharType.isNullable());
-            jsonGenerator.writeNumberField(FIELD_NAME_LENGTH, 0);
-            jsonGenerator.writeEndObject();
-        } else {
-            jsonGenerator.writeObject(varCharType.asSerializableString());
-        }
+        jsonGenerator.writeFieldName(FIELD_NAME_KEY_TYPE);
+        serializeInternal(mapType.getKeyType(), jsonGenerator, serializeCatalogObjects);
+        jsonGenerator.writeFieldName(FIELD_NAME_VALUE_TYPE);
+        serializeInternal(mapType.getValueType(), jsonGenerator, serializeCatalogObjects);
     }
 
-    private void serializeBinaryType(BinaryType binaryType, JsonGenerator jsonGenerator)
+    private static void serializeRow(
+            RowType rowType, JsonGenerator jsonGenerator, boolean serializeCatalogObjects)
             throws IOException {
-        // Zero-length binary strings have no serializable string representation.
-        if (binaryType.getLength() == BinaryType.EMPTY_LITERAL_LENGTH) {
+        jsonGenerator.writeArrayFieldStart(FIELD_NAME_FIELDS);
+        for (RowType.RowField rowField : rowType.getFields()) {
             jsonGenerator.writeStartObject();
-            jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, binaryType.getTypeRoot().name());
-            jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, binaryType.isNullable());
-            jsonGenerator.writeNumberField(FIELD_NAME_LENGTH, 0);
+            jsonGenerator.writeStringField(FIELD_NAME_FIELD_NAME, rowField.getName());
+            jsonGenerator.writeFieldName(FIELD_NAME_FIELD_TYPE);
+            serializeInternal(rowField.getType(), jsonGenerator, serializeCatalogObjects);
+            if (rowField.getDescription().isPresent()) {
+                jsonGenerator.writeStringField(
+                        FIELD_NAME_FIELD_DESCRIPTION, rowField.getDescription().get());
+            }
             jsonGenerator.writeEndObject();
-        } else {
-            jsonGenerator.writeObject(binaryType.asSerializableString());
         }
+        jsonGenerator.writeEndArray();
     }
 
-    private void serializeVarBinaryType(VarBinaryType varBinaryType, JsonGenerator jsonGenerator)
+    private static void serializeDistinctType(
+            DistinctType distinctType, JsonGenerator jsonGenerator, boolean serializeCatalogObjects)
             throws IOException {
-        // Zero-length binary strings have no serializable string representation.
-        if (varBinaryType.getLength() == VarBinaryType.EMPTY_LITERAL_LENGTH) {
-            jsonGenerator.writeStartObject();
+        jsonGenerator.writeObjectField(
+                FIELD_NAME_OBJECT_IDENTIFIER,
+                distinctType.getObjectIdentifier().orElseThrow(IllegalStateException::new));
+        if (distinctType.getDescription().isPresent()) {
             jsonGenerator.writeStringField(
-                    FIELD_NAME_TYPE_NAME, varBinaryType.getTypeRoot().name());
-            jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, varBinaryType.isNullable());
-            jsonGenerator.writeNumberField(FIELD_NAME_LENGTH, 0);
-            jsonGenerator.writeEndObject();
-        } else {
-            jsonGenerator.writeObject(varBinaryType.asSerializableString());
+                    FIELD_NAME_FIELD_DESCRIPTION, distinctType.getDescription().get());
         }
+        jsonGenerator.writeFieldName(FIELD_NAME_SOURCE_TYPE);
+        serializeInternal(distinctType.getSourceType(), jsonGenerator, serializeCatalogObjects);
     }
 
-    private void serializeSymbolType(SymbolType<?> symbolType, JsonGenerator jsonGenerator)
-            throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, symbolType.isNullable());
-        jsonGenerator.writeStringField(
-                FIELD_NAME_SYMBOL_CLASS, symbolType.getDefaultConversion().getName());
-        jsonGenerator.writeEndObject();
-    }
-
-    private void serializeTypeInformationRawType(
-            TypeInformationRawType<?> rawType, JsonGenerator jsonGenerator) throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, rawType.isNullable());
-        jsonGenerator.writeStringField(
-                FIELD_NAME_TYPE_INFO,
-                EncodingUtils.encodeObjectToString(rawType.getTypeInformation()));
-        jsonGenerator.writeEndObject();
-    }
-
-    private void serializeStructuredType(StructuredType structuredType, JsonGenerator jsonGenerator)
+    private static void serializeStructuredType(
+            StructuredType structuredType,
+            JsonGenerator jsonGenerator,
+            boolean serializeCatalogObjects)
             throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(
-                FIELD_NAME_TYPE_NAME, LogicalTypeRoot.STRUCTURED_TYPE.name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, structuredType.isNullable());
         if (structuredType.getObjectIdentifier().isPresent()) {
             jsonGenerator.writeObjectField(
-                    FIELD_NAME_IDENTIFIER, structuredType.getObjectIdentifier().get());
+                    FIELD_NAME_OBJECT_IDENTIFIER, structuredType.getObjectIdentifier().get());
         }
-        if (structuredType.getImplementationClass().isPresent()) {
+        if (structuredType.getDescription().isPresent()) {
             jsonGenerator.writeStringField(
-                    FIELD_NAME_IMPLEMENTATION_CLASS,
-                    structuredType.getImplementationClass().get().getName());
+                    FIELD_NAME_DESCRIPTION, structuredType.getDescription().get());
+        }
+        if (structuredType.getImplementationClass().isPresent()) {
+            jsonGenerator.writeObjectField(
+                    FIELD_NAME_IMPLEMENTATION_CLASS, structuredType.getImplementationClass().get());
         }
         jsonGenerator.writeFieldName(FIELD_NAME_ATTRIBUTES);
         jsonGenerator.writeStartArray();
-        for (StructuredType.StructuredAttribute attribute : structuredType.getAttributes()) {
+        for (StructuredAttribute attribute : structuredType.getAttributes()) {
             jsonGenerator.writeStartObject();
-            jsonGenerator.writeStringField(FIELD_NAME_NAME, attribute.getName());
-            jsonGenerator.writeObjectField(FIELD_NAME_LOGICAL_TYPE, attribute.getType());
+            jsonGenerator.writeStringField(FIELD_NAME_ATTRIBUTE_NAME, attribute.getName());
+            jsonGenerator.writeFieldName(FIELD_NAME_ATTRIBUTE_TYPE);
+            serializeInternal(attribute.getType(), jsonGenerator, serializeCatalogObjects);
             if (attribute.getDescription().isPresent()) {
                 jsonGenerator.writeStringField(
-                        FIELD_NAME_DESCRIPTION, attribute.getDescription().get());
+                        FIELD_NAME_ATTRIBUTE_DESCRIPTION, attribute.getDescription().get());
             }
             jsonGenerator.writeEndObject();
         }
         jsonGenerator.writeEndArray();
-        jsonGenerator.writeBooleanField(FIELD_NAME_FINAL, structuredType.isFinal());
-        jsonGenerator.writeBooleanField(FIELD_NAME_INSTANTIABLE, structuredType.isInstantiable());
-        jsonGenerator.writeStringField(
-                FIELD_NAME_COMPARISON, structuredType.getComparison().name());
-        if (structuredType.getSuperType().isPresent()) {
-            jsonGenerator.writeObjectField(
-                    FIELD_NAME_SUPPER_TYPE, structuredType.getSuperType().get());
+        if (!structuredType.isFinal()) {
+            jsonGenerator.writeBooleanField(FIELD_NAME_FINAL, false);
         }
-        if (structuredType.getDescription().isPresent()) {
+        if (!structuredType.isInstantiable()) {
+            jsonGenerator.writeBooleanField(FIELD_NAME_INSTANTIABLE, false);
+        }
+        if (structuredType.getComparison() != StructuredComparison.NONE) {
             jsonGenerator.writeStringField(
-                    FIELD_NAME_DESCRIPTION, structuredType.getDescription().get());
+                    FIELD_NAME_COMPARISON, structuredType.getComparison().name());
+        }
+        if (structuredType.getSuperType().isPresent()) {
+            jsonGenerator.writeObjectField(
+                    FIELD_NAME_SUPER_TYPE, structuredType.getSuperType().get());
         }
-        jsonGenerator.writeEndObject();
     }
 
-    private void serializeDistinctType(DistinctType distinctType, JsonGenerator jsonGenerator)
+    private static void serializeSpecializedRaw(RawType<?> rawType, JsonGenerator jsonGenerator)
             throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, LogicalTypeRoot.DISTINCT_TYPE.name());
-        Preconditions.checkArgument(distinctType.getObjectIdentifier().isPresent());
-        jsonGenerator.writeObjectField(
-                FIELD_NAME_IDENTIFIER, distinctType.getObjectIdentifier().get());
-        jsonGenerator.writeObjectField(FIELD_NAME_SOURCE_TYPE, distinctType.getSourceType());
-        if (distinctType.getDescription().isPresent()) {
+        jsonGenerator.writeStringField(FIELD_NAME_CLASS, rawType.getOriginatingClass().getName());
+        final TypeSerializer<?> serializer = rawType.getTypeSerializer();
+        if (serializer.equals(NullSerializer.INSTANCE)) {
             jsonGenerator.writeStringField(
-                    FIELD_NAME_DESCRIPTION, distinctType.getDescription().get());
+                    FIELD_NAME_SPECIAL_SERIALIZER, FIELD_VALUE_EXTERNAL_SERIALIZER_NULL);
+        } else if (serializer instanceof ExternalSerializer) {
+            final ExternalSerializer<?, ?> externalSerializer =
+                    (ExternalSerializer<?, ?>) rawType.getTypeSerializer();
+            if (externalSerializer.isInternalInput()) {
+                throw new TableException(
+                        "Asymmetric external serializers are currently not supported. "
+                                + "The input must not be internal if the output is external.");
+            }
+            jsonGenerator.writeObjectField(
+                    FIELD_NAME_EXTERNAL_DATA_TYPE, externalSerializer.getDataType());
+        } else {
+            throw new TableException("Unsupported special case for RAW type.");
         }
-        jsonGenerator.writeEndObject();
     }
 
-    private void serializeTimestampType(TimestampType timestampType, JsonGenerator jsonGenerator)
-            throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, timestampType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, timestampType.isNullable());
-        jsonGenerator.writeNumberField(FIELD_NAME_PRECISION, timestampType.getPrecision());
-        jsonGenerator.writeObjectField(FIELD_NAME_TIMESTAMP_KIND, timestampType.getKind());
-        jsonGenerator.writeEndObject();
-    }
+    // --------------------------------------------------------------------------------------------
+    // Compact Serialization
+    // --------------------------------------------------------------------------------------------
 
-    private void serializeZonedTimestampType(
-            ZonedTimestampType timestampType, JsonGenerator jsonGenerator) throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, timestampType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, timestampType.isNullable());
-        jsonGenerator.writeNumberField(FIELD_NAME_PRECISION, timestampType.getPrecision());
-        jsonGenerator.writeObjectField(FIELD_NAME_TIMESTAMP_KIND, timestampType.getKind());
-        jsonGenerator.writeEndObject();
+    private static boolean supportsCompactSerialization(
+            LogicalType logicalType, boolean serializeCatalogObjects) {
+        return logicalType.accept(new CompactSerializationChecker(serializeCatalogObjects));
     }
 
-    private void serializeLocalZonedTimestampType(
-            LocalZonedTimestampType timestampType, JsonGenerator jsonGenerator) throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, timestampType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, timestampType.isNullable());
-        jsonGenerator.writeNumberField(FIELD_NAME_PRECISION, timestampType.getPrecision());
-        jsonGenerator.writeObjectField(FIELD_NAME_TIMESTAMP_KIND, timestampType.getKind());
-        jsonGenerator.writeEndObject();
+    private static void serializeTypeWithCompactSerialization(
+            LogicalType logicalType, JsonGenerator jsonGenerator) throws IOException {
+        final String compactString = logicalType.asSerializableString();
+        jsonGenerator.writeString(compactString);
     }
 
-    @SuppressWarnings("rawtypes")
-    private void serializeRawType(
-            RawType<?> rawType, JsonGenerator jsonGenerator, SerializerProvider serializerProvider)
-            throws IOException {
-        TypeSerializer<?> typeSer = rawType.getTypeSerializer();
-        if (typeSer instanceof ExternalSerializer) {
-            ExternalSerializer externalSer = (ExternalSerializer) typeSer;
-            // Currently, ExternalSerializer with `isInternalInput=false` will be serialized,
-            // Once `isInternalInput=true` needs to be serialized, we can add individual field in
-            // the json to support it, and the new json plan is compatible with the previous one.
-            if (externalSer.isInternalInput()) {
-                throw new TableException(
-                        "ExternalSerializer with `isInternalInput=true` is not supported.");
-            }
-            DataType dataType = externalSer.getDataType();
-            boolean isMapView = DataViewUtils.isMapViewDataType(dataType);
-            boolean isListView = DataViewUtils.isListViewDataType(dataType);
-            if (isMapView || isListView) {
-                jsonGenerator.writeStartObject();
-                jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, LogicalTypeRoot.RAW.name());
-                jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, rawType.isNullable());
-                if (isMapView) {
-                    jsonGenerator.writeStringField(
-                            FIELD_NAME_DATA_VIEW_CLASS, MapView.class.getName());
-                    KeyValueDataType keyValueDataType =
-                            DataViewUtils.extractKeyValueDataTypeForMapView(dataType);
-                    serializeDataTypeForDataView(
-                            FIELD_NAME_KEY_TYPE,
-                            keyValueDataType.getKeyDataType(),
-                            jsonGenerator,
-                            serializerProvider);
-                    serializeDataTypeForDataView(
-                            FIELD_NAME_VALUE_TYPE,
-                            keyValueDataType.getValueDataType(),
-                            jsonGenerator,
-                            serializerProvider);
-                } else {
-                    jsonGenerator.writeStringField(
-                            FIELD_NAME_DATA_VIEW_CLASS, ListView.class.getName());
-                    DataType elementType =
-                            DataViewUtils.extractElementDataTypeForListView(dataType);
-                    serializeDataTypeForDataView(
-                            FIELD_NAME_ELEMENT_TYPE,
-                            elementType,
-                            jsonGenerator,
-                            serializerProvider);
-                }
-                jsonGenerator.writeEndObject();
-                return;
-            }
+    /**
+     * Checks whether the given type can be serialized as a compact string created from {@link
+     * LogicalType#asSerializableString()}.
+     */
+    private static class CompactSerializationChecker extends LogicalTypeDefaultVisitor<Boolean> {
+
+        private final boolean serializeCatalogObjects;
+
+        CompactSerializationChecker(boolean serializeCatalogObjects) {
+            this.serializeCatalogObjects = serializeCatalogObjects;
         }
 
-        jsonGenerator.writeObject(rawType.asSerializableString());
-    }
+        @Override
+        public Boolean visit(CharType charType) {
+            return charType.getLength() > 0;

Review comment:
       Unfortunately not. It is invalid in declarations but valid within a plan if you have something like `SELECT ""`.

##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/LogicalTypeJsonSerializer.java
##########
@@ -109,349 +123,358 @@ public void serialize(
             JsonGenerator jsonGenerator,
             SerializerProvider serializerProvider)
             throws IOException {
-        if (logicalType instanceof CharType) {
-            // Zero-length character strings have no serializable string representation.
-            serializeRowType((CharType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof VarCharType) {
-            // Zero-length character strings have no serializable string representation.
-            serializeVarCharType((VarCharType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof BinaryType) {
-            // Zero-length binary strings have no serializable string representation.
-            serializeBinaryType((BinaryType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof VarBinaryType) {
-            // Zero-length binary strings have no serializable string representation.
-            serializeVarBinaryType((VarBinaryType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof SymbolType) {
-            // SymbolType does not support `asSerializableString`
-            serializeSymbolType((SymbolType<?>) logicalType, jsonGenerator);
-        } else if (logicalType instanceof TypeInformationRawType) {
-            // TypeInformationRawType does not support `asSerializableString`
-            serializeTypeInformationRawType((TypeInformationRawType<?>) logicalType, jsonGenerator);
-        } else if (logicalType instanceof StructuredType) {
-            //  StructuredType does not full support `asSerializableString`
-            serializeStructuredType((StructuredType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof DistinctType) {
-            //  DistinctType does not full support `asSerializableString`
-            serializeDistinctType((DistinctType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof TimestampType) {
-            // TimestampType does not consider `TimestampKind`
-            serializeTimestampType((TimestampType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof ZonedTimestampType) {
-            // ZonedTimestampType does not consider `TimestampKind`
-            serializeZonedTimestampType((ZonedTimestampType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof LocalZonedTimestampType) {
-            // LocalZonedTimestampType does not consider `TimestampKind`
-            serializeLocalZonedTimestampType((LocalZonedTimestampType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof RowType) {
-            serializeRowType((RowType) logicalType, jsonGenerator, serializerProvider);
-        } else if (logicalType instanceof MapType) {
-            serializeMapType((MapType) logicalType, jsonGenerator, serializerProvider);
-        } else if (logicalType instanceof ArrayType) {
-            serializeArrayType((ArrayType) logicalType, jsonGenerator, serializerProvider);
-        } else if (logicalType instanceof MultisetType) {
-            serializeMultisetType((MultisetType) logicalType, jsonGenerator, serializerProvider);
-        } else if (logicalType instanceof RawType) {
-            serializeRawType((RawType<?>) logicalType, jsonGenerator, serializerProvider);
-        } else if (logicalType instanceof UnresolvedUserDefinedType) {
-            throw new TableException(
-                    "Can not serialize an UnresolvedUserDefinedType instance. \n"
-                            + "It needs to be resolved into a proper user-defined type.\"");
-        } else {
-            jsonGenerator.writeObject(logicalType.asSerializableString());
-        }
+        final ReadableConfig config = SerdeContext.from(serializerProvider).getConfiguration();
+        final boolean serializeCatalogObjects =
+                !config.get(TableConfigOptions.PLAN_COMPILE_CATALOG_OBJECTS)
+                        .equals(CatalogPlanCompilation.IDENTIFIER);
+        serializeInternal(logicalType, jsonGenerator, serializeCatalogObjects);
     }
 
-    private void serializeRowType(
-            RowType rowType, JsonGenerator jsonGenerator, SerializerProvider serializerProvider)
+    private static void serializeInternal(
+            LogicalType logicalType, JsonGenerator jsonGenerator, boolean serializeCatalogObjects)
             throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, rowType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, rowType.isNullable());
-        List<RowType.RowField> fields = rowType.getFields();
-        jsonGenerator.writeArrayFieldStart(FIELD_NAME_FIELDS);
-        for (RowType.RowField rowField : fields) {
-            jsonGenerator.writeStartObject();
-            jsonGenerator.writeFieldName(rowField.getName());
-            serialize(rowField.getType(), jsonGenerator, serializerProvider);
-            if (rowField.getDescription().isPresent()) {
-                jsonGenerator.writeStringField(
-                        FIELD_NAME_DESCRIPTION, rowField.getDescription().get());
-            }
-            jsonGenerator.writeEndObject();
+        if (supportsCompactSerialization(logicalType, serializeCatalogObjects)) {
+            serializeTypeWithCompactSerialization(logicalType, jsonGenerator);
+        } else {
+            // fallback to generic serialization that might still use compact serialization for
+            // individual fields
+            serializeTypeWithGenericSerialization(
+                    logicalType, jsonGenerator, serializeCatalogObjects);
         }
-        jsonGenerator.writeEndArray();
-        jsonGenerator.writeEndObject();
     }
 
-    private void serializeMapType(
-            MapType mapType, JsonGenerator jsonGenerator, SerializerProvider serializerProvider)
+    // --------------------------------------------------------------------------------------------
+    // Generic Serialization
+    // --------------------------------------------------------------------------------------------
+
+    private static void serializeTypeWithGenericSerialization(
+            LogicalType logicalType, JsonGenerator jsonGenerator, boolean serializeCatalogObjects)
             throws IOException {
         jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, mapType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, mapType.isNullable());
-        jsonGenerator.writeFieldName(FIELD_NAME_KEY_TYPE);
-        serialize(mapType.getKeyType(), jsonGenerator, serializerProvider);
-        jsonGenerator.writeFieldName(FIELD_NAME_VALUE_TYPE);
-        serialize(mapType.getValueType(), jsonGenerator, serializerProvider);
+
+        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, logicalType.getTypeRoot().name());
+        if (!logicalType.isNullable()) {
+            jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, false);
+        }
+
+        switch (logicalType.getTypeRoot()) {
+            case CHAR:
+            case VARCHAR:
+            case BINARY:
+            case VARBINARY:
+                serializeZeroLengthString(jsonGenerator);
+                break;
+            case TIMESTAMP_WITHOUT_TIME_ZONE:
+                final TimestampType timestampType = (TimestampType) logicalType;
+                serializeTimestamp(
+                        timestampType.getPrecision(), timestampType.getKind(), jsonGenerator);
+                break;
+            case TIMESTAMP_WITH_TIME_ZONE:
+                final ZonedTimestampType zonedTimestampType = (ZonedTimestampType) logicalType;
+                serializeTimestamp(
+                        zonedTimestampType.getPrecision(),
+                        zonedTimestampType.getKind(),
+                        jsonGenerator);
+                break;
+            case TIMESTAMP_WITH_LOCAL_TIME_ZONE:
+                final LocalZonedTimestampType localZonedTimestampType =
+                        (LocalZonedTimestampType) logicalType;
+                serializeTimestamp(
+                        localZonedTimestampType.getPrecision(),
+                        localZonedTimestampType.getKind(),
+                        jsonGenerator);
+                break;
+            case ARRAY:
+                serializeCollection(
+                        ((ArrayType) logicalType).getElementType(),
+                        jsonGenerator,
+                        serializeCatalogObjects);
+                break;
+            case MULTISET:
+                serializeCollection(
+                        ((MultisetType) logicalType).getElementType(),
+                        jsonGenerator,
+                        serializeCatalogObjects);
+                break;
+            case MAP:
+                serializeMap((MapType) logicalType, jsonGenerator, serializeCatalogObjects);
+                break;
+            case ROW:
+                serializeRow((RowType) logicalType, jsonGenerator, serializeCatalogObjects);
+                break;
+            case DISTINCT_TYPE:
+                serializeDistinctType(
+                        (DistinctType) logicalType, jsonGenerator, serializeCatalogObjects);
+                break;
+            case STRUCTURED_TYPE:
+                serializeStructuredType(
+                        (StructuredType) logicalType, jsonGenerator, serializeCatalogObjects);
+                break;
+            case SYMBOL:
+                // type root is enough
+                break;
+            case RAW:
+                if (logicalType instanceof RawType) {
+                    serializeSpecializedRaw((RawType<?>) logicalType, jsonGenerator);
+                    break;
+                }
+                // fall through
+            default:
+                throw new ValidationException(
+                        String.format(
+                                "Unable to serialize logical type '%s'. Please check the documentation for supported types.",
+                                logicalType.asSummaryString()));
+        }
+
         jsonGenerator.writeEndObject();
     }
 
-    private void serializeArrayType(
-            ArrayType arrayType, JsonGenerator jsonGenerator, SerializerProvider serializerProvider)
-            throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, arrayType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, arrayType.isNullable());
-        jsonGenerator.writeFieldName(FIELD_NAME_ELEMENT_TYPE);
-        serialize(arrayType.getElementType(), jsonGenerator, serializerProvider);
-        jsonGenerator.writeEndObject();
+    private static void serializeZeroLengthString(JsonGenerator jsonGenerator) throws IOException {
+        jsonGenerator.writeNumberField(FIELD_NAME_LENGTH, 0);
     }
 
-    private void serializeMultisetType(
-            MultisetType multisetType,
-            JsonGenerator jsonGenerator,
-            SerializerProvider serializerProvider)
-            throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, multisetType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, multisetType.isNullable());
-        jsonGenerator.writeFieldName(FIELD_NAME_ELEMENT_TYPE);
-        serialize(multisetType.getElementType(), jsonGenerator, serializerProvider);
-        jsonGenerator.writeEndObject();
+    private static void serializeTimestamp(
+            int precision, TimestampKind kind, JsonGenerator jsonGenerator) throws IOException {
+        jsonGenerator.writeNumberField(FIELD_NAME_PRECISION, precision);
+        jsonGenerator.writeObjectField(FIELD_NAME_TIMESTAMP_KIND, kind);
     }
 
-    private void serializeRowType(CharType charType, JsonGenerator jsonGenerator)
+    private static void serializeCollection(
+            LogicalType elementType, JsonGenerator jsonGenerator, boolean serializeCatalogObjects)
             throws IOException {
-        // Zero-length character strings have no serializable string representation.
-        if (charType.getLength() == CharType.EMPTY_LITERAL_LENGTH) {
-            jsonGenerator.writeStartObject();
-            jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, charType.getTypeRoot().name());
-            jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, charType.isNullable());
-            jsonGenerator.writeNumberField(FIELD_NAME_LENGTH, 0);
-            jsonGenerator.writeEndObject();
-        } else {
-            jsonGenerator.writeObject(charType.asSerializableString());
-        }
+        jsonGenerator.writeFieldName(FIELD_NAME_ELEMENT_TYPE);
+        serializeInternal(elementType, jsonGenerator, serializeCatalogObjects);
     }
 
-    private void serializeVarCharType(VarCharType varCharType, JsonGenerator jsonGenerator)
+    private static void serializeMap(
+            MapType mapType, JsonGenerator jsonGenerator, boolean serializeCatalogObjects)
             throws IOException {
-        // Zero-length character strings have no serializable string representation.
-        if (varCharType.getLength() == VarCharType.EMPTY_LITERAL_LENGTH) {
-            jsonGenerator.writeStartObject();
-            jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, varCharType.getTypeRoot().name());
-            jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, varCharType.isNullable());
-            jsonGenerator.writeNumberField(FIELD_NAME_LENGTH, 0);
-            jsonGenerator.writeEndObject();
-        } else {
-            jsonGenerator.writeObject(varCharType.asSerializableString());
-        }
+        jsonGenerator.writeFieldName(FIELD_NAME_KEY_TYPE);
+        serializeInternal(mapType.getKeyType(), jsonGenerator, serializeCatalogObjects);
+        jsonGenerator.writeFieldName(FIELD_NAME_VALUE_TYPE);
+        serializeInternal(mapType.getValueType(), jsonGenerator, serializeCatalogObjects);
     }
 
-    private void serializeBinaryType(BinaryType binaryType, JsonGenerator jsonGenerator)
+    private static void serializeRow(
+            RowType rowType, JsonGenerator jsonGenerator, boolean serializeCatalogObjects)
             throws IOException {
-        // Zero-length binary strings have no serializable string representation.
-        if (binaryType.getLength() == BinaryType.EMPTY_LITERAL_LENGTH) {
+        jsonGenerator.writeArrayFieldStart(FIELD_NAME_FIELDS);
+        for (RowType.RowField rowField : rowType.getFields()) {
             jsonGenerator.writeStartObject();
-            jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, binaryType.getTypeRoot().name());
-            jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, binaryType.isNullable());
-            jsonGenerator.writeNumberField(FIELD_NAME_LENGTH, 0);
+            jsonGenerator.writeStringField(FIELD_NAME_FIELD_NAME, rowField.getName());
+            jsonGenerator.writeFieldName(FIELD_NAME_FIELD_TYPE);
+            serializeInternal(rowField.getType(), jsonGenerator, serializeCatalogObjects);
+            if (rowField.getDescription().isPresent()) {
+                jsonGenerator.writeStringField(
+                        FIELD_NAME_FIELD_DESCRIPTION, rowField.getDescription().get());
+            }
             jsonGenerator.writeEndObject();
-        } else {
-            jsonGenerator.writeObject(binaryType.asSerializableString());
         }
+        jsonGenerator.writeEndArray();
     }
 
-    private void serializeVarBinaryType(VarBinaryType varBinaryType, JsonGenerator jsonGenerator)
+    private static void serializeDistinctType(
+            DistinctType distinctType, JsonGenerator jsonGenerator, boolean serializeCatalogObjects)
             throws IOException {
-        // Zero-length binary strings have no serializable string representation.
-        if (varBinaryType.getLength() == VarBinaryType.EMPTY_LITERAL_LENGTH) {
-            jsonGenerator.writeStartObject();
+        jsonGenerator.writeObjectField(
+                FIELD_NAME_OBJECT_IDENTIFIER,
+                distinctType.getObjectIdentifier().orElseThrow(IllegalStateException::new));
+        if (distinctType.getDescription().isPresent()) {
             jsonGenerator.writeStringField(
-                    FIELD_NAME_TYPE_NAME, varBinaryType.getTypeRoot().name());
-            jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, varBinaryType.isNullable());
-            jsonGenerator.writeNumberField(FIELD_NAME_LENGTH, 0);
-            jsonGenerator.writeEndObject();
-        } else {
-            jsonGenerator.writeObject(varBinaryType.asSerializableString());
+                    FIELD_NAME_FIELD_DESCRIPTION, distinctType.getDescription().get());
         }
+        jsonGenerator.writeFieldName(FIELD_NAME_SOURCE_TYPE);
+        serializeInternal(distinctType.getSourceType(), jsonGenerator, serializeCatalogObjects);
     }
 
-    private void serializeSymbolType(SymbolType<?> symbolType, JsonGenerator jsonGenerator)
-            throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, symbolType.isNullable());
-        jsonGenerator.writeStringField(
-                FIELD_NAME_SYMBOL_CLASS, symbolType.getDefaultConversion().getName());
-        jsonGenerator.writeEndObject();
-    }
-
-    private void serializeTypeInformationRawType(
-            TypeInformationRawType<?> rawType, JsonGenerator jsonGenerator) throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, rawType.isNullable());
-        jsonGenerator.writeStringField(
-                FIELD_NAME_TYPE_INFO,
-                EncodingUtils.encodeObjectToString(rawType.getTypeInformation()));
-        jsonGenerator.writeEndObject();
-    }
-
-    private void serializeStructuredType(StructuredType structuredType, JsonGenerator jsonGenerator)
+    private static void serializeStructuredType(
+            StructuredType structuredType,
+            JsonGenerator jsonGenerator,
+            boolean serializeCatalogObjects)
             throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(
-                FIELD_NAME_TYPE_NAME, LogicalTypeRoot.STRUCTURED_TYPE.name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, structuredType.isNullable());
         if (structuredType.getObjectIdentifier().isPresent()) {
             jsonGenerator.writeObjectField(
-                    FIELD_NAME_IDENTIFIER, structuredType.getObjectIdentifier().get());
+                    FIELD_NAME_OBJECT_IDENTIFIER, structuredType.getObjectIdentifier().get());
         }
-        if (structuredType.getImplementationClass().isPresent()) {
+        if (structuredType.getDescription().isPresent()) {
             jsonGenerator.writeStringField(
-                    FIELD_NAME_IMPLEMENTATION_CLASS,
-                    structuredType.getImplementationClass().get().getName());
+                    FIELD_NAME_DESCRIPTION, structuredType.getDescription().get());
+        }
+        if (structuredType.getImplementationClass().isPresent()) {
+            jsonGenerator.writeObjectField(
+                    FIELD_NAME_IMPLEMENTATION_CLASS, structuredType.getImplementationClass().get());
         }
         jsonGenerator.writeFieldName(FIELD_NAME_ATTRIBUTES);
         jsonGenerator.writeStartArray();
-        for (StructuredType.StructuredAttribute attribute : structuredType.getAttributes()) {
+        for (StructuredAttribute attribute : structuredType.getAttributes()) {
             jsonGenerator.writeStartObject();
-            jsonGenerator.writeStringField(FIELD_NAME_NAME, attribute.getName());
-            jsonGenerator.writeObjectField(FIELD_NAME_LOGICAL_TYPE, attribute.getType());
+            jsonGenerator.writeStringField(FIELD_NAME_ATTRIBUTE_NAME, attribute.getName());
+            jsonGenerator.writeFieldName(FIELD_NAME_ATTRIBUTE_TYPE);
+            serializeInternal(attribute.getType(), jsonGenerator, serializeCatalogObjects);
             if (attribute.getDescription().isPresent()) {
                 jsonGenerator.writeStringField(
-                        FIELD_NAME_DESCRIPTION, attribute.getDescription().get());
+                        FIELD_NAME_ATTRIBUTE_DESCRIPTION, attribute.getDescription().get());
             }
             jsonGenerator.writeEndObject();
         }
         jsonGenerator.writeEndArray();
-        jsonGenerator.writeBooleanField(FIELD_NAME_FINAL, structuredType.isFinal());
-        jsonGenerator.writeBooleanField(FIELD_NAME_INSTANTIABLE, structuredType.isInstantiable());
-        jsonGenerator.writeStringField(
-                FIELD_NAME_COMPARISON, structuredType.getComparison().name());
-        if (structuredType.getSuperType().isPresent()) {
-            jsonGenerator.writeObjectField(
-                    FIELD_NAME_SUPPER_TYPE, structuredType.getSuperType().get());
+        if (!structuredType.isFinal()) {
+            jsonGenerator.writeBooleanField(FIELD_NAME_FINAL, false);
         }
-        if (structuredType.getDescription().isPresent()) {
+        if (!structuredType.isInstantiable()) {
+            jsonGenerator.writeBooleanField(FIELD_NAME_INSTANTIABLE, false);
+        }
+        if (structuredType.getComparison() != StructuredComparison.NONE) {
             jsonGenerator.writeStringField(
-                    FIELD_NAME_DESCRIPTION, structuredType.getDescription().get());
+                    FIELD_NAME_COMPARISON, structuredType.getComparison().name());
+        }
+        if (structuredType.getSuperType().isPresent()) {
+            jsonGenerator.writeObjectField(
+                    FIELD_NAME_SUPER_TYPE, structuredType.getSuperType().get());
         }
-        jsonGenerator.writeEndObject();
     }
 
-    private void serializeDistinctType(DistinctType distinctType, JsonGenerator jsonGenerator)
+    private static void serializeSpecializedRaw(RawType<?> rawType, JsonGenerator jsonGenerator)
             throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, LogicalTypeRoot.DISTINCT_TYPE.name());
-        Preconditions.checkArgument(distinctType.getObjectIdentifier().isPresent());
-        jsonGenerator.writeObjectField(
-                FIELD_NAME_IDENTIFIER, distinctType.getObjectIdentifier().get());
-        jsonGenerator.writeObjectField(FIELD_NAME_SOURCE_TYPE, distinctType.getSourceType());
-        if (distinctType.getDescription().isPresent()) {
+        jsonGenerator.writeStringField(FIELD_NAME_CLASS, rawType.getOriginatingClass().getName());
+        final TypeSerializer<?> serializer = rawType.getTypeSerializer();
+        if (serializer.equals(NullSerializer.INSTANCE)) {
             jsonGenerator.writeStringField(
-                    FIELD_NAME_DESCRIPTION, distinctType.getDescription().get());
+                    FIELD_NAME_SPECIAL_SERIALIZER, FIELD_VALUE_EXTERNAL_SERIALIZER_NULL);
+        } else if (serializer instanceof ExternalSerializer) {
+            final ExternalSerializer<?, ?> externalSerializer =
+                    (ExternalSerializer<?, ?>) rawType.getTypeSerializer();
+            if (externalSerializer.isInternalInput()) {
+                throw new TableException(
+                        "Asymmetric external serializers are currently not supported. "
+                                + "The input must not be internal if the output is external.");
+            }
+            jsonGenerator.writeObjectField(
+                    FIELD_NAME_EXTERNAL_DATA_TYPE, externalSerializer.getDataType());
+        } else {
+            throw new TableException("Unsupported special case for RAW type.");
         }
-        jsonGenerator.writeEndObject();
     }
 
-    private void serializeTimestampType(TimestampType timestampType, JsonGenerator jsonGenerator)
-            throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, timestampType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, timestampType.isNullable());
-        jsonGenerator.writeNumberField(FIELD_NAME_PRECISION, timestampType.getPrecision());
-        jsonGenerator.writeObjectField(FIELD_NAME_TIMESTAMP_KIND, timestampType.getKind());
-        jsonGenerator.writeEndObject();
-    }
+    // --------------------------------------------------------------------------------------------
+    // Compact Serialization
+    // --------------------------------------------------------------------------------------------
 
-    private void serializeZonedTimestampType(
-            ZonedTimestampType timestampType, JsonGenerator jsonGenerator) throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, timestampType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, timestampType.isNullable());
-        jsonGenerator.writeNumberField(FIELD_NAME_PRECISION, timestampType.getPrecision());
-        jsonGenerator.writeObjectField(FIELD_NAME_TIMESTAMP_KIND, timestampType.getKind());
-        jsonGenerator.writeEndObject();
+    private static boolean supportsCompactSerialization(
+            LogicalType logicalType, boolean serializeCatalogObjects) {
+        return logicalType.accept(new CompactSerializationChecker(serializeCatalogObjects));
     }
 
-    private void serializeLocalZonedTimestampType(
-            LocalZonedTimestampType timestampType, JsonGenerator jsonGenerator) throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, timestampType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, timestampType.isNullable());
-        jsonGenerator.writeNumberField(FIELD_NAME_PRECISION, timestampType.getPrecision());
-        jsonGenerator.writeObjectField(FIELD_NAME_TIMESTAMP_KIND, timestampType.getKind());
-        jsonGenerator.writeEndObject();
+    private static void serializeTypeWithCompactSerialization(
+            LogicalType logicalType, JsonGenerator jsonGenerator) throws IOException {
+        final String compactString = logicalType.asSerializableString();
+        jsonGenerator.writeString(compactString);
     }
 
-    @SuppressWarnings("rawtypes")
-    private void serializeRawType(
-            RawType<?> rawType, JsonGenerator jsonGenerator, SerializerProvider serializerProvider)
-            throws IOException {
-        TypeSerializer<?> typeSer = rawType.getTypeSerializer();
-        if (typeSer instanceof ExternalSerializer) {
-            ExternalSerializer externalSer = (ExternalSerializer) typeSer;
-            // Currently, ExternalSerializer with `isInternalInput=false` will be serialized,
-            // Once `isInternalInput=true` needs to be serialized, we can add individual field in
-            // the json to support it, and the new json plan is compatible with the previous one.
-            if (externalSer.isInternalInput()) {
-                throw new TableException(
-                        "ExternalSerializer with `isInternalInput=true` is not supported.");
-            }
-            DataType dataType = externalSer.getDataType();
-            boolean isMapView = DataViewUtils.isMapViewDataType(dataType);
-            boolean isListView = DataViewUtils.isListViewDataType(dataType);
-            if (isMapView || isListView) {
-                jsonGenerator.writeStartObject();
-                jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, LogicalTypeRoot.RAW.name());
-                jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, rawType.isNullable());
-                if (isMapView) {
-                    jsonGenerator.writeStringField(
-                            FIELD_NAME_DATA_VIEW_CLASS, MapView.class.getName());
-                    KeyValueDataType keyValueDataType =
-                            DataViewUtils.extractKeyValueDataTypeForMapView(dataType);
-                    serializeDataTypeForDataView(
-                            FIELD_NAME_KEY_TYPE,
-                            keyValueDataType.getKeyDataType(),
-                            jsonGenerator,
-                            serializerProvider);
-                    serializeDataTypeForDataView(
-                            FIELD_NAME_VALUE_TYPE,
-                            keyValueDataType.getValueDataType(),
-                            jsonGenerator,
-                            serializerProvider);
-                } else {
-                    jsonGenerator.writeStringField(
-                            FIELD_NAME_DATA_VIEW_CLASS, ListView.class.getName());
-                    DataType elementType =
-                            DataViewUtils.extractElementDataTypeForListView(dataType);
-                    serializeDataTypeForDataView(
-                            FIELD_NAME_ELEMENT_TYPE,
-                            elementType,
-                            jsonGenerator,
-                            serializerProvider);
-                }
-                jsonGenerator.writeEndObject();
-                return;
-            }
+    /**
+     * Checks whether the given type can be serialized as a compact string created from {@link
+     * LogicalType#asSerializableString()}.
+     */
+    private static class CompactSerializationChecker extends LogicalTypeDefaultVisitor<Boolean> {
+
+        private final boolean serializeCatalogObjects;
+
+        CompactSerializationChecker(boolean serializeCatalogObjects) {
+            this.serializeCatalogObjects = serializeCatalogObjects;
         }
 
-        jsonGenerator.writeObject(rawType.asSerializableString());
-    }
+        @Override
+        public Boolean visit(CharType charType) {
+            return charType.getLength() > 0;
+        }
 
-    private void serializeDataTypeForDataView(
-            String key,
-            DataType dataType,
-            JsonGenerator jsonGenerator,
-            SerializerProvider serializerProvider)
-            throws IOException {
-        jsonGenerator.writeFieldName(key);
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeBooleanField(
-                FIELD_NAME_IS_INTERNAL_TYPE, DataTypeUtils.isInternal(dataType));
-        jsonGenerator.writeFieldName(FIELD_NAME_TYPE_NAME);
-        LogicalType logicalType = LogicalTypeDataTypeConverter.toLogicalType(dataType);
-        serialize(logicalType, jsonGenerator, serializerProvider);
-        jsonGenerator.writeEndObject();
+        @Override
+        public Boolean visit(VarCharType varCharType) {
+            return varCharType.getLength() > 0;
+        }
+
+        @Override
+        public Boolean visit(BinaryType binaryType) {
+            return binaryType.getLength() > 0;
+        }
+
+        @Override
+        public Boolean visit(VarBinaryType varBinaryType) {
+            return varBinaryType.getLength() > 0;
+        }
+
+        @Override
+        public Boolean visit(TimestampType timestampType) {
+            return timestampType.getKind() == TimestampKind.REGULAR;
+        }
+
+        @Override
+        public Boolean visit(ZonedTimestampType zonedTimestampType) {
+            return zonedTimestampType.getKind() == TimestampKind.REGULAR;
+        }
+
+        @Override
+        public Boolean visit(LocalZonedTimestampType localZonedTimestampType) {
+            return localZonedTimestampType.getKind() == TimestampKind.REGULAR;
+        }
+
+        @Override
+        public Boolean visit(DistinctType distinctType) {
+            // catalog-based distinct types are always string serializable,
+            // however, depending on the configuration, we serialize the entire type
+            return !serializeCatalogObjects;
+        }
+
+        @Override
+        public Boolean visit(StructuredType structuredType) {
+            // catalog-based structured types are always string serializable,
+            // however, depending on the configuration, we serialize the entire type
+            return structuredType.getObjectIdentifier().isPresent() && !serializeCatalogObjects;

Review comment:
       Not sure if I understand this question correctly, But this is exactly what we do. The children will have a compact representation. However, there is no string representation defined in the SQL standard like `ROW`. We would need to make up syntax that should actually be defined by a type DDL.

##########
File path: flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/serde/DataTypeJsonSerdeTest.java
##########
@@ -0,0 +1,152 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.table.api.DataTypes;
+import org.apache.flink.table.api.TableConfig;
+import org.apache.flink.table.catalog.CatalogManager;
+import org.apache.flink.table.module.ModuleManager;
+import org.apache.flink.table.planner.calcite.FlinkContextImpl;
+import org.apache.flink.table.planner.calcite.FlinkTypeFactory;
+import org.apache.flink.table.planner.functions.sql.FlinkSqlOperatorTable;
+import org.apache.flink.table.types.DataType;
+import org.apache.flink.table.types.logical.LogicalType;
+import org.apache.flink.table.utils.CatalogManagerMocks;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonGenerator;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonProcessingException;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.module.SimpleModule;
+
+import org.junit.Test;
+import org.junit.runner.RunWith;
+import org.junit.runners.Parameterized;
+import org.junit.runners.Parameterized.Parameter;
+import org.junit.runners.Parameterized.Parameters;
+
+import java.io.IOException;
+import java.io.StringWriter;
+import java.util.Arrays;
+import java.util.List;
+
+import static org.assertj.core.api.Assertions.assertThat;
+
+/** Tests for {@link DataType} serialization and deserialization. */
+@RunWith(Parameterized.class)
+public class DataTypeJsonSerdeTest {
+
+    @Parameter public DataType dataType;
+
+    @Test
+    public void testDataTypeSerde() throws IOException {
+        final ObjectMapper mapper = configuredObjectMapper();
+        final String json = toJson(mapper, dataType);
+        final DataType actual = toDataType(mapper, json);
+
+        if (json.contains("children")) {
+            System.out.println();
+        }

Review comment:
       Sorry, my bad.

##########
File path: flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/serde/LogicalTypeJsonSerdeTest.java
##########
@@ -249,28 +331,44 @@ public void testLogicalTypeSerde() throws IOException {
                                         ObjectIdentifier.of("cat", "db", "distinctType"),
                                         new VarCharType(false, 5))
                                 .build(),
+                        // custom RawType
+                        new RawType<>(Integer.class, IntSerializer.INSTANCE),

Review comment:
       It doesn't really make a difference. But I replaced it with `LocalDateTime`.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18274:
URL: https://github.com/apache/flink/pull/18274#issuecomment-1005845834


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=28985",
       "triggerID" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29041",
       "triggerID" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29092",
       "triggerID" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "triggerType" : "PUSH"
     }, {
       "hash" : "718ea97838ff10191544cc7460d3c18380b0e119",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29112",
       "triggerID" : "718ea97838ff10191544cc7460d3c18380b0e119",
       "triggerType" : "PUSH"
     }, {
       "hash" : "c445781fa99453f887f9d91978e5bb9ca9d1f91a",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "c445781fa99453f887f9d91978e5bb9ca9d1f91a",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 9f800b453598a8cbf015583929e5fabda9fedbf6 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29092) 
   * 718ea97838ff10191544cc7460d3c18380b0e119 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29112) 
   * c445781fa99453f887f9d91978e5bb9ca9d1f91a UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18274:
URL: https://github.com/apache/flink/pull/18274#issuecomment-1005845834






-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18274:
URL: https://github.com/apache/flink/pull/18274#issuecomment-1005845834


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=28985",
       "triggerID" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29041",
       "triggerID" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29092",
       "triggerID" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "triggerType" : "PUSH"
     }, {
       "hash" : "718ea97838ff10191544cc7460d3c18380b0e119",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29112",
       "triggerID" : "718ea97838ff10191544cc7460d3c18380b0e119",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 9f800b453598a8cbf015583929e5fabda9fedbf6 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29092) 
   * 718ea97838ff10191544cc7460d3c18380b0e119 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29112) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18274:
URL: https://github.com/apache/flink/pull/18274#issuecomment-1005845834


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=28985",
       "triggerID" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29041",
       "triggerID" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29092",
       "triggerID" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "triggerType" : "PUSH"
     }, {
       "hash" : "718ea97838ff10191544cc7460d3c18380b0e119",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29112",
       "triggerID" : "718ea97838ff10191544cc7460d3c18380b0e119",
       "triggerType" : "PUSH"
     }, {
       "hash" : "c445781fa99453f887f9d91978e5bb9ca9d1f91a",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29114",
       "triggerID" : "c445781fa99453f887f9d91978e5bb9ca9d1f91a",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * c445781fa99453f887f9d91978e5bb9ca9d1f91a Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29114) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18274:
URL: https://github.com/apache/flink/pull/18274#issuecomment-1005845834


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=28985",
       "triggerID" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29041",
       "triggerID" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29092",
       "triggerID" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "triggerType" : "PUSH"
     }, {
       "hash" : "718ea97838ff10191544cc7460d3c18380b0e119",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29112",
       "triggerID" : "718ea97838ff10191544cc7460d3c18380b0e119",
       "triggerType" : "PUSH"
     }, {
       "hash" : "c445781fa99453f887f9d91978e5bb9ca9d1f91a",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29114",
       "triggerID" : "c445781fa99453f887f9d91978e5bb9ca9d1f91a",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 718ea97838ff10191544cc7460d3c18380b0e119 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29112) 
   * c445781fa99453f887f9d91978e5bb9ca9d1f91a Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29114) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot commented on pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
flinkbot commented on pull request #18274:
URL: https://github.com/apache/flink/pull/18274#issuecomment-1005843931


   Thanks a lot for your contribution to the Apache Flink project. I'm the @flinkbot. I help the community
   to review your pull request. We will use this comment to track the progress of the review.
   
   
   ## Automated Checks
   Last check on commit dd2c1d149708b916cb05bd2b0580015ae2e1f889 (Wed Jan 05 15:49:17 UTC 2022)
   
   **Warnings:**
    * No documentation files were touched! Remember to keep the Flink docs up to date!
   
   
   <sub>Mention the bot in a comment to re-run the automated checks.</sub>
   ## Review Progress
   
   * ❓ 1. The [description] looks good.
   * ❓ 2. There is [consensus] that the contribution should go into to Flink.
   * ❓ 3. Needs [attention] from.
   * ❓ 4. The change fits into the overall [architecture].
   * ❓ 5. Overall code [quality] is good.
   
   Please see the [Pull Request Review Guide](https://flink.apache.org/contributing/reviewing-prs.html) for a full explanation of the review process.<details>
    The Bot is tracking the review progress through labels. Labels are applied according to the order of the review items. For consensus, approval by a Flink committer of PMC member is required <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot approve description` to approve one or more aspects (aspects: `description`, `consensus`, `architecture` and `quality`)
    - `@flinkbot approve all` to approve all aspects
    - `@flinkbot approve-until architecture` to approve everything until `architecture`
    - `@flinkbot attention @username1 [@username2 ..]` to require somebody's attention
    - `@flinkbot disapprove architecture` to remove an approval you gave earlier
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] twalthr commented on a change in pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
twalthr commented on a change in pull request #18274:
URL: https://github.com/apache/flink/pull/18274#discussion_r780160827



##########
File path: flink-table/flink-table-planner/src/main/java/org/apache/flink/table/planner/plan/nodes/exec/serde/LogicalTypeJsonSerializer.java
##########
@@ -109,349 +123,358 @@ public void serialize(
             JsonGenerator jsonGenerator,
             SerializerProvider serializerProvider)
             throws IOException {
-        if (logicalType instanceof CharType) {
-            // Zero-length character strings have no serializable string representation.
-            serializeRowType((CharType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof VarCharType) {
-            // Zero-length character strings have no serializable string representation.
-            serializeVarCharType((VarCharType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof BinaryType) {
-            // Zero-length binary strings have no serializable string representation.
-            serializeBinaryType((BinaryType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof VarBinaryType) {
-            // Zero-length binary strings have no serializable string representation.
-            serializeVarBinaryType((VarBinaryType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof SymbolType) {
-            // SymbolType does not support `asSerializableString`
-            serializeSymbolType((SymbolType<?>) logicalType, jsonGenerator);
-        } else if (logicalType instanceof TypeInformationRawType) {
-            // TypeInformationRawType does not support `asSerializableString`
-            serializeTypeInformationRawType((TypeInformationRawType<?>) logicalType, jsonGenerator);
-        } else if (logicalType instanceof StructuredType) {
-            //  StructuredType does not full support `asSerializableString`
-            serializeStructuredType((StructuredType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof DistinctType) {
-            //  DistinctType does not full support `asSerializableString`
-            serializeDistinctType((DistinctType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof TimestampType) {
-            // TimestampType does not consider `TimestampKind`
-            serializeTimestampType((TimestampType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof ZonedTimestampType) {
-            // ZonedTimestampType does not consider `TimestampKind`
-            serializeZonedTimestampType((ZonedTimestampType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof LocalZonedTimestampType) {
-            // LocalZonedTimestampType does not consider `TimestampKind`
-            serializeLocalZonedTimestampType((LocalZonedTimestampType) logicalType, jsonGenerator);
-        } else if (logicalType instanceof RowType) {
-            serializeRowType((RowType) logicalType, jsonGenerator, serializerProvider);
-        } else if (logicalType instanceof MapType) {
-            serializeMapType((MapType) logicalType, jsonGenerator, serializerProvider);
-        } else if (logicalType instanceof ArrayType) {
-            serializeArrayType((ArrayType) logicalType, jsonGenerator, serializerProvider);
-        } else if (logicalType instanceof MultisetType) {
-            serializeMultisetType((MultisetType) logicalType, jsonGenerator, serializerProvider);
-        } else if (logicalType instanceof RawType) {
-            serializeRawType((RawType<?>) logicalType, jsonGenerator, serializerProvider);
-        } else if (logicalType instanceof UnresolvedUserDefinedType) {
-            throw new TableException(
-                    "Can not serialize an UnresolvedUserDefinedType instance. \n"
-                            + "It needs to be resolved into a proper user-defined type.\"");
-        } else {
-            jsonGenerator.writeObject(logicalType.asSerializableString());
-        }
+        final ReadableConfig config = SerdeContext.from(serializerProvider).getConfiguration();
+        final boolean serializeCatalogObjects =
+                !config.get(TableConfigOptions.PLAN_COMPILE_CATALOG_OBJECTS)
+                        .equals(CatalogPlanCompilation.IDENTIFIER);
+        serializeInternal(logicalType, jsonGenerator, serializeCatalogObjects);
     }
 
-    private void serializeRowType(
-            RowType rowType, JsonGenerator jsonGenerator, SerializerProvider serializerProvider)
+    private static void serializeInternal(
+            LogicalType logicalType, JsonGenerator jsonGenerator, boolean serializeCatalogObjects)
             throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, rowType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, rowType.isNullable());
-        List<RowType.RowField> fields = rowType.getFields();
-        jsonGenerator.writeArrayFieldStart(FIELD_NAME_FIELDS);
-        for (RowType.RowField rowField : fields) {
-            jsonGenerator.writeStartObject();
-            jsonGenerator.writeFieldName(rowField.getName());
-            serialize(rowField.getType(), jsonGenerator, serializerProvider);
-            if (rowField.getDescription().isPresent()) {
-                jsonGenerator.writeStringField(
-                        FIELD_NAME_DESCRIPTION, rowField.getDescription().get());
-            }
-            jsonGenerator.writeEndObject();
+        if (supportsCompactSerialization(logicalType, serializeCatalogObjects)) {
+            serializeTypeWithCompactSerialization(logicalType, jsonGenerator);
+        } else {
+            // fallback to generic serialization that might still use compact serialization for
+            // individual fields
+            serializeTypeWithGenericSerialization(
+                    logicalType, jsonGenerator, serializeCatalogObjects);
         }
-        jsonGenerator.writeEndArray();
-        jsonGenerator.writeEndObject();
     }
 
-    private void serializeMapType(
-            MapType mapType, JsonGenerator jsonGenerator, SerializerProvider serializerProvider)
+    // --------------------------------------------------------------------------------------------
+    // Generic Serialization
+    // --------------------------------------------------------------------------------------------
+
+    private static void serializeTypeWithGenericSerialization(
+            LogicalType logicalType, JsonGenerator jsonGenerator, boolean serializeCatalogObjects)
             throws IOException {
         jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, mapType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, mapType.isNullable());
-        jsonGenerator.writeFieldName(FIELD_NAME_KEY_TYPE);
-        serialize(mapType.getKeyType(), jsonGenerator, serializerProvider);
-        jsonGenerator.writeFieldName(FIELD_NAME_VALUE_TYPE);
-        serialize(mapType.getValueType(), jsonGenerator, serializerProvider);
+
+        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, logicalType.getTypeRoot().name());
+        if (!logicalType.isNullable()) {
+            jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, false);
+        }
+
+        switch (logicalType.getTypeRoot()) {
+            case CHAR:
+            case VARCHAR:
+            case BINARY:
+            case VARBINARY:
+                serializeZeroLengthString(jsonGenerator);
+                break;
+            case TIMESTAMP_WITHOUT_TIME_ZONE:
+                final TimestampType timestampType = (TimestampType) logicalType;
+                serializeTimestamp(
+                        timestampType.getPrecision(), timestampType.getKind(), jsonGenerator);
+                break;
+            case TIMESTAMP_WITH_TIME_ZONE:
+                final ZonedTimestampType zonedTimestampType = (ZonedTimestampType) logicalType;
+                serializeTimestamp(
+                        zonedTimestampType.getPrecision(),
+                        zonedTimestampType.getKind(),
+                        jsonGenerator);
+                break;
+            case TIMESTAMP_WITH_LOCAL_TIME_ZONE:
+                final LocalZonedTimestampType localZonedTimestampType =
+                        (LocalZonedTimestampType) logicalType;
+                serializeTimestamp(
+                        localZonedTimestampType.getPrecision(),
+                        localZonedTimestampType.getKind(),
+                        jsonGenerator);
+                break;
+            case ARRAY:
+                serializeCollection(
+                        ((ArrayType) logicalType).getElementType(),
+                        jsonGenerator,
+                        serializeCatalogObjects);
+                break;
+            case MULTISET:
+                serializeCollection(
+                        ((MultisetType) logicalType).getElementType(),
+                        jsonGenerator,
+                        serializeCatalogObjects);
+                break;
+            case MAP:
+                serializeMap((MapType) logicalType, jsonGenerator, serializeCatalogObjects);
+                break;
+            case ROW:
+                serializeRow((RowType) logicalType, jsonGenerator, serializeCatalogObjects);
+                break;
+            case DISTINCT_TYPE:
+                serializeDistinctType(
+                        (DistinctType) logicalType, jsonGenerator, serializeCatalogObjects);
+                break;
+            case STRUCTURED_TYPE:
+                serializeStructuredType(
+                        (StructuredType) logicalType, jsonGenerator, serializeCatalogObjects);
+                break;
+            case SYMBOL:
+                // type root is enough
+                break;
+            case RAW:
+                if (logicalType instanceof RawType) {
+                    serializeSpecializedRaw((RawType<?>) logicalType, jsonGenerator);
+                    break;
+                }
+                // fall through
+            default:
+                throw new ValidationException(
+                        String.format(
+                                "Unable to serialize logical type '%s'. Please check the documentation for supported types.",
+                                logicalType.asSummaryString()));
+        }
+
         jsonGenerator.writeEndObject();
     }
 
-    private void serializeArrayType(
-            ArrayType arrayType, JsonGenerator jsonGenerator, SerializerProvider serializerProvider)
-            throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, arrayType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, arrayType.isNullable());
-        jsonGenerator.writeFieldName(FIELD_NAME_ELEMENT_TYPE);
-        serialize(arrayType.getElementType(), jsonGenerator, serializerProvider);
-        jsonGenerator.writeEndObject();
+    private static void serializeZeroLengthString(JsonGenerator jsonGenerator) throws IOException {
+        jsonGenerator.writeNumberField(FIELD_NAME_LENGTH, 0);
     }
 
-    private void serializeMultisetType(
-            MultisetType multisetType,
-            JsonGenerator jsonGenerator,
-            SerializerProvider serializerProvider)
-            throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, multisetType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, multisetType.isNullable());
-        jsonGenerator.writeFieldName(FIELD_NAME_ELEMENT_TYPE);
-        serialize(multisetType.getElementType(), jsonGenerator, serializerProvider);
-        jsonGenerator.writeEndObject();
+    private static void serializeTimestamp(
+            int precision, TimestampKind kind, JsonGenerator jsonGenerator) throws IOException {
+        jsonGenerator.writeNumberField(FIELD_NAME_PRECISION, precision);
+        jsonGenerator.writeObjectField(FIELD_NAME_TIMESTAMP_KIND, kind);
     }
 
-    private void serializeRowType(CharType charType, JsonGenerator jsonGenerator)
+    private static void serializeCollection(
+            LogicalType elementType, JsonGenerator jsonGenerator, boolean serializeCatalogObjects)
             throws IOException {
-        // Zero-length character strings have no serializable string representation.
-        if (charType.getLength() == CharType.EMPTY_LITERAL_LENGTH) {
-            jsonGenerator.writeStartObject();
-            jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, charType.getTypeRoot().name());
-            jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, charType.isNullable());
-            jsonGenerator.writeNumberField(FIELD_NAME_LENGTH, 0);
-            jsonGenerator.writeEndObject();
-        } else {
-            jsonGenerator.writeObject(charType.asSerializableString());
-        }
+        jsonGenerator.writeFieldName(FIELD_NAME_ELEMENT_TYPE);
+        serializeInternal(elementType, jsonGenerator, serializeCatalogObjects);
     }
 
-    private void serializeVarCharType(VarCharType varCharType, JsonGenerator jsonGenerator)
+    private static void serializeMap(
+            MapType mapType, JsonGenerator jsonGenerator, boolean serializeCatalogObjects)
             throws IOException {
-        // Zero-length character strings have no serializable string representation.
-        if (varCharType.getLength() == VarCharType.EMPTY_LITERAL_LENGTH) {
-            jsonGenerator.writeStartObject();
-            jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, varCharType.getTypeRoot().name());
-            jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, varCharType.isNullable());
-            jsonGenerator.writeNumberField(FIELD_NAME_LENGTH, 0);
-            jsonGenerator.writeEndObject();
-        } else {
-            jsonGenerator.writeObject(varCharType.asSerializableString());
-        }
+        jsonGenerator.writeFieldName(FIELD_NAME_KEY_TYPE);
+        serializeInternal(mapType.getKeyType(), jsonGenerator, serializeCatalogObjects);
+        jsonGenerator.writeFieldName(FIELD_NAME_VALUE_TYPE);
+        serializeInternal(mapType.getValueType(), jsonGenerator, serializeCatalogObjects);
     }
 
-    private void serializeBinaryType(BinaryType binaryType, JsonGenerator jsonGenerator)
+    private static void serializeRow(
+            RowType rowType, JsonGenerator jsonGenerator, boolean serializeCatalogObjects)
             throws IOException {
-        // Zero-length binary strings have no serializable string representation.
-        if (binaryType.getLength() == BinaryType.EMPTY_LITERAL_LENGTH) {
+        jsonGenerator.writeArrayFieldStart(FIELD_NAME_FIELDS);
+        for (RowType.RowField rowField : rowType.getFields()) {
             jsonGenerator.writeStartObject();
-            jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, binaryType.getTypeRoot().name());
-            jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, binaryType.isNullable());
-            jsonGenerator.writeNumberField(FIELD_NAME_LENGTH, 0);
+            jsonGenerator.writeStringField(FIELD_NAME_FIELD_NAME, rowField.getName());
+            jsonGenerator.writeFieldName(FIELD_NAME_FIELD_TYPE);
+            serializeInternal(rowField.getType(), jsonGenerator, serializeCatalogObjects);
+            if (rowField.getDescription().isPresent()) {
+                jsonGenerator.writeStringField(
+                        FIELD_NAME_FIELD_DESCRIPTION, rowField.getDescription().get());
+            }
             jsonGenerator.writeEndObject();
-        } else {
-            jsonGenerator.writeObject(binaryType.asSerializableString());
         }
+        jsonGenerator.writeEndArray();
     }
 
-    private void serializeVarBinaryType(VarBinaryType varBinaryType, JsonGenerator jsonGenerator)
+    private static void serializeDistinctType(
+            DistinctType distinctType, JsonGenerator jsonGenerator, boolean serializeCatalogObjects)
             throws IOException {
-        // Zero-length binary strings have no serializable string representation.
-        if (varBinaryType.getLength() == VarBinaryType.EMPTY_LITERAL_LENGTH) {
-            jsonGenerator.writeStartObject();
+        jsonGenerator.writeObjectField(
+                FIELD_NAME_OBJECT_IDENTIFIER,
+                distinctType.getObjectIdentifier().orElseThrow(IllegalStateException::new));
+        if (distinctType.getDescription().isPresent()) {
             jsonGenerator.writeStringField(
-                    FIELD_NAME_TYPE_NAME, varBinaryType.getTypeRoot().name());
-            jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, varBinaryType.isNullable());
-            jsonGenerator.writeNumberField(FIELD_NAME_LENGTH, 0);
-            jsonGenerator.writeEndObject();
-        } else {
-            jsonGenerator.writeObject(varBinaryType.asSerializableString());
+                    FIELD_NAME_FIELD_DESCRIPTION, distinctType.getDescription().get());
         }
+        jsonGenerator.writeFieldName(FIELD_NAME_SOURCE_TYPE);
+        serializeInternal(distinctType.getSourceType(), jsonGenerator, serializeCatalogObjects);
     }
 
-    private void serializeSymbolType(SymbolType<?> symbolType, JsonGenerator jsonGenerator)
-            throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, symbolType.isNullable());
-        jsonGenerator.writeStringField(
-                FIELD_NAME_SYMBOL_CLASS, symbolType.getDefaultConversion().getName());
-        jsonGenerator.writeEndObject();
-    }
-
-    private void serializeTypeInformationRawType(
-            TypeInformationRawType<?> rawType, JsonGenerator jsonGenerator) throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, rawType.isNullable());
-        jsonGenerator.writeStringField(
-                FIELD_NAME_TYPE_INFO,
-                EncodingUtils.encodeObjectToString(rawType.getTypeInformation()));
-        jsonGenerator.writeEndObject();
-    }
-
-    private void serializeStructuredType(StructuredType structuredType, JsonGenerator jsonGenerator)
+    private static void serializeStructuredType(
+            StructuredType structuredType,
+            JsonGenerator jsonGenerator,
+            boolean serializeCatalogObjects)
             throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(
-                FIELD_NAME_TYPE_NAME, LogicalTypeRoot.STRUCTURED_TYPE.name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, structuredType.isNullable());
         if (structuredType.getObjectIdentifier().isPresent()) {
             jsonGenerator.writeObjectField(
-                    FIELD_NAME_IDENTIFIER, structuredType.getObjectIdentifier().get());
+                    FIELD_NAME_OBJECT_IDENTIFIER, structuredType.getObjectIdentifier().get());
         }
-        if (structuredType.getImplementationClass().isPresent()) {
+        if (structuredType.getDescription().isPresent()) {
             jsonGenerator.writeStringField(
-                    FIELD_NAME_IMPLEMENTATION_CLASS,
-                    structuredType.getImplementationClass().get().getName());
+                    FIELD_NAME_DESCRIPTION, structuredType.getDescription().get());
+        }
+        if (structuredType.getImplementationClass().isPresent()) {
+            jsonGenerator.writeObjectField(
+                    FIELD_NAME_IMPLEMENTATION_CLASS, structuredType.getImplementationClass().get());
         }
         jsonGenerator.writeFieldName(FIELD_NAME_ATTRIBUTES);
         jsonGenerator.writeStartArray();
-        for (StructuredType.StructuredAttribute attribute : structuredType.getAttributes()) {
+        for (StructuredAttribute attribute : structuredType.getAttributes()) {
             jsonGenerator.writeStartObject();
-            jsonGenerator.writeStringField(FIELD_NAME_NAME, attribute.getName());
-            jsonGenerator.writeObjectField(FIELD_NAME_LOGICAL_TYPE, attribute.getType());
+            jsonGenerator.writeStringField(FIELD_NAME_ATTRIBUTE_NAME, attribute.getName());
+            jsonGenerator.writeFieldName(FIELD_NAME_ATTRIBUTE_TYPE);
+            serializeInternal(attribute.getType(), jsonGenerator, serializeCatalogObjects);
             if (attribute.getDescription().isPresent()) {
                 jsonGenerator.writeStringField(
-                        FIELD_NAME_DESCRIPTION, attribute.getDescription().get());
+                        FIELD_NAME_ATTRIBUTE_DESCRIPTION, attribute.getDescription().get());
             }
             jsonGenerator.writeEndObject();
         }
         jsonGenerator.writeEndArray();
-        jsonGenerator.writeBooleanField(FIELD_NAME_FINAL, structuredType.isFinal());
-        jsonGenerator.writeBooleanField(FIELD_NAME_INSTANTIABLE, structuredType.isInstantiable());
-        jsonGenerator.writeStringField(
-                FIELD_NAME_COMPARISON, structuredType.getComparison().name());
-        if (structuredType.getSuperType().isPresent()) {
-            jsonGenerator.writeObjectField(
-                    FIELD_NAME_SUPPER_TYPE, structuredType.getSuperType().get());
+        if (!structuredType.isFinal()) {
+            jsonGenerator.writeBooleanField(FIELD_NAME_FINAL, false);
         }
-        if (structuredType.getDescription().isPresent()) {
+        if (!structuredType.isInstantiable()) {
+            jsonGenerator.writeBooleanField(FIELD_NAME_INSTANTIABLE, false);
+        }
+        if (structuredType.getComparison() != StructuredComparison.NONE) {
             jsonGenerator.writeStringField(
-                    FIELD_NAME_DESCRIPTION, structuredType.getDescription().get());
+                    FIELD_NAME_COMPARISON, structuredType.getComparison().name());
+        }
+        if (structuredType.getSuperType().isPresent()) {
+            jsonGenerator.writeObjectField(
+                    FIELD_NAME_SUPER_TYPE, structuredType.getSuperType().get());
         }
-        jsonGenerator.writeEndObject();
     }
 
-    private void serializeDistinctType(DistinctType distinctType, JsonGenerator jsonGenerator)
+    private static void serializeSpecializedRaw(RawType<?> rawType, JsonGenerator jsonGenerator)
             throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, LogicalTypeRoot.DISTINCT_TYPE.name());
-        Preconditions.checkArgument(distinctType.getObjectIdentifier().isPresent());
-        jsonGenerator.writeObjectField(
-                FIELD_NAME_IDENTIFIER, distinctType.getObjectIdentifier().get());
-        jsonGenerator.writeObjectField(FIELD_NAME_SOURCE_TYPE, distinctType.getSourceType());
-        if (distinctType.getDescription().isPresent()) {
+        jsonGenerator.writeStringField(FIELD_NAME_CLASS, rawType.getOriginatingClass().getName());
+        final TypeSerializer<?> serializer = rawType.getTypeSerializer();
+        if (serializer.equals(NullSerializer.INSTANCE)) {
             jsonGenerator.writeStringField(
-                    FIELD_NAME_DESCRIPTION, distinctType.getDescription().get());
+                    FIELD_NAME_SPECIAL_SERIALIZER, FIELD_VALUE_EXTERNAL_SERIALIZER_NULL);
+        } else if (serializer instanceof ExternalSerializer) {
+            final ExternalSerializer<?, ?> externalSerializer =
+                    (ExternalSerializer<?, ?>) rawType.getTypeSerializer();
+            if (externalSerializer.isInternalInput()) {
+                throw new TableException(
+                        "Asymmetric external serializers are currently not supported. "
+                                + "The input must not be internal if the output is external.");
+            }
+            jsonGenerator.writeObjectField(
+                    FIELD_NAME_EXTERNAL_DATA_TYPE, externalSerializer.getDataType());
+        } else {
+            throw new TableException("Unsupported special case for RAW type.");
         }
-        jsonGenerator.writeEndObject();
     }
 
-    private void serializeTimestampType(TimestampType timestampType, JsonGenerator jsonGenerator)
-            throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, timestampType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, timestampType.isNullable());
-        jsonGenerator.writeNumberField(FIELD_NAME_PRECISION, timestampType.getPrecision());
-        jsonGenerator.writeObjectField(FIELD_NAME_TIMESTAMP_KIND, timestampType.getKind());
-        jsonGenerator.writeEndObject();
-    }
+    // --------------------------------------------------------------------------------------------
+    // Compact Serialization
+    // --------------------------------------------------------------------------------------------
 
-    private void serializeZonedTimestampType(
-            ZonedTimestampType timestampType, JsonGenerator jsonGenerator) throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, timestampType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, timestampType.isNullable());
-        jsonGenerator.writeNumberField(FIELD_NAME_PRECISION, timestampType.getPrecision());
-        jsonGenerator.writeObjectField(FIELD_NAME_TIMESTAMP_KIND, timestampType.getKind());
-        jsonGenerator.writeEndObject();
+    private static boolean supportsCompactSerialization(
+            LogicalType logicalType, boolean serializeCatalogObjects) {
+        return logicalType.accept(new CompactSerializationChecker(serializeCatalogObjects));
     }
 
-    private void serializeLocalZonedTimestampType(
-            LocalZonedTimestampType timestampType, JsonGenerator jsonGenerator) throws IOException {
-        jsonGenerator.writeStartObject();
-        jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, timestampType.getTypeRoot().name());
-        jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, timestampType.isNullable());
-        jsonGenerator.writeNumberField(FIELD_NAME_PRECISION, timestampType.getPrecision());
-        jsonGenerator.writeObjectField(FIELD_NAME_TIMESTAMP_KIND, timestampType.getKind());
-        jsonGenerator.writeEndObject();
+    private static void serializeTypeWithCompactSerialization(
+            LogicalType logicalType, JsonGenerator jsonGenerator) throws IOException {
+        final String compactString = logicalType.asSerializableString();
+        jsonGenerator.writeString(compactString);
     }
 
-    @SuppressWarnings("rawtypes")
-    private void serializeRawType(
-            RawType<?> rawType, JsonGenerator jsonGenerator, SerializerProvider serializerProvider)
-            throws IOException {
-        TypeSerializer<?> typeSer = rawType.getTypeSerializer();
-        if (typeSer instanceof ExternalSerializer) {
-            ExternalSerializer externalSer = (ExternalSerializer) typeSer;
-            // Currently, ExternalSerializer with `isInternalInput=false` will be serialized,
-            // Once `isInternalInput=true` needs to be serialized, we can add individual field in
-            // the json to support it, and the new json plan is compatible with the previous one.
-            if (externalSer.isInternalInput()) {
-                throw new TableException(
-                        "ExternalSerializer with `isInternalInput=true` is not supported.");
-            }
-            DataType dataType = externalSer.getDataType();
-            boolean isMapView = DataViewUtils.isMapViewDataType(dataType);
-            boolean isListView = DataViewUtils.isListViewDataType(dataType);
-            if (isMapView || isListView) {
-                jsonGenerator.writeStartObject();
-                jsonGenerator.writeStringField(FIELD_NAME_TYPE_NAME, LogicalTypeRoot.RAW.name());
-                jsonGenerator.writeBooleanField(FIELD_NAME_NULLABLE, rawType.isNullable());
-                if (isMapView) {
-                    jsonGenerator.writeStringField(
-                            FIELD_NAME_DATA_VIEW_CLASS, MapView.class.getName());
-                    KeyValueDataType keyValueDataType =
-                            DataViewUtils.extractKeyValueDataTypeForMapView(dataType);
-                    serializeDataTypeForDataView(
-                            FIELD_NAME_KEY_TYPE,
-                            keyValueDataType.getKeyDataType(),
-                            jsonGenerator,
-                            serializerProvider);
-                    serializeDataTypeForDataView(
-                            FIELD_NAME_VALUE_TYPE,
-                            keyValueDataType.getValueDataType(),
-                            jsonGenerator,
-                            serializerProvider);
-                } else {
-                    jsonGenerator.writeStringField(
-                            FIELD_NAME_DATA_VIEW_CLASS, ListView.class.getName());
-                    DataType elementType =
-                            DataViewUtils.extractElementDataTypeForListView(dataType);
-                    serializeDataTypeForDataView(
-                            FIELD_NAME_ELEMENT_TYPE,
-                            elementType,
-                            jsonGenerator,
-                            serializerProvider);
-                }
-                jsonGenerator.writeEndObject();
-                return;
-            }
+    /**
+     * Checks whether the given type can be serialized as a compact string created from {@link
+     * LogicalType#asSerializableString()}.
+     */
+    private static class CompactSerializationChecker extends LogicalTypeDefaultVisitor<Boolean> {
+
+        private final boolean serializeCatalogObjects;
+
+        CompactSerializationChecker(boolean serializeCatalogObjects) {
+            this.serializeCatalogObjects = serializeCatalogObjects;
         }
 
-        jsonGenerator.writeObject(rawType.asSerializableString());
-    }
+        @Override
+        public Boolean visit(CharType charType) {
+            return charType.getLength() > 0;

Review comment:
       Unfortunately not. It is invalid in declarations but valid within a plan if you have something like `SELECT ""`.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18274:
URL: https://github.com/apache/flink/pull/18274#issuecomment-1005845834


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=28985",
       "triggerID" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29041",
       "triggerID" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29092",
       "triggerID" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "triggerType" : "PUSH"
     }, {
       "hash" : "718ea97838ff10191544cc7460d3c18380b0e119",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29112",
       "triggerID" : "718ea97838ff10191544cc7460d3c18380b0e119",
       "triggerType" : "PUSH"
     }, {
       "hash" : "c445781fa99453f887f9d91978e5bb9ca9d1f91a",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29114",
       "triggerID" : "c445781fa99453f887f9d91978e5bb9ca9d1f91a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a872e8c5d8aa822a0358d859a462c979a4965750",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29181",
       "triggerID" : "a872e8c5d8aa822a0358d859a462c979a4965750",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ce106cfbce02a10dcddc3a02ddb56a2183a763",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "d9ce106cfbce02a10dcddc3a02ddb56a2183a763",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * c445781fa99453f887f9d91978e5bb9ca9d1f91a Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29114) 
   * a872e8c5d8aa822a0358d859a462c979a4965750 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29181) 
   * d9ce106cfbce02a10dcddc3a02ddb56a2183a763 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] twalthr commented on a change in pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
twalthr commented on a change in pull request #18274:
URL: https://github.com/apache/flink/pull/18274#discussion_r780179909



##########
File path: flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/plan/nodes/exec/serde/DataTypeJsonSerdeTest.java
##########
@@ -0,0 +1,152 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.plan.nodes.exec.serde;
+
+import org.apache.flink.table.api.DataTypes;
+import org.apache.flink.table.api.TableConfig;
+import org.apache.flink.table.catalog.CatalogManager;
+import org.apache.flink.table.module.ModuleManager;
+import org.apache.flink.table.planner.calcite.FlinkContextImpl;
+import org.apache.flink.table.planner.calcite.FlinkTypeFactory;
+import org.apache.flink.table.planner.functions.sql.FlinkSqlOperatorTable;
+import org.apache.flink.table.types.DataType;
+import org.apache.flink.table.types.logical.LogicalType;
+import org.apache.flink.table.utils.CatalogManagerMocks;
+
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonGenerator;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.core.JsonProcessingException;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper;
+import org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.module.SimpleModule;
+
+import org.junit.Test;
+import org.junit.runner.RunWith;
+import org.junit.runners.Parameterized;
+import org.junit.runners.Parameterized.Parameter;
+import org.junit.runners.Parameterized.Parameters;
+
+import java.io.IOException;
+import java.io.StringWriter;
+import java.util.Arrays;
+import java.util.List;
+
+import static org.assertj.core.api.Assertions.assertThat;
+
+/** Tests for {@link DataType} serialization and deserialization. */
+@RunWith(Parameterized.class)
+public class DataTypeJsonSerdeTest {
+
+    @Parameter public DataType dataType;
+
+    @Test
+    public void testDataTypeSerde() throws IOException {
+        final ObjectMapper mapper = configuredObjectMapper();
+        final String json = toJson(mapper, dataType);
+        final DataType actual = toDataType(mapper, json);
+
+        if (json.contains("children")) {
+            System.out.println();
+        }

Review comment:
       Sorry, my bad.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18274:
URL: https://github.com/apache/flink/pull/18274#issuecomment-1005845834


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=28985",
       "triggerID" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29041",
       "triggerID" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "status" : "SUCCESS",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29092",
       "triggerID" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "triggerType" : "PUSH"
     }, {
       "hash" : "718ea97838ff10191544cc7460d3c18380b0e119",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29112",
       "triggerID" : "718ea97838ff10191544cc7460d3c18380b0e119",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 9f800b453598a8cbf015583929e5fabda9fedbf6 Azure: [SUCCESS](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29092) 
   * 718ea97838ff10191544cc7460d3c18380b0e119 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29112) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18274:
URL: https://github.com/apache/flink/pull/18274#issuecomment-1005845834


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=28985",
       "triggerID" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29041",
       "triggerID" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29092",
       "triggerID" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "triggerType" : "PUSH"
     }, {
       "hash" : "718ea97838ff10191544cc7460d3c18380b0e119",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29112",
       "triggerID" : "718ea97838ff10191544cc7460d3c18380b0e119",
       "triggerType" : "PUSH"
     }, {
       "hash" : "c445781fa99453f887f9d91978e5bb9ca9d1f91a",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "c445781fa99453f887f9d91978e5bb9ca9d1f91a",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 718ea97838ff10191544cc7460d3c18380b0e119 Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29112) 
   * c445781fa99453f887f9d91978e5bb9ca9d1f91a UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18274:
URL: https://github.com/apache/flink/pull/18274#issuecomment-1005845834


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=28985",
       "triggerID" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29041",
       "triggerID" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29092",
       "triggerID" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "triggerType" : "PUSH"
     }, {
       "hash" : "718ea97838ff10191544cc7460d3c18380b0e119",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29112",
       "triggerID" : "718ea97838ff10191544cc7460d3c18380b0e119",
       "triggerType" : "PUSH"
     }, {
       "hash" : "c445781fa99453f887f9d91978e5bb9ca9d1f91a",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29114",
       "triggerID" : "c445781fa99453f887f9d91978e5bb9ca9d1f91a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a872e8c5d8aa822a0358d859a462c979a4965750",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29181",
       "triggerID" : "a872e8c5d8aa822a0358d859a462c979a4965750",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * c445781fa99453f887f9d91978e5bb9ca9d1f91a Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29114) 
   * a872e8c5d8aa822a0358d859a462c979a4965750 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29181) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] flinkbot edited a comment on pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
flinkbot edited a comment on pull request #18274:
URL: https://github.com/apache/flink/pull/18274#issuecomment-1005845834


   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=28985",
       "triggerID" : "dd2c1d149708b916cb05bd2b0580015ae2e1f889",
       "triggerType" : "PUSH"
     }, {
       "hash" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29041",
       "triggerID" : "8b30dbb1acd4bffe0c4c5d25b669705deb19463e",
       "triggerType" : "PUSH"
     }, {
       "hash" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29092",
       "triggerID" : "9f800b453598a8cbf015583929e5fabda9fedbf6",
       "triggerType" : "PUSH"
     }, {
       "hash" : "718ea97838ff10191544cc7460d3c18380b0e119",
       "status" : "DELETED",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29112",
       "triggerID" : "718ea97838ff10191544cc7460d3c18380b0e119",
       "triggerType" : "PUSH"
     }, {
       "hash" : "c445781fa99453f887f9d91978e5bb9ca9d1f91a",
       "status" : "FAILURE",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29114",
       "triggerID" : "c445781fa99453f887f9d91978e5bb9ca9d1f91a",
       "triggerType" : "PUSH"
     }, {
       "hash" : "a872e8c5d8aa822a0358d859a462c979a4965750",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29181",
       "triggerID" : "a872e8c5d8aa822a0358d859a462c979a4965750",
       "triggerType" : "PUSH"
     }, {
       "hash" : "d9ce106cfbce02a10dcddc3a02ddb56a2183a763",
       "status" : "PENDING",
       "url" : "https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29194",
       "triggerID" : "d9ce106cfbce02a10dcddc3a02ddb56a2183a763",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * c445781fa99453f887f9d91978e5bb9ca9d1f91a Azure: [FAILURE](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29114) 
   * a872e8c5d8aa822a0358d859a462c979a4965750 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29181) 
   * d9ce106cfbce02a10dcddc3a02ddb56a2183a763 Azure: [PENDING](https://dev.azure.com/apache-flink/98463496-1af2-4620-8eab-a2ecc1a2e6fe/_build/results?buildId=29194) 
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



[GitHub] [flink] twalthr closed pull request #18274: [FLINK-25230][table-planner] Harden type serialization for LogicalType and DataType

Posted by GitBox <gi...@apache.org>.
twalthr closed pull request #18274:
URL: https://github.com/apache/flink/pull/18274


   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org