Posted to issues@flink.apache.org by GitBox <gi...@apache.org> on 2019/11/29 18:02:04 UTC

[GitHub] [flink] twalthr commented on a change in pull request #10342: [FLINK-14967][table] Add a utility for creating data types via reflection

URL: https://github.com/apache/flink/pull/10342#discussion_r352218595
 
 

 ##########
 File path: flink-table/flink-table-common/src/main/java/org/apache/flink/table/annotation/DataTypeHint.java
 ##########
 @@ -0,0 +1,241 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.annotation;
+
+import org.apache.flink.annotation.PublicEvolving;
+import org.apache.flink.api.common.typeutils.TypeSerializer;
+import org.apache.flink.table.api.DataTypes;
+import org.apache.flink.table.types.DataType;
+import org.apache.flink.table.types.inference.TypeInference;
+import org.apache.flink.table.types.logical.LogicalType;
+
+import java.lang.annotation.ElementType;
+import java.lang.annotation.Retention;
+import java.lang.annotation.RetentionPolicy;
+import java.lang.annotation.Target;
+
+/**
+ * A hint that influences the reflection-based extraction of a {@link DataType}.
+ *
+ * <p>Data type hints can parameterize or replace the default extraction logic of individual function parameters
+ * and return types, structured classes, or fields of structured classes. An implementer can choose to
+ * what extent the default extraction logic should be modified.
+ *
+ * <p>The following examples show how to explicitly specify data types, how to parameterize the extraction
+ * logic, or how to accept any data type as an input data type:
+ *
+ * <p>{@code @DataTypeHint("INT")} defines an INT data type with a default conversion class.
+ *
+ * <p>{@code @DataTypeHint(value = "TIMESTAMP(3)", bridgedTo = java.sql.Timestamp.class)} defines a TIMESTAMP
+ * data type of millisecond precision with an explicit conversion class.
+ *
+ * <p>{@code @DataTypeHint(value = "RAW", rawSerializer = MyCustomSerializer.class)} defines a RAW data type
+ * with a custom serializer class.
+ *
+ * <p>{@code @DataTypeHint(version = V1, allowRawGlobally = TRUE)} parameterizes the extraction by requesting
+ * extraction logic version 1 and allowing the RAW data type in this structured type (and possibly
+ * nested fields).
+ *
+ * <p>{@code @DataTypeHint(bridgedTo = MyPojo.class, allowRawGlobally = TRUE)} defines that a type should be
+ * extracted from the given conversion class but with parameterized extraction for allowing RAW types.
+ *
+ * <p>{@code @DataTypeHint(inputGroup = ANY)} defines that the input validation should accept any
+ * data type.
+ *
+ * <p>Note: All hint parameters are optional. Hint parameters defined on top of a structured type are
+ * inherited by all (deeply) nested fields unless annotated differently. For example, all occurrences of
+ * {@link java.math.BigDecimal} will be extracted as {@code DECIMAL(12, 2)} if the enclosing structured
+ * class is annotated with {@code @DataTypeHint(defaultDecimalPrecision = 12, defaultDecimalScale = 2)}. Individual
+ * field annotations make it possible to deviate from those default values.
+ */
+@PublicEvolving
+@Retention(RetentionPolicy.RUNTIME)
+@Target({ElementType.TYPE, ElementType.METHOD, ElementType.FIELD, ElementType.PARAMETER})
+public @interface DataTypeHint {
+
+	// Note to implementers:
+	// Because "null" is not supported as an annotation value, every annotation parameter has
+	// some representation for unknown values in order to merge multi-level annotations.
+
+	// --------------------------------------------------------------------------------------------
+	// Explicit data type specification
+	// --------------------------------------------------------------------------------------------
+
+	/**
+	 * The explicit string representation of a data type. See {@link DataTypes} for a list of supported
+	 * data types. For example, {@code INT} for an integer data type or {@code DECIMAL(12, 5)} for a
+	 * decimal data type with precision 12 and scale 5.
+	 *
+	 * <p>Use an unparameterized {@code RAW} string for explicitly declaring an opaque data type. For
+	 * Flink's default RAW serializer, use {@code @DataTypeHint("RAW")}. For a custom RAW serializer,
+	 * use {@code @DataTypeHint(value = "RAW", rawSerializer = MyCustomSerializer.class)}.
+	 *
+	 * <p>By default, the empty string represents an undefined data type, which means that the type
+	 * will be derived automatically.
+	 *
+	 * <p>Use {@link #inputGroup()} for accepting a group of similar data types if this hint is used
+	 * to enrich input arguments.
+	 *
+	 * @see LogicalType#asSerializableString()
+	 * @see DataTypes
+	 */
+	String value() default "";
+
+	/**
+	 * Adds a hint that data should be represented using the given class when entering or leaving
+	 * the table ecosystem.
+	 *
+	 * <p>If an explicit data type has been defined via {@link #value()}, the supported conversion classes
+	 * depend on the logical type and its nullability property.
+	 *
+	 * <p>If an explicit data type has not been defined via {@link #value()}, this class is used for
+	 * reflective extraction of a data type.
+	 *
+	 * <p>Please see the implementation of {@link LogicalType#supportsInputConversion(Class)},
+	 * {@link LogicalType#supportsOutputConversion(Class)}, or the documentation for more information
+	 * about supported conversions.
+	 *
+	 * <p>By default, the conversion class is reflectively extracted.
+	 *
+	 * @see DataType#bridgedTo(Class)
+	 */
+	Class<?> bridgedTo() default void.class;
+
+	/**
+	 * Adds a hint that defines a custom serializer that should be used for serializing and deserializing
+	 * opaque RAW types. It is used if {@link #value()} is explicitly defined as an unparameterized {@code RAW}
+	 * string or if (possibly nested) fields in a structured type need to be handled as an opaque type.
+	 *
+	 * <p>By default, Flink's default RAW serializer is used.
+	 *
+	 * @see DataTypes#RAW(Class, TypeSerializer)
+	 */
+	Class<?> rawSerializer() default void.class;
+
+	// --------------------------------------------------------------------------------------------
+	// Group of data types specification
+	// --------------------------------------------------------------------------------------------
+
+	/**
+	 * This hint influences the extraction of a {@link TypeInference} in functions. It adds a hint for
+	 * accepting pre-defined groups of similar types, i.e., more than just one explicit data type.
+	 *
+	 * <p>Note: This annotation is only allowed as a top-level hint and is ignored within nested
 
 Review comment:
   I will improve the documentation here.
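For readers unfamiliar with the sentinel pattern mentioned in the implementer note (annotation parameters cannot be {@code null}, so {@code ""} and {@code void.class} stand in for "unset"), here is a self-contained sketch. {@code MyHint} and {@code describe} below are hypothetical stand-ins for illustration, not Flink code:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Field;

public class HintSentinelDemo {

	// Hypothetical stand-in for DataTypeHint. The defaults "" and void.class
	// act as "unset" sentinels because annotation values cannot be null.
	@Retention(RetentionPolicy.RUNTIME) // must survive to runtime for reflective extraction
	@Target(ElementType.FIELD)
	@interface MyHint {
		String value() default "";               // "" means: derive the type automatically
		Class<?> bridgedTo() default void.class; // void.class means: no explicit conversion class
	}

	static class Pojo {
		@MyHint("DECIMAL(12, 2)")
		java.math.BigDecimal amount;

		java.math.BigDecimal unhinted;
	}

	// Mimics how an extractor might consult the hint before falling back
	// to reflection-based derivation.
	static String describe(Field field) {
		MyHint hint = field.getAnnotation(MyHint.class);
		if (hint == null || hint.value().isEmpty()) {
			return "derive automatically"; // sentinel detected: no explicit type given
		}
		return hint.value();
	}

	public static void main(String[] args) throws Exception {
		System.out.println(describe(Pojo.class.getDeclaredField("amount")));
		System.out.println(describe(Pojo.class.getDeclaredField("unhinted")));
	}
}
```

Running this prints the explicit type string for the annotated field and the fallback for the unannotated one, which is the merge behavior the note alludes to.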

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services