Posted to issues@flink.apache.org by GitBox <gi...@apache.org> on 2022/03/18 07:58:37 UTC

[GitHub] [flink] twalthr commented on a change in pull request #18940: [FLINK-26249][table-planner] Run BuiltInFunctionsTestBase and BuiltInAggregateFunctionsTestBase in parallel

twalthr commented on a change in pull request #18940:
URL: https://github.com/apache/flink/pull/18940#discussion_r829748153



##########
File path: flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/functions/BuiltInFunctionTestBase.java
##########
@@ -60,127 +62,68 @@
 import static org.assertj.core.api.Assertions.catchThrowable;
 
 /**
- * Test base for testing {@link BuiltInFunctionDefinition}.
+ * Test interface implementing the logic to execute tests for {@link BuiltInFunctionDefinition}.
+ *
+ * <p>To create a new set of test cases, just create a subclass and implement the method {@link
+ * #getTestCaseSpecs()}.
  *
  * <p>Note: This test base is not the most efficient one. It currently checks the full pipeline
  * end-to-end. If the testing time is too long, we can change the underlying implementation easily
- * without touching the defined {@link TestSpec}s.
+ * without touching the defined {@link TestSetSpec}s.
  */
-@RunWith(Parameterized.class)
-public abstract class BuiltInFunctionTestBase {
+@Execution(ExecutionMode.CONCURRENT)
+@TestInstance(TestInstance.Lifecycle.PER_CLASS)
+@ExtendWith(MiniClusterExtension.class)
+abstract class BuiltInFunctionTestBase {
 
-    @ClassRule
-    public static MiniClusterWithClientResource miniClusterResource =
-            new MiniClusterWithClientResource(
-                    new MiniClusterResourceConfiguration.Builder()
-                            .setNumberTaskManagers(1)
-                            .setNumberSlotsPerTaskManager(1)
-                            .build());
+    Configuration getConfiguration() {
+        return new Configuration();
+    }
 
-    @Parameter public TestSpec testSpec;
+    abstract Stream<TestSetSpec> getTestCaseSpecs();

Review comment:
       `getTestSetSpecs`

##########
File path: flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/functions/BuiltInFunctionTestBase.java
##########
@@ -342,18 +286,53 @@ TestSpec testResult(
             return this;
         }
 
+        Stream<TestCase> getTestCases(Configuration configuration) {
+            return testItems.stream().map(testItem -> getTestCase(configuration, testItem));
+        }
+
+        private TestCase getTestCase(Configuration configuration, TestItem testItem) {
+            return new TestCase(
+                    testItem.toString(),
+                    () -> {
+                        final TableEnvironmentInternal env =

Review comment:
       I don't remember: is there an actual reason for `TableEnvironmentInternal`? Otherwise, let's expose only `TableEnvironment`.
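
       The point here, keeping the internal cast local and exposing only the public interface to callers, can be sketched in plain Java. All names below are hypothetical stand-ins, not the actual Flink `TableEnvironment` types:

```java
// Sketch of the reviewer's point: keep the internal cast local and expose
// only the public interface. Env/EnvInternal are hypothetical stand-ins,
// not the actual Flink TableEnvironment types.
public class Main {
    interface Env { String describe(); }                // akin to TableEnvironment
    interface EnvInternal extends Env { void tune(); }  // akin to TableEnvironmentInternal

    static final class DefaultEnv implements EnvInternal {
        public String describe() { return "env"; }
        public void tune() { /* internal-only setup knob */ }
    }

    // The factory does internal setup behind the scenes but returns only
    // the public type, so EnvInternal never leaks into caller signatures.
    static Env create() {
        EnvInternal env = new DefaultEnv();
        env.tune();
        return env;
    }

    public static void main(String[] args) {
        System.out.println(create().describe());
    }
}
```

       Applied to the diff, this would mean the cast in `getTestCase` stays local to the setup code while `TestItem.test` receives a plain `TableEnvironment`.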

##########
File path: flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/functions/BuiltInFunctionTestBase.java
##########
@@ -342,18 +286,53 @@ TestSpec testResult(
             return this;
         }
 
+        Stream<TestCase> getTestCases(Configuration configuration) {

Review comment:
       private?

##########
File path: flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/functions/BuiltInFunctionTestBase.java
##########
@@ -342,18 +286,53 @@ TestSpec testResult(
             return this;
         }
 
+        Stream<TestCase> getTestCases(Configuration configuration) {
+            return testItems.stream().map(testItem -> getTestCase(configuration, testItem));
+        }
+
+        private TestCase getTestCase(Configuration configuration, TestItem testItem) {
+            return new TestCase(
+                    testItem.toString(),
+                    () -> {
+                        final TableEnvironmentInternal env =
+                                (TableEnvironmentInternal)
+                                        TableEnvironment.create(
+                                                EnvironmentSettings.newInstance().build());
+                        env.getConfig().addConfiguration(configuration);
+
+                        functions.forEach(
+                                f -> env.createTemporarySystemFunction(f.getSimpleName(), f));
+
+                        final Table inputTable;
+                        if (fieldDataTypes == null) {
+                            inputTable = env.fromValues(Row.of(fieldData));
+                        } else {
+                            final DataTypes.UnresolvedField[] fields =
+                                    IntStream.range(0, fieldDataTypes.length)
+                                            .mapToObj(
+                                                    i ->
+                                                            DataTypes.FIELD(
+                                                                    "f" + i, fieldDataTypes[i]))
+                                            .toArray(DataTypes.UnresolvedField[]::new);
+                            inputTable = env.fromValues(DataTypes.ROW(fields), Row.of(fieldData));
+                        }
+
+                        testItem.test(env, inputTable);
+                    });
+        }
+
         @Override
         public String toString() {
             return (definition != null ? definition.getName() : "Expression")
                     + (description != null ? " : " + description : "");
         }
     }
 
-    private interface TestItem {
-        // marker interface
+    interface TestItem {

Review comment:
       why not private? also below
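
       The `getTestCases`/`getTestCase` pattern in the diff above, one lazily-built test case per item produced via a `Stream`, can be sketched with plain JDK types. `TestItem` and `TestCase` below are minimal stand-ins for the package-private types in the diff:

```java
// Plain-JDK sketch of the diff's getTestCases pattern: one lazily-built
// TestCase per TestItem, produced via a Stream. TestItem/TestCase here are
// minimal stand-ins for the (package-private) types in the diff.
import java.util.List;
import java.util.stream.Stream;

public class Main {
    interface TestItem { void test(); }
    record TestCase(String name, Runnable executable) {}

    // Mirrors getTestCases(Configuration): map each item to a named,
    // runnable case without executing anything yet.
    static Stream<TestCase> getTestCases(List<TestItem> testItems) {
        return testItems.stream()
                .map(item -> new TestCase(item.toString(), item::test));
    }

    public static void main(String[] args) {
        TestItem item = new TestItem() {
            @Override public void test() { System.out.println("ran"); }
            @Override public String toString() { return "sample-item"; }
        };
        List<TestCase> cases = getTestCases(List.of(item)).toList();
        System.out.println(cases.size());
        System.out.println(cases.get(0).name());
        cases.get(0).executable().run();
    }
}
```

       Nothing runs when the stream is built; each case executes only when its `Runnable` is invoked, which is what lets a JUnit 5 dynamic-test runner schedule them concurrently.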

##########
File path: flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/runtime/stream/sql/FunctionITCase.java
##########
@@ -72,7 +71,8 @@
  * Tests for catalog and system functions in a table environment.
  *
  * <p>Note: This class is meant for testing the core function support. Use {@link
- * BuiltInFunctionTestBase} for testing individual function implementations.
+ * org.apache.flink.table.planner.functions.BuiltInFunctionTestBase} for testing individual function

Review comment:
       Undo this change

##########
File path: flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/functions/FieldAccessFromTableITCase.java
##########
@@ -0,0 +1,111 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.functions;
+
+import org.apache.flink.table.functions.BuiltInFunctionDefinitions;
+import org.apache.flink.types.Row;
+
+import java.util.stream.Stream;
+
+import static java.util.Collections.singletonMap;
+import static org.apache.flink.table.api.DataTypes.ARRAY;
+import static org.apache.flink.table.api.DataTypes.BIGINT;
+import static org.apache.flink.table.api.DataTypes.FIELD;
+import static org.apache.flink.table.api.DataTypes.MAP;
+import static org.apache.flink.table.api.DataTypes.ROW;
+import static org.apache.flink.table.api.DataTypes.STRING;
+import static org.apache.flink.table.api.Expressions.$;
+
+/**
+ * Regular tests. See also {@link ConstructedAccessFunctionsITCase} for tests that access a nested
+ * field of an expression or for {@link BuiltInFunctionDefinitions#FLATTEN} which produces multiple
+ * columns from a single one.
+ */
+class FieldAccessFromTableITCase extends BuiltInFunctionTestBase {

Review comment:
       What is the current status of the MiniCluster? Will it be reused across tests nowadays? In the past the guideline was not to create too many classes, because the cluster init is expensive. This is why we summarized functions into one class in the past.
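
       The sharing the reviewer asks about is, in essence, a JVM-wide holder for an expensive resource. The sketch below illustrates that idiom with hypothetical stand-ins; it is not the actual `MiniClusterExtension` implementation, whose storage scope determines the real reuse behavior:

```java
// Hypothetical sketch (not the actual MiniClusterExtension implementation):
// a JVM-wide holder lets many test classes reuse one expensive resource,
// so the costly startup happens once rather than once per class.
import java.util.concurrent.atomic.AtomicInteger;

public class Main {
    static final AtomicInteger STARTS = new AtomicInteger();

    // Stand-in for an expensive cluster resource.
    static class FakeCluster {
        FakeCluster() { STARTS.incrementAndGet(); } // count real startups
        String submit(String job) { return "ok:" + job; }
    }

    // Initialization-on-demand holder: one instance per JVM, created lazily.
    private static class Holder { static final FakeCluster INSTANCE = new FakeCluster(); }
    static FakeCluster cluster() { return Holder.INSTANCE; }

    public static void main(String[] args) {
        // Two "test classes" asking for the cluster get the same instance.
        System.out.println(cluster().submit("a"));
        System.out.println(cluster().submit("b"));
        System.out.println(STARTS.get());
    }
}
```

       If the extension stores its cluster in a root-level context this way, splitting tests across many classes no longer multiplies the startup cost.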

##########
File path: flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/functions/CoalesceFunctionITCase.java
##########
@@ -20,23 +20,20 @@
 
 import org.apache.flink.table.functions.BuiltInFunctionDefinitions;
 
-import org.junit.runners.Parameterized;
-
-import java.util.Collections;
-import java.util.List;
+import java.util.stream.Stream;
 
 import static org.apache.flink.table.api.DataTypes.BIGINT;
 import static org.apache.flink.table.api.DataTypes.INT;
 import static org.apache.flink.table.api.Expressions.$;
 import static org.apache.flink.table.api.Expressions.coalesce;
 
 /** Test {@link BuiltInFunctionDefinitions#COALESCE} and its return type. */
-public class CoalesceFunctionITCase extends BuiltInFunctionTestBase {
+class CoalesceFunctionITCase extends BuiltInFunctionTestBase {
 
-    @Parameterized.Parameters(name = "{index}: {0}")
-    public static List<TestSpec> testData() {
-        return Collections.singletonList(
-                TestSpec.forFunction(BuiltInFunctionDefinitions.COALESCE)
+    @Override
+    public Stream<TestSetSpec> getTestCaseSpecs() {

Review comment:
       could also be default scoped?

##########
File path: flink-table/flink-table-planner/src/test/java/org/apache/flink/table/planner/functions/FieldAccessFromTableITCase.java
##########
@@ -0,0 +1,111 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *     http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.flink.table.planner.functions;
+
+import org.apache.flink.table.functions.BuiltInFunctionDefinitions;
+import org.apache.flink.types.Row;
+
+import java.util.stream.Stream;
+
+import static java.util.Collections.singletonMap;
+import static org.apache.flink.table.api.DataTypes.ARRAY;
+import static org.apache.flink.table.api.DataTypes.BIGINT;
+import static org.apache.flink.table.api.DataTypes.FIELD;
+import static org.apache.flink.table.api.DataTypes.MAP;
+import static org.apache.flink.table.api.DataTypes.ROW;
+import static org.apache.flink.table.api.DataTypes.STRING;
+import static org.apache.flink.table.api.Expressions.$;
+
+/**
+ * Regular tests. See also {@link ConstructedAccessFunctionsITCase} for tests that access a nested
+ * field of an expression or for {@link BuiltInFunctionDefinitions#FLATTEN} which produces multiple
+ * columns from a single one.
+ */
+class FieldAccessFromTableITCase extends BuiltInFunctionTestBase {
+
+    @Override
+    public Stream<TestSetSpec> getTestCaseSpecs() {
+        return Stream.of(
+
+                // Actually in case of SQL it does not use the GET method, but
+                // a custom logic for accessing nested fields of a Table.
+                TestSetSpec.forFunction(BuiltInFunctionDefinitions.GET)
+                        .onFieldsWithData(null, Row.of(1))
+                        .andDataTypes(
+                                ROW(FIELD("nested", BIGINT().notNull())).nullable(),
+                                ROW(FIELD("nested", BIGINT().notNull())).notNull())
+                        .testResult(
+                                resultSpec(
+                                        $("f0").get("nested"),
+                                        "f0.nested",
+                                        null,
+                                        BIGINT().nullable()),
+                                resultSpec(
+                                        $("f1").get("nested"),
+                                        "f1.nested",
+                                        1L,
+                                        BIGINT().notNull())),
+
+                // In Calcite it maps to FlinkSqlOperatorTable.ITEM
+                TestSetSpec.forFunction(BuiltInFunctionDefinitions.AT)
+                        .onFieldsWithData(
+                                null,
+                                new int[] {1},
+                                null,
+                                singletonMap("nested", 1),
+                                null,
+                                Row.of(1))
+                        .andDataTypes(
+                                ARRAY(BIGINT().notNull()).nullable(),
+                                ARRAY(BIGINT().notNull()).notNull(),
+                                MAP(STRING(), BIGINT().notNull()).nullable(),
+                                MAP(STRING(), BIGINT().notNull()).notNull(),
+                                ROW(FIELD("nested", BIGINT().notNull())).nullable(),
+                                ROW(FIELD("nested", BIGINT().notNull())).notNull())
+                        // accessing elements of MAP or ARRAY is a runtime operation;
+                        // we do not know the size or contents during inference,
+                        // therefore the results are always nullable
+                        .testResult(
+                                resultSpec($("f0").at(1), "f0[1]", null, BIGINT().nullable()),

Review comment:
       Remove the outer `resultSpec(...)`; `testResult` can take it directly. Also in other tests.




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
