Posted to issues@flink.apache.org by GitBox <gi...@apache.org> on 2022/04/14 04:26:18 UTC

[GitHub] [flink] snuyanzin opened a new pull request, #19467: [FLINK-25548][flink-sql-parser] Migrate tests to JUnit5

snuyanzin opened a new pull request, #19467:
URL: https://github.com/apache/flink/pull/19467

   ## What is the purpose of the change
   
   Update the flink-table/flink-sql-parser and flink-table/flink-sql-parser-hive modules to AssertJ and JUnit 5 following the [JUnit 5 Migration Guide](https://docs.google.com/document/d/1514Wa_aNB9bJUen4xm5uiuXOooOJTtXqS_Jqk9KJitU/edit)
   
   ## Brief change log
   
   Use JUnit 5 and AssertJ in tests instead of JUnit 4 and Hamcrest
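   
   For illustration only (this sketch is not code from the PR; `SomeParserTest` and `parse` are hypothetical placeholders), the general shape of the migration looks roughly like this:
   
   ```java
   // Before (JUnit 4 + Hamcrest), roughly:
   //
   //   @Test
   //   public void testParse() {
   //       assertThat(parse("CHAR"), notNullValue());
   //   }
   //
   // After (JUnit 5 + AssertJ):
   import org.junit.jupiter.api.Test;
   
   import static org.assertj.core.api.Assertions.assertThat;
   
   class SomeParserTest {
   
       @Test
       void testParse() {
           assertThat(parse("CHAR")).isNotNull();
       }
   
       private String parse(String ddl) {
           return ddl; // placeholder for the real parser call
       }
   }
   ```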
   
   ## Verifying this change
   
   This change is a code cleanup without any test coverage.
   
   ## Does this pull request potentially affect one of the following parts:
   
     - Dependencies (does it add or upgrade a dependency): (no)
     - The public API, i.e., is any changed class annotated with `@Public(Evolving)`: (no)
     - The serializers: (no)
     - The runtime per-record code paths (performance sensitive): (no)
     - Anything that affects deployment or recovery: JobManager (and its components), Checkpointing, Kubernetes/Yarn, ZooKeeper: (no)
     - The S3 file system connector: (no)
   
   ## Documentation
   
     - Does this pull request introduce a new feature? (no)
     - If yes, how is the feature documented? (not applicable)
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [flink] snuyanzin commented on pull request #19467: [FLINK-25548][flink-sql-parser] Migrate tests to JUnit5

Posted by GitBox <gi...@apache.org>.
snuyanzin commented on PR #19467:
URL: https://github.com/apache/flink/pull/19467#issuecomment-1099101498

   @flinkbot run azure




[GitHub] [flink] zentol merged pull request #19467: [FLINK-25548][flink-sql-parser] Migrate tests to JUnit5

Posted by GitBox <gi...@apache.org>.
zentol merged PR #19467:
URL: https://github.com/apache/flink/pull/19467




[GitHub] [flink] snuyanzin commented on a diff in pull request #19467: [FLINK-25548][flink-sql-parser] Migrate tests to JUnit5

Posted by GitBox <gi...@apache.org>.
snuyanzin commented on code in PR #19467:
URL: https://github.com/apache/flink/pull/19467#discussion_r854370506


##########
flink-table/flink-sql-parser/src/test/java/org/apache/flink/sql/parser/FlinkDDLDataTypeTest.java:
##########
@@ -52,195 +52,203 @@
 import org.apache.calcite.test.catalog.MockCatalogReaderSimple;
 import org.apache.calcite.util.SourceStringReader;
 import org.apache.calcite.util.Util;
-import org.junit.Test;
-import org.junit.runner.RunWith;
-import org.junit.runners.Parameterized;
+import org.junit.jupiter.params.ParameterizedTest;
+import org.junit.jupiter.params.provider.Arguments;
+import org.junit.jupiter.params.provider.MethodSource;
 
 import javax.annotation.Nullable;
 
 import java.util.Arrays;
 import java.util.Collections;
 import java.util.HashMap;
-import java.util.List;
 import java.util.Map;
+import java.util.stream.Stream;
 
 import static org.assertj.core.api.Assertions.assertThat;
+import static org.junit.jupiter.params.provider.Arguments.of;
 
 /** Tests for all the supported Flink DDL data types. */
-@RunWith(Parameterized.class)
 public class FlinkDDLDataTypeTest {
     private static final Fixture FIXTURE = new Fixture(TestFactory.INSTANCE.getTypeFactory());
     private static final String DDL_FORMAT =
             "create table t1 (\n" + "  f0 %s\n" + ") with (\n" + "  'k1' = 'v1'\n" + ")";
 
-    @Parameterized.Parameters(name = "{index}: {0}")
-    public static List<TestItem> testData() {
-        return Arrays.asList(
-                createTestItem("CHAR", nullable(FIXTURE.char1Type), "CHAR"),
-                createTestItem("CHAR NOT NULL", FIXTURE.char1Type, "CHAR NOT NULL"),
-                createTestItem("CHAR   NOT \t\nNULL", FIXTURE.char1Type, "CHAR NOT NULL"),
-                createTestItem("char not null", FIXTURE.char1Type, "CHAR NOT NULL"),
-                createTestItem("CHAR NULL", nullable(FIXTURE.char1Type), "CHAR"),
-                createTestItem("CHAR(33)", nullable(FIXTURE.char33Type), "CHAR(33)"),
-                createTestItem("VARCHAR", nullable(FIXTURE.varcharType), "VARCHAR"),
-                createTestItem("VARCHAR(33)", nullable(FIXTURE.varchar33Type), "VARCHAR(33)"),
-                createTestItem(
+    public static Stream<Arguments> testData() {

Review Comment:
   Yes, you're right, I set this and other similar ones to package-private.
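   
   For illustration only (not the exact code from this PR; the class and data below are made up): with JUnit 5 a `@MethodSource` provider only needs to be static and visible to the test class, so package-private access is sufficient.
   
   ```java
   // Minimal sketch with hypothetical names; only the visibility pattern matters here.
   import java.util.stream.Stream;
   
   import org.junit.jupiter.params.ParameterizedTest;
   import org.junit.jupiter.params.provider.Arguments;
   import org.junit.jupiter.params.provider.MethodSource;
   
   import static org.assertj.core.api.Assertions.assertThat;
   
   class ExampleDataTypeTest {
   
       // Package-private is enough: JUnit 5 only requires the provider to be static
       // (unless the test class uses @TestInstance(Lifecycle.PER_CLASS)).
       static Stream<Arguments> testData() {
           return Stream.of(
                   Arguments.of("CHAR", "CHAR"),
                   Arguments.of("VARCHAR(33)", "VARCHAR(33)"));
       }
   
       @ParameterizedTest
       @MethodSource("testData")
       void testRoundTrip(String input, String expected) {
           assertThat(input).isEqualTo(expected);
       }
   }
   ```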



##########
flink-table/flink-sql-parser/src/test/java/org/apache/flink/sql/parser/FlinkDDLDataTypeTest.java:
##########
@@ -52,195 +52,203 @@
 import org.apache.calcite.test.catalog.MockCatalogReaderSimple;
 import org.apache.calcite.util.SourceStringReader;
 import org.apache.calcite.util.Util;
-import org.junit.Test;
-import org.junit.runner.RunWith;
-import org.junit.runners.Parameterized;
+import org.junit.jupiter.params.ParameterizedTest;
+import org.junit.jupiter.params.provider.Arguments;
+import org.junit.jupiter.params.provider.MethodSource;
 
 import javax.annotation.Nullable;
 
 import java.util.Arrays;
 import java.util.Collections;
 import java.util.HashMap;
-import java.util.List;
 import java.util.Map;
+import java.util.stream.Stream;
 
 import static org.assertj.core.api.Assertions.assertThat;
+import static org.junit.jupiter.params.provider.Arguments.of;
 
 /** Tests for all the supported Flink DDL data types. */
-@RunWith(Parameterized.class)
 public class FlinkDDLDataTypeTest {

Review Comment:
   done





[GitHub] [flink] snuyanzin commented on a diff in pull request #19467: [FLINK-25548][flink-sql-parser] Migrate tests to JUnit5

Posted by GitBox <gi...@apache.org>.
snuyanzin commented on code in PR #19467:
URL: https://github.com/apache/flink/pull/19467#discussion_r854370994


##########
flink-table/flink-sql-parser/src/test/java/org/apache/flink/sql/parser/TableApiIdentifierParsingTest.java:
##########
@@ -26,49 +26,44 @@
 import org.apache.calcite.sql.SqlIdentifier;
 import org.apache.calcite.sql.parser.SqlAbstractParserImpl;
 import org.apache.calcite.util.SourceStringReader;
-import org.junit.Test;
-import org.junit.runner.RunWith;
-import org.junit.runners.Parameterized;
+import org.junit.jupiter.params.ParameterizedTest;
+import org.junit.jupiter.params.provider.Arguments;
+import org.junit.jupiter.params.provider.MethodSource;
 
 import java.util.List;
+import java.util.stream.Stream;
 
 import static java.util.Arrays.asList;
 import static java.util.Collections.singletonList;
 import static org.assertj.core.api.Assertions.assertThat;
+import static org.junit.jupiter.params.provider.Arguments.of;
 
 /** Tests for parsing a Table API specific SqlIdentifier. */
-@RunWith(Parameterized.class)
 public class TableApiIdentifierParsingTest {

Review Comment:
   done





[GitHub] [flink] snuyanzin commented on pull request #19467: [FLINK-25548][flink-sql-parser] Migrate tests to JUnit5

Posted by GitBox <gi...@apache.org>.
snuyanzin commented on PR #19467:
URL: https://github.com/apache/flink/pull/19467#issuecomment-1103609846

   @zentol sorry for the poke
   Since you are one of the committers who deal with the migration to JUnit 5, could you please have a look here once you have time?




[GitHub] [flink] snuyanzin commented on pull request #19467: [FLINK-25548][flink-sql-parser] Migrate tests to JUnit5

Posted by GitBox <gi...@apache.org>.
snuyanzin commented on PR #19467:
URL: https://github.com/apache/flink/pull/19467#issuecomment-1104380485

   Thanks for the feedback
   I addressed your comments: added `TestLoggerExtension` and removed the unnecessary `public` modifiers.
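   
   For illustration only: the snippet below assumes `TestLoggerExtension` comes from Flink's shared test utilities, and `SomeMigratedTest` is a hypothetical class name. Per-class registration might look like this:
   
   ```java
   // Sketch only: the TestLoggerExtension import path is assumed, and
   // SomeMigratedTest is a hypothetical test class.
   import org.apache.flink.util.TestLoggerExtension;
   
   import org.junit.jupiter.api.Test;
   import org.junit.jupiter.api.extension.ExtendWith;
   
   import static org.assertj.core.api.Assertions.assertThat;
   
   @ExtendWith(TestLoggerExtension.class)
   class SomeMigratedTest {
   
       @Test
       void smokeTest() {
           assertThat("JUnit 5").contains("5");
       }
   }
   ```
   
   The extension can also be registered once for a whole module rather than per class; which variant the PR uses is not shown in this thread.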




[GitHub] [flink] snuyanzin commented on a diff in pull request #19467: [FLINK-25548][flink-sql-parser] Migrate tests to JUnit5

Posted by GitBox <gi...@apache.org>.
snuyanzin commented on code in PR #19467:
URL: https://github.com/apache/flink/pull/19467#discussion_r854370265


##########
flink-table/flink-sql-parser/src/test/java/org/apache/flink/sql/parser/TableApiIdentifierParsingTest.java:
##########
@@ -26,49 +26,44 @@
 import org.apache.calcite.sql.SqlIdentifier;
 import org.apache.calcite.sql.parser.SqlAbstractParserImpl;
 import org.apache.calcite.util.SourceStringReader;
-import org.junit.Test;
-import org.junit.runner.RunWith;
-import org.junit.runners.Parameterized;
+import org.junit.jupiter.params.ParameterizedTest;
+import org.junit.jupiter.params.provider.Arguments;
+import org.junit.jupiter.params.provider.MethodSource;
 
 import java.util.List;
+import java.util.stream.Stream;
 
 import static java.util.Arrays.asList;
 import static java.util.Collections.singletonList;
 import static org.assertj.core.api.Assertions.assertThat;
+import static org.junit.jupiter.params.provider.Arguments.of;
 
 /** Tests for parsing a Table API specific SqlIdentifier. */
-@RunWith(Parameterized.class)
 public class TableApiIdentifierParsingTest {
 
     private static final String ANTHROPOS_IN_GREEK_IN_UNICODE =
             "#03B1#03BD#03B8#03C1#03C9#03C0#03BF#03C2";
     private static final String ANTHROPOS_IN_GREEK = "ανθρωπος";
 
-    @Parameterized.Parameters(name = "Parsing: {0}. Expected identifier: {1}")
-    public static Object[][] parameters() {
-        return new Object[][] {
-            new Object[] {"array", singletonList("array")},
-            new Object[] {"table", singletonList("table")},
-            new Object[] {"cat.db.array", asList("cat", "db", "array")},
-            new Object[] {"`cat.db`.table", asList("cat.db", "table")},
-            new Object[] {"db.table", asList("db", "table")},
-            new Object[] {"`ta``ble`", singletonList("ta`ble")},
-            new Object[] {"`c``at`.`d``b`.`ta``ble`", asList("c`at", "d`b", "ta`ble")},
-            new Object[] {
-                "db.U&\"" + ANTHROPOS_IN_GREEK_IN_UNICODE + "\" UESCAPE '#'",
-                asList("db", ANTHROPOS_IN_GREEK)
-            },
-            new Object[] {"db.ανθρωπος", asList("db", ANTHROPOS_IN_GREEK)}
-        };
+    public static Stream<Arguments> parameters() {

Review Comment:
   Yes, you're right, I set this and other similar ones to package-private.





[GitHub] [flink] flinkbot commented on pull request #19467: [FLINK-25548][flink-sql-parser] Migrate tests to JUnit5

Posted by GitBox <gi...@apache.org>.
flinkbot commented on PR #19467:
URL: https://github.com/apache/flink/pull/19467#issuecomment-1098700583

   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "b4ef337281b169e720595fded2c78dee0f425341",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "b4ef337281b169e720595fded2c78dee0f425341",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * b4ef337281b169e720595fded2c78dee0f425341 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>




[GitHub] [flink] zentol commented on a diff in pull request #19467: [FLINK-25548][flink-sql-parser] Migrate tests to JUnit5

Posted by GitBox <gi...@apache.org>.
zentol commented on code in PR #19467:
URL: https://github.com/apache/flink/pull/19467#discussion_r853987636


##########
flink-table/flink-sql-parser/src/test/java/org/apache/flink/sql/parser/TableApiIdentifierParsingTest.java:
##########
@@ -26,49 +26,44 @@
 import org.apache.calcite.sql.SqlIdentifier;
 import org.apache.calcite.sql.parser.SqlAbstractParserImpl;
 import org.apache.calcite.util.SourceStringReader;
-import org.junit.Test;
-import org.junit.runner.RunWith;
-import org.junit.runners.Parameterized;
+import org.junit.jupiter.params.ParameterizedTest;
+import org.junit.jupiter.params.provider.Arguments;
+import org.junit.jupiter.params.provider.MethodSource;
 
 import java.util.List;
+import java.util.stream.Stream;
 
 import static java.util.Arrays.asList;
 import static java.util.Collections.singletonList;
 import static org.assertj.core.api.Assertions.assertThat;
+import static org.junit.jupiter.params.provider.Arguments.of;
 
 /** Tests for parsing a Table API specific SqlIdentifier. */
-@RunWith(Parameterized.class)
 public class TableApiIdentifierParsingTest {
 
     private static final String ANTHROPOS_IN_GREEK_IN_UNICODE =
             "#03B1#03BD#03B8#03C1#03C9#03C0#03BF#03C2";
     private static final String ANTHROPOS_IN_GREEK = "ανθρωπος";
 
-    @Parameterized.Parameters(name = "Parsing: {0}. Expected identifier: {1}")
-    public static Object[][] parameters() {
-        return new Object[][] {
-            new Object[] {"array", singletonList("array")},
-            new Object[] {"table", singletonList("table")},
-            new Object[] {"cat.db.array", asList("cat", "db", "array")},
-            new Object[] {"`cat.db`.table", asList("cat.db", "table")},
-            new Object[] {"db.table", asList("db", "table")},
-            new Object[] {"`ta``ble`", singletonList("ta`ble")},
-            new Object[] {"`c``at`.`d``b`.`ta``ble`", asList("c`at", "d`b", "ta`ble")},
-            new Object[] {
-                "db.U&\"" + ANTHROPOS_IN_GREEK_IN_UNICODE + "\" UESCAPE '#'",
-                asList("db", ANTHROPOS_IN_GREEK)
-            },
-            new Object[] {"db.ανθρωπος", asList("db", ANTHROPOS_IN_GREEK)}
-        };
+    public static Stream<Arguments> parameters() {

Review Comment:
   does this need to be public?



##########
flink-table/flink-sql-parser/src/test/java/org/apache/flink/sql/parser/TableApiIdentifierParsingTest.java:
##########
@@ -26,49 +26,44 @@
 import org.apache.calcite.sql.SqlIdentifier;
 import org.apache.calcite.sql.parser.SqlAbstractParserImpl;
 import org.apache.calcite.util.SourceStringReader;
-import org.junit.Test;
-import org.junit.runner.RunWith;
-import org.junit.runners.Parameterized;
+import org.junit.jupiter.params.ParameterizedTest;
+import org.junit.jupiter.params.provider.Arguments;
+import org.junit.jupiter.params.provider.MethodSource;
 
 import java.util.List;
+import java.util.stream.Stream;
 
 import static java.util.Arrays.asList;
 import static java.util.Collections.singletonList;
 import static org.assertj.core.api.Assertions.assertThat;
+import static org.junit.jupiter.params.provider.Arguments.of;
 
 /** Tests for parsing a Table API specific SqlIdentifier. */
-@RunWith(Parameterized.class)
 public class TableApiIdentifierParsingTest {
 
     private static final String ANTHROPOS_IN_GREEK_IN_UNICODE =
             "#03B1#03BD#03B8#03C1#03C9#03C0#03BF#03C2";
     private static final String ANTHROPOS_IN_GREEK = "ανθρωπος";
 
-    @Parameterized.Parameters(name = "Parsing: {0}. Expected identifier: {1}")
-    public static Object[][] parameters() {
-        return new Object[][] {
-            new Object[] {"array", singletonList("array")},
-            new Object[] {"table", singletonList("table")},
-            new Object[] {"cat.db.array", asList("cat", "db", "array")},
-            new Object[] {"`cat.db`.table", asList("cat.db", "table")},
-            new Object[] {"db.table", asList("db", "table")},
-            new Object[] {"`ta``ble`", singletonList("ta`ble")},
-            new Object[] {"`c``at`.`d``b`.`ta``ble`", asList("c`at", "d`b", "ta`ble")},
-            new Object[] {
-                "db.U&\"" + ANTHROPOS_IN_GREEK_IN_UNICODE + "\" UESCAPE '#'",
-                asList("db", ANTHROPOS_IN_GREEK)
-            },
-            new Object[] {"db.ανθρωπος", asList("db", ANTHROPOS_IN_GREEK)}
-        };
+    public static Stream<Arguments> parameters() {
+        return Stream.of(
+                of("array", singletonList("array")),
+                of("table", singletonList("table")),
+                of("cat.db.array", asList("cat", "db", "array")),
+                of("`cat.db`.table", asList("cat.db", "table")),
+                of("db.table", asList("db", "table")),
+                of("`ta``ble`", singletonList("ta`ble")),
+                of("`c``at`.`d``b`.`ta``ble`", asList("c`at", "d`b", "ta`ble")),
+                of(
+                        "db.U&\"" + ANTHROPOS_IN_GREEK_IN_UNICODE + "\" UESCAPE '#'",
+                        asList("db", ANTHROPOS_IN_GREEK)),
+                of("db.ανθρωπος", asList("db", ANTHROPOS_IN_GREEK)));
     }
 
-    @Parameterized.Parameter public String stringIdentifier;
-
-    @Parameterized.Parameter(1)
-    public List<String> expectedParsedIdentifier;
-
-    @Test
-    public void testTableApiIdentifierParsing() throws ParseException {
+    @ParameterizedTest

Review Comment:
   ```suggestion
       @ParameterizedTest(name = "Parsing: {0}. Expected identifier: {1}")
   ```
   ?
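   
   An illustrative sketch with made-up data (not the PR's code): the `name` template is evaluated per invocation, with `{0}`, `{1}`, ... replaced by the arguments and `{index}` by the invocation index, mirroring the old `@Parameterized.Parameters(name = ...)` naming.
   
   ```java
   // Illustration only; class name and data are made up.
   import java.util.List;
   import java.util.stream.Stream;
   
   import org.junit.jupiter.params.ParameterizedTest;
   import org.junit.jupiter.params.provider.Arguments;
   import org.junit.jupiter.params.provider.MethodSource;
   
   import static java.util.Arrays.asList;
   import static org.assertj.core.api.Assertions.assertThat;
   
   class DisplayNameExampleTest {
   
       static Stream<Arguments> parameters() {
           return Stream.of(
                   Arguments.of("db.table", asList("db", "table")),
                   Arguments.of("cat.db.array", asList("cat", "db", "array")));
       }
   
       // Produces display names such as "Parsing: db.table. Expected identifier: [db, table]".
       @ParameterizedTest(name = "Parsing: {0}. Expected identifier: {1}")
       @MethodSource("parameters")
       void testParsing(String identifier, List<String> expected) {
           assertThat(asList(identifier.split("\\."))).isEqualTo(expected);
       }
   }
   ```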



##########
flink-table/flink-sql-parser/src/test/java/org/apache/flink/sql/parser/TableApiIdentifierParsingTest.java:
##########
@@ -26,49 +26,44 @@
 import org.apache.calcite.sql.SqlIdentifier;
 import org.apache.calcite.sql.parser.SqlAbstractParserImpl;
 import org.apache.calcite.util.SourceStringReader;
-import org.junit.Test;
-import org.junit.runner.RunWith;
-import org.junit.runners.Parameterized;
+import org.junit.jupiter.params.ParameterizedTest;
+import org.junit.jupiter.params.provider.Arguments;
+import org.junit.jupiter.params.provider.MethodSource;
 
 import java.util.List;
+import java.util.stream.Stream;
 
 import static java.util.Arrays.asList;
 import static java.util.Collections.singletonList;
 import static org.assertj.core.api.Assertions.assertThat;
+import static org.junit.jupiter.params.provider.Arguments.of;
 
 /** Tests for parsing a Table API specific SqlIdentifier. */
-@RunWith(Parameterized.class)
 public class TableApiIdentifierParsingTest {

Review Comment:
   can be package-private



##########
flink-table/flink-sql-parser/src/test/java/org/apache/flink/sql/parser/FlinkDDLDataTypeTest.java:
##########
@@ -334,27 +344,28 @@ private static TestItem createTestItem(Object... args) {
         if (args.length == 3) {
             testItem.withExpectedUnparsed((String) args[2]);
         }
-        return testItem;
+        return of(testItem);
     }
 
-    @Parameterized.Parameter public TestItem testItem;
-
-    @Test
-    public void testDataTypeParsing() {
+    @ParameterizedTest

Review Comment:
   ```suggestion
       @ParameterizedTest(name = "{index}: {0}")
   ```



##########
flink-table/flink-sql-parser/src/test/java/org/apache/flink/sql/parser/FlinkDDLDataTypeTest.java:
##########
@@ -334,27 +344,28 @@ private static TestItem createTestItem(Object... args) {
         if (args.length == 3) {
             testItem.withExpectedUnparsed((String) args[2]);
         }
-        return testItem;
+        return of(testItem);
     }
 
-    @Parameterized.Parameter public TestItem testItem;
-
-    @Test
-    public void testDataTypeParsing() {
+    @ParameterizedTest
+    @MethodSource("testData")
+    void testDataTypeParsing(TestItem testItem) {
         if (testItem.expectedType != null) {
             checkType(testItem.testExpr, testItem.expectedType);
         }
     }
 
-    @Test
-    public void testThrowsError() {
+    @ParameterizedTest
+    @MethodSource("testData")
+    void testThrowsError(TestItem testItem) {
         if (testItem.expectedError != null) {
             checkFails(testItem.testExpr, testItem.expectedError);
         }
     }
 
-    @Test
-    public void testDataTypeUnparsing() {
+    @ParameterizedTest

Review Comment:
   ```suggestion
       @ParameterizedTest(name = "{index}: {0}")
   ```



##########
flink-table/flink-sql-parser/src/test/java/org/apache/flink/sql/parser/FlinkDDLDataTypeTest.java:
##########
@@ -334,27 +344,28 @@ private static TestItem createTestItem(Object... args) {
         if (args.length == 3) {
             testItem.withExpectedUnparsed((String) args[2]);
         }
-        return testItem;
+        return of(testItem);
     }
 
-    @Parameterized.Parameter public TestItem testItem;
-
-    @Test
-    public void testDataTypeParsing() {
+    @ParameterizedTest
+    @MethodSource("testData")
+    void testDataTypeParsing(TestItem testItem) {
         if (testItem.expectedType != null) {
             checkType(testItem.testExpr, testItem.expectedType);
         }
     }
 
-    @Test
-    public void testThrowsError() {
+    @ParameterizedTest

Review Comment:
   ```suggestion
       @ParameterizedTest(name = "{index}: {0}")
   ```



##########
flink-table/flink-sql-parser/src/test/java/org/apache/flink/sql/parser/FlinkDDLDataTypeTest.java:
##########
@@ -52,195 +52,203 @@
 import org.apache.calcite.test.catalog.MockCatalogReaderSimple;
 import org.apache.calcite.util.SourceStringReader;
 import org.apache.calcite.util.Util;
-import org.junit.Test;
-import org.junit.runner.RunWith;
-import org.junit.runners.Parameterized;
+import org.junit.jupiter.params.ParameterizedTest;
+import org.junit.jupiter.params.provider.Arguments;
+import org.junit.jupiter.params.provider.MethodSource;
 
 import javax.annotation.Nullable;
 
 import java.util.Arrays;
 import java.util.Collections;
 import java.util.HashMap;
-import java.util.List;
 import java.util.Map;
+import java.util.stream.Stream;
 
 import static org.assertj.core.api.Assertions.assertThat;
+import static org.junit.jupiter.params.provider.Arguments.of;
 
 /** Tests for all the supported Flink DDL data types. */
-@RunWith(Parameterized.class)
 public class FlinkDDLDataTypeTest {
     private static final Fixture FIXTURE = new Fixture(TestFactory.INSTANCE.getTypeFactory());
     private static final String DDL_FORMAT =
             "create table t1 (\n" + "  f0 %s\n" + ") with (\n" + "  'k1' = 'v1'\n" + ")";
 
-    @Parameterized.Parameters(name = "{index}: {0}")
-    public static List<TestItem> testData() {
-        return Arrays.asList(
-                createTestItem("CHAR", nullable(FIXTURE.char1Type), "CHAR"),
-                createTestItem("CHAR NOT NULL", FIXTURE.char1Type, "CHAR NOT NULL"),
-                createTestItem("CHAR   NOT \t\nNULL", FIXTURE.char1Type, "CHAR NOT NULL"),
-                createTestItem("char not null", FIXTURE.char1Type, "CHAR NOT NULL"),
-                createTestItem("CHAR NULL", nullable(FIXTURE.char1Type), "CHAR"),
-                createTestItem("CHAR(33)", nullable(FIXTURE.char33Type), "CHAR(33)"),
-                createTestItem("VARCHAR", nullable(FIXTURE.varcharType), "VARCHAR"),
-                createTestItem("VARCHAR(33)", nullable(FIXTURE.varchar33Type), "VARCHAR(33)"),
-                createTestItem(
+    public static Stream<Arguments> testData() {

Review Comment:
   does this need to be public?



##########
flink-table/flink-sql-parser/src/test/java/org/apache/flink/sql/parser/CreateTableLikeTest.java:
##########
@@ -48,10 +47,8 @@
 /** Tests for parsing and validating {@link SqlTableLike} clause. */
 public class CreateTableLikeTest {

Review Comment:
   can be package-private



##########
flink-table/flink-sql-parser-hive/src/test/java/org/apache/flink/sql/parser/hive/FlinkHiveSqlParserImplTest.java:
##########
@@ -22,8 +22,8 @@
 
 import org.apache.calcite.sql.parser.SqlParserImplFactory;
 import org.apache.calcite.sql.parser.SqlParserTest;
-import org.junit.Ignore;
-import org.junit.Test;
+import org.junit.jupiter.api.Disabled;
+import org.junit.jupiter.api.Test;
 
 /** Tests for {@link FlinkHiveSqlParserImpl}. */
 public class FlinkHiveSqlParserImplTest extends SqlParserTest {

Review Comment:
   can be package-private



##########
flink-table/flink-sql-parser/src/test/java/org/apache/flink/sql/parser/FlinkDDLDataTypeTest.java:
##########
@@ -52,195 +52,203 @@
 import org.apache.calcite.test.catalog.MockCatalogReaderSimple;
 import org.apache.calcite.util.SourceStringReader;
 import org.apache.calcite.util.Util;
-import org.junit.Test;
-import org.junit.runner.RunWith;
-import org.junit.runners.Parameterized;
+import org.junit.jupiter.params.ParameterizedTest;
+import org.junit.jupiter.params.provider.Arguments;
+import org.junit.jupiter.params.provider.MethodSource;
 
 import javax.annotation.Nullable;
 
 import java.util.Arrays;
 import java.util.Collections;
 import java.util.HashMap;
-import java.util.List;
 import java.util.Map;
+import java.util.stream.Stream;
 
 import static org.assertj.core.api.Assertions.assertThat;
+import static org.junit.jupiter.params.provider.Arguments.of;
 
 /** Tests for all the supported Flink DDL data types. */
-@RunWith(Parameterized.class)
 public class FlinkDDLDataTypeTest {

Review Comment:
   can be package-private





[GitHub] [flink] snuyanzin commented on a diff in pull request #19467: [FLINK-25548][flink-sql-parser] Migrate tests to JUnit5

Posted by GitBox <gi...@apache.org>.
snuyanzin commented on code in PR #19467:
URL: https://github.com/apache/flink/pull/19467#discussion_r854370777


##########
flink-table/flink-sql-parser/src/test/java/org/apache/flink/sql/parser/CreateTableLikeTest.java:
##########
@@ -48,10 +47,8 @@
 /** Tests for parsing and validating {@link SqlTableLike} clause. */
 public class CreateTableLikeTest {

Review Comment:
   done



##########
flink-table/flink-sql-parser-hive/src/test/java/org/apache/flink/sql/parser/hive/FlinkHiveSqlParserImplTest.java:
##########
@@ -22,8 +22,8 @@
 
 import org.apache.calcite.sql.parser.SqlParserImplFactory;
 import org.apache.calcite.sql.parser.SqlParserTest;
-import org.junit.Ignore;
-import org.junit.Test;
+import org.junit.jupiter.api.Disabled;
+import org.junit.jupiter.api.Test;
 
 /** Tests for {@link FlinkHiveSqlParserImpl}. */
 public class FlinkHiveSqlParserImplTest extends SqlParserTest {

Review Comment:
   done





[GitHub] [flink] snuyanzin commented on pull request #19467: [FLINK-25548][flink-sql-parser] Migrate tests to JUnit5

Posted by GitBox <gi...@apache.org>.
snuyanzin commented on PR #19467:
URL: https://github.com/apache/flink/pull/19467#issuecomment-1104195895

   @flinkbot run azure

