Posted to issues@flink.apache.org by GitBox <gi...@apache.org> on 2022/04/27 05:01:17 UTC

[GitHub] [flink] BiGsuw opened a new pull request, #19590: [FLINK-27352][tests] [JUnit5 Migration] Module: flink-json

BiGsuw opened a new pull request, #19590:
URL: https://github.com/apache/flink/pull/19590

   What is the purpose of the change
   Update the flink-table/flink-table-common and flink-formats/flink-json modules to AssertJ and JUnit 5, following the [JUnit 5 Migration Guide](https://docs.google.com/document/d/1514Wa_aNB9bJUen4xm5uiuXOooOJTtXqS_Jqk9KJitU/edit).
   
   Brief change log
   Use JUnit 5 and AssertJ in tests instead of JUnit 4 and Hamcrest
   Skip upgrading the test cases of the deprecated classes (JsonRowSerializationSchemaTest, JsonRowDeserializationSchemaTest)
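   For readers unfamiliar with the pattern, a minimal before/after sketch of the kind of change this migration makes (a hypothetical test, not code from this PR):

   ```java
   // JUnit 5 + AssertJ form of a test. The JUnit 4 / Hamcrest equivalent would be a
   // public class with "@Test public void" methods and assertEquals(expected, actual).
   import org.junit.jupiter.api.Test;

   import static org.assertj.core.api.Assertions.assertThat;

   class ExampleMigratedTest {

       @Test
       void testAddition() {
           // JUnit 4 form: assertEquals(3, 1 + 2);
           assertThat(1 + 2).isEqualTo(3);
       }
   }
   ```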
   
   Verifying this change
   This change is a code cleanup without any test coverage.
   
   Does this pull request potentially affect one of the following parts:
   Dependencies (does it add or upgrade a dependency): (no)
   The public API, i.e., is any changed class annotated with @Public(Evolving): (no)
   The serializers: (no)
   The runtime per-record code paths (performance sensitive): (no)
   Anything that affects deployment or recovery: JobManager (and its components), Checkpointing, Kubernetes/Yarn, ZooKeeper: (no)
   The S3 file system connector: (no)
   Documentation
   Does this pull request introduce a new feature? (no)
   If yes, how is the feature documented? (not applicable)


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


[GitHub] [flink] zentol merged pull request #19590: [FLINK-27352][tests] [JUnit5 Migration] Module: flink-json

Posted by GitBox <gi...@apache.org>.
zentol merged PR #19590:
URL: https://github.com/apache/flink/pull/19590




[GitHub] [flink] zentol commented on a diff in pull request #19590: [FLINK-27352][tests] [JUnit5 Migration] Module: flink-json

Posted by GitBox <gi...@apache.org>.
zentol commented on code in PR #19590:
URL: https://github.com/apache/flink/pull/19590#discussion_r861704021


##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/ogg/OggJsonSerDeSchemaTest.java:
##########
@@ -242,7 +240,7 @@ private void testSerializationDeserialization(String resourceFile) throws Except
                         "{\"before\":{\"id\":111,\"name\":\"scooter\",\"description\":\"Big 2-wheel scooter \",\"weight\":5.18},\"after\":null,\"op_type\":\"D\"}",
                         "{\"before\":null,\"after\":{\"id\":111,\"name\":\"scooter\",\"description\":\"Big 2-wheel scooter \",\"weight\":5.17},\"op_type\":\"I\"}",
                         "{\"before\":{\"id\":111,\"name\":\"scooter\",\"description\":\"Big 2-wheel scooter \",\"weight\":5.17},\"after\":null,\"op_type\":\"D\"}");
-        assertEquals(expected, actual);
+        assertThat(actual).isEqualTo(expected);

Review Comment:
   containsExactly
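   (For illustration, the containsExactly form being suggested, shown on a hypothetical list; it checks both the elements and their order and prints the mismatching elements on failure:)

   ```java
   import java.util.Arrays;
   import java.util.List;

   import static org.assertj.core.api.Assertions.assertThat;

   class ContainsExactlyExample {
       void example() {
           List<String> actual = Arrays.asList("a", "b", "c");
           // Preferred over assertThat(actual).isEqualTo(expected) for lists:
           assertThat(actual).containsExactly("a", "b", "c");
       }
   }
   ```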



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/ogg/OggJsonSerDeSchemaTest.java:
##########
@@ -119,18 +116,19 @@ public void testDeserializationWithMetadata(String resourceFile) throws Exceptio
 
         final SimpleCollector collector = new SimpleCollector();
         deserializationSchema.deserialize(firstLine.getBytes(StandardCharsets.UTF_8), collector);
-        assertEquals(1, collector.list.size());
+        assertThat(collector.list.size()).isEqualTo(1);

Review Comment:
   hasSize
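   (For illustration, the hasSize form being suggested, on a hypothetical list; asserting on the collection rather than on list.size() makes AssertJ print the list contents when the assertion fails:)

   ```java
   import java.util.Collections;
   import java.util.List;

   import static org.assertj.core.api.Assertions.assertThat;

   class HasSizeExample {
       void example() {
           List<String> list = Collections.singletonList("row");
           // Preferred over assertThat(list.size()).isEqualTo(1):
           assertThat(list).hasSize(1);
       }
   }
   ```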



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/debezium/DebeziumJsonFormatFactoryTest.java:
##########
@@ -96,24 +93,39 @@ public void testSeDeSchema() {
                 sinkMock.valueFormat.createRuntimeEncoder(
                         new SinkRuntimeProviderContext(false), PHYSICAL_DATA_TYPE);
 
-        assertEquals(expectedSer, actualSer);
+        assertThat(actualSer).isEqualTo(expectedSer);
     }
 
     @Test
-    public void testInvalidIgnoreParseError() {
-        thrown.expect(
-                containsCause(
-                        new IllegalArgumentException(
-                                "Unrecognized option for boolean: abc. Expected either true or false(case insensitive)")));
-
+    void testInvalidIgnoreParseError() {
         final Map<String, String> options =
                 getModifiedOptions(opts -> opts.put("debezium-json.ignore-parse-errors", "abc"));
 
-        createTableSource(SCHEMA, options);
+        assertThatThrownBy(() -> createTableSource(SCHEMA, options))
+                .satisfies(
+                        anyCauseMatches(
+                                IllegalArgumentException.class,
+                                "Unrecognized option for boolean: abc. "
+                                        + "Expected either true or false(case insensitive)"));
+    }
+
+    @Test
+    void testInvalidOptionForTimestampFormat() {

Review Comment:
   please don't change the order of tests; it makes the review unnecessarily difficult



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/ogg/OggJsonFileSystemITCase.java:
##########
@@ -36,9 +36,14 @@
 import java.util.stream.Collectors;
 
 import static java.lang.String.format;
+import static org.assertj.core.api.Assertions.assertThat;
 
 /** Test Filesystem connector with OGG Json. */
-public class OggJsonFileSystemITCase extends StreamingTestBase {
+class OggJsonFileSystemITCase extends StreamingTestBase {
+
+    @TempDir private java.nio.file.Path tempSourceDir;
+
+    @TempDir private java.nio.file.Path tempSinkDir;

Review Comment:
   do not use qualified imports
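   (A sketch of the import-based form being asked for; the field names are taken from the diff above, and the fields are kept package-private:)

   ```java
   import org.junit.jupiter.api.io.TempDir;

   import java.nio.file.Path;

   class OggJsonFileSystemITCaseSketch {

       // With java.nio.file.Path imported, the field types no longer need to be
       // fully qualified in the declarations:
       @TempDir Path tempSourceDir;

       @TempDir Path tempSinkDir;
   }
   ```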



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/canal/CanalJsonSerDeSchemaTest.java:
##########
@@ -82,55 +76,55 @@ public void testFilteringTables() throws Exception {
     }
 
     @Test
-    public void testDeserializeNullRow() throws Exception {
+    void testDeserializeNullRow() throws Exception {
         final List<ReadableMetadata> requestedMetadata = Arrays.asList(ReadableMetadata.values());
         final CanalJsonDeserializationSchema deserializationSchema =
                 createCanalJsonDeserializationSchema(null, null, requestedMetadata);
         final SimpleCollector collector = new SimpleCollector();
 
         deserializationSchema.deserialize(null, collector);
         deserializationSchema.deserialize(new byte[0], collector);
-        assertEquals(0, collector.list.size());
+        assertThat(0).isEqualTo(collector.list.size());

Review Comment:
   hasSize, invert argument order



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/canal/CanalJsonSerDeSchemaTest.java:
##########
@@ -271,7 +265,7 @@ private void testDeserializationWithMetadata(
         final SimpleCollector collector = new SimpleCollector();
 
         deserializationSchema.deserialize(firstLine.getBytes(StandardCharsets.UTF_8), collector);
-        assertEquals(9, collector.list.size());
+        assertThat(collector.list.size()).isEqualTo(9);

Review Comment:
   hasSize



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/ogg/OggJsonSerDeSchemaTest.java:
##########
@@ -204,7 +202,7 @@ private void testSerializationDeserialization(String resourceFile) throws Except
                         "-D(111,scooter,Big 2-wheel scooter ,5.17)");
         List<String> actual =
                 collector.list.stream().map(Object::toString).collect(Collectors.toList());
-        assertEquals(expected, actual);
+        assertThat(actual).isEqualTo(expected);

Review Comment:
   containsExactly



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/ogg/OggJsonFileSystemITCase.java:
##########
@@ -36,9 +36,14 @@
 import java.util.stream.Collectors;
 
 import static java.lang.String.format;
+import static org.assertj.core.api.Assertions.assertThat;
 
 /** Test Filesystem connector with OGG Json. */
-public class OggJsonFileSystemITCase extends StreamingTestBase {
+class OggJsonFileSystemITCase extends StreamingTestBase {
+
+    @TempDir private java.nio.file.Path tempSourceDir;

Review Comment:
   do not use qualified imports



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/maxwell/MaxwellJsonSerDerTest.java:
##########
@@ -85,23 +83,23 @@ public void testDeserializationWithMetadata() throws Exception {
                         TimestampFormat.ISO_8601);
         final SimpleCollector collector = new SimpleCollector();
         deserializationSchema.deserialize(firstLine.getBytes(StandardCharsets.UTF_8), collector);
-        assertEquals(1, collector.list.size());
+        assertThat(collector.list.size()).isEqualTo(1);
         Consumer<RowData> consumer =

Review Comment:
   this can be re-written easily to .satisfies(...)



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/debezium/DebeziumJsonSerDeSchemaTest.java:
##########
@@ -318,7 +307,7 @@ private void testDeserializationWithMetadata(
         final SimpleCollector collector = new SimpleCollector();
         deserializationSchema.deserialize(firstLine.getBytes(StandardCharsets.UTF_8), collector);
 
-        assertEquals(1, collector.list.size());
+        assertThat(collector.list.size()).isEqualTo(1);

Review Comment:
   hasSize



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/debezium/DebeziumJsonFormatFactoryTest.java:
##########
@@ -131,49 +143,37 @@ public void testSchemaIncludeOption() {
         DeserializationSchema<RowData> actualDeser =
                 scanSourceMock.valueFormat.createRuntimeDecoder(
                         ScanRuntimeProviderContext.INSTANCE, PHYSICAL_DATA_TYPE);
-        assertEquals(expectedDeser, actualDeser);
-
-        try {
-            final DynamicTableSink actualSink = createTableSink(SCHEMA, options);
-            TestDynamicTableFactory.DynamicTableSinkMock sinkMock =
-                    (TestDynamicTableFactory.DynamicTableSinkMock) actualSink;
-            // should fail
-            sinkMock.valueFormat.createRuntimeEncoder(
-                    new SinkRuntimeProviderContext(false), PHYSICAL_DATA_TYPE);
-            fail();
-        } catch (Exception e) {
-            assertEquals(
-                    e.getCause().getCause().getMessage(),
-                    "Debezium JSON serialization doesn't support "
-                            + "'debezium-json.schema-include' option been set to true.");
-        }
-    }
-
-    @Test
-    public void testInvalidOptionForTimestampFormat() {
-        final Map<String, String> tableOptions =
-                getModifiedOptions(
-                        opts -> opts.put("debezium-json.timestamp-format.standard", "test"));
-
-        thrown.expect(ValidationException.class);
-        thrown.expect(
-                containsCause(
-                        new ValidationException(
-                                "Unsupported value 'test' for timestamp-format.standard. Supported values are [SQL, ISO-8601].")));
-        createTableSource(SCHEMA, tableOptions);
+        assertThat(actualDeser).isEqualTo(expectedDeser);
+
+        assertThatThrownBy(
+                        () -> {
+                            final DynamicTableSink actualSink = createTableSink(SCHEMA, options);
+                            TestDynamicTableFactory.DynamicTableSinkMock sinkMock =
+                                    (TestDynamicTableFactory.DynamicTableSinkMock) actualSink;
+                            // should fail
+                            sinkMock.valueFormat.createRuntimeEncoder(
+                                    new SinkRuntimeProviderContext(false), PHYSICAL_DATA_TYPE);
+                            fail();

Review Comment:
   ```suggestion
   ```



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/maxwell/MaxwellJsonSerDerTest.java:
##########
@@ -85,23 +83,23 @@ public void testDeserializationWithMetadata() throws Exception {
                         TimestampFormat.ISO_8601);
         final SimpleCollector collector = new SimpleCollector();
         deserializationSchema.deserialize(firstLine.getBytes(StandardCharsets.UTF_8), collector);
-        assertEquals(1, collector.list.size());
+        assertThat(collector.list.size()).isEqualTo(1);

Review Comment:
   hasSize



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/JsonRowDataSerDeSchemaTest.java:
##########
@@ -330,12 +328,12 @@ public void testSerDeMultiRows() throws Exception {
             byte[] serializedJson = objectMapper.writeValueAsBytes(root);
             RowData rowData = deserializationSchema.deserialize(serializedJson);
             byte[] actual = serializationSchema.serialize(rowData);
-            assertEquals(new String(serializedJson), new String(actual));
+            assertThat(new String(actual)).isEqualTo(new String(serializedJson));

Review Comment:
   see above



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/debezium/DebeziumJsonFileSystemITCase.java:
##########
@@ -36,9 +36,14 @@
 import java.util.stream.Collectors;
 
 import static java.lang.String.format;
+import static org.assertj.core.api.Assertions.assertThat;
 
 /** Test Filesystem connector with DebeziumJson. */
-public class DebeziumJsonFileSystemITCase extends StreamingTestBase {
+class DebeziumJsonFileSystemITCase extends StreamingTestBase {
+
+    @TempDir private java.nio.file.Path tempSourceDir;
+
+    @TempDir private java.nio.file.Path tempSinkDir;

Review Comment:
   no qualified imports



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/debezium/DebeziumJsonFormatFactoryTest.java:
##########
@@ -131,49 +143,37 @@ public void testSchemaIncludeOption() {
         DeserializationSchema<RowData> actualDeser =
                 scanSourceMock.valueFormat.createRuntimeDecoder(
                         ScanRuntimeProviderContext.INSTANCE, PHYSICAL_DATA_TYPE);
-        assertEquals(expectedDeser, actualDeser);
-
-        try {
-            final DynamicTableSink actualSink = createTableSink(SCHEMA, options);
-            TestDynamicTableFactory.DynamicTableSinkMock sinkMock =
-                    (TestDynamicTableFactory.DynamicTableSinkMock) actualSink;
-            // should fail
-            sinkMock.valueFormat.createRuntimeEncoder(
-                    new SinkRuntimeProviderContext(false), PHYSICAL_DATA_TYPE);
-            fail();
-        } catch (Exception e) {
-            assertEquals(
-                    e.getCause().getCause().getMessage(),
-                    "Debezium JSON serialization doesn't support "
-                            + "'debezium-json.schema-include' option been set to true.");
-        }
-    }
-
-    @Test
-    public void testInvalidOptionForTimestampFormat() {
-        final Map<String, String> tableOptions =
-                getModifiedOptions(
-                        opts -> opts.put("debezium-json.timestamp-format.standard", "test"));
-
-        thrown.expect(ValidationException.class);
-        thrown.expect(
-                containsCause(
-                        new ValidationException(
-                                "Unsupported value 'test' for timestamp-format.standard. Supported values are [SQL, ISO-8601].")));
-        createTableSource(SCHEMA, tableOptions);
+        assertThat(actualDeser).isEqualTo(expectedDeser);
+
+        assertThatThrownBy(
+                        () -> {
+                            final DynamicTableSink actualSink = createTableSink(SCHEMA, options);
+                            TestDynamicTableFactory.DynamicTableSinkMock sinkMock =
+                                    (TestDynamicTableFactory.DynamicTableSinkMock) actualSink;

Review Comment:
   move these lines out; then the next comment line is unnecessary
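   (For illustration, the restructuring being asked for would look roughly as follows; the surrounding test class, imports, and the expected message are assumed to stay as in the diff above:)

   ```java
   // Create the sink and the mock outside the lambda; only the call that is expected
   // to fail stays inside assertThatThrownBy, so the "// should fail" comment and the
   // explicit fail() become redundant.
   final DynamicTableSink actualSink = createTableSink(SCHEMA, options);
   final TestDynamicTableFactory.DynamicTableSinkMock sinkMock =
           (TestDynamicTableFactory.DynamicTableSinkMock) actualSink;

   assertThatThrownBy(
                   () ->
                           sinkMock.valueFormat.createRuntimeEncoder(
                                   new SinkRuntimeProviderContext(false), PHYSICAL_DATA_TYPE))
           .satisfies(
                   anyCauseMatches(
                           "Debezium JSON serialization doesn't support "
                                   + "'debezium-json.schema-include' option been set to true."));
   ```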



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/JsonRowDataSerDeSchemaTest.java:
##########
@@ -202,15 +200,15 @@ public void testSerDe() throws Exception {
                         true);
 
         byte[] actualBytes = serializationSchema.serialize(rowData);
-        assertEquals(new String(serializedJson), new String(actualBytes));
+        assertThat(new String(actualBytes)).isEqualTo(new String(serializedJson));

Review Comment:
   could compare the byte arrays instead I suppose
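   (A one-line sketch of the byte-array comparison meant here, assuming the actualBytes and serializedJson variables from the diff above; AssertJ compares arrays element by element, so the String round-trip is unnecessary:)

   ```java
   assertThat(actualBytes).isEqualTo(serializedJson);
   ```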



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/debezium/DebeziumJsonSerDeSchemaTest.java:
##########
@@ -119,63 +109,62 @@ public void testTombstoneMessages() throws Exception {
         SimpleCollector collector = new SimpleCollector();
         deserializationSchema.deserialize(null, collector);
         deserializationSchema.deserialize(new byte[] {}, collector);
-        assertTrue(collector.list.isEmpty());
+        assertThat(collector.list).isNullOrEmpty();

Review Comment:
   the list is never null; why do we allow it?



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/JsonBatchFileSystemITCase.java:
##########
@@ -36,10 +36,14 @@
 import java.util.Arrays;
 import java.util.List;
 
+import static org.assertj.core.api.Assertions.assertThat;
+
 /** ITCase to test json format for {@link JsonFormatFactory}. */
-public class JsonBatchFileSystemITCase extends BatchFileSystemITCaseBase {
+class JsonBatchFileSystemITCase extends BatchFileSystemITCaseBase {

Review Comment:
   given that the BatchFileSystemITCaseBase hasn't been migrated to JUnit 5 (and thus a lot of things from it won't work), does this test actually behave as expected?



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/debezium/DebeziumJsonSerDeSchemaTest.java:
##########
@@ -318,7 +307,7 @@ private void testDeserializationWithMetadata(
         final SimpleCollector collector = new SimpleCollector();
         deserializationSchema.deserialize(firstLine.getBytes(StandardCharsets.UTF_8), collector);
 
-        assertEquals(1, collector.list.size());
+        assertThat(collector.list.size()).isEqualTo(1);
         testConsumer.accept(collector.list.get(0));

Review Comment:
   rewrite to .satisfies()



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/canal/CanalJsonSerDeSchemaTest.java:
##########
@@ -271,7 +265,7 @@ private void testDeserializationWithMetadata(
         final SimpleCollector collector = new SimpleCollector();
 
         deserializationSchema.deserialize(firstLine.getBytes(StandardCharsets.UTF_8), collector);
-        assertEquals(9, collector.list.size());
+        assertThat(collector.list.size()).isEqualTo(9);
         testConsumer.accept(collector.list.get(0));

Review Comment:
   .satisfies()



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/JsonRowDataSerDeSchemaTest.java:
##########
@@ -645,28 +637,22 @@ public void testSerializationWithTypesMismatch() {
                         "null",
                         true);
         String errorMessage = "Fail to serialize at field: f1.";
-        try {
-            serializationSchema.serialize(genericRowData);
-            fail("expecting exception message: " + errorMessage);
-        } catch (Throwable t) {
-            assertThat(t, FlinkMatchers.containsMessage(errorMessage));
-        }
+
+        assertThatThrownBy(() -> serializationSchema.serialize(genericRowData))
+                .satisfies(anyCauseMatches(RuntimeException.class, errorMessage));
     }
 
     @Test
-    public void testDeserializationWithTypesMismatch() {
+    void testDeserializationWithTypesMismatch() {
         RowType rowType = (RowType) ROW(FIELD("f0", STRING()), FIELD("f1", INT())).getLogicalType();
         String json = "{\"f0\":\"abc\", \"f1\": \"abc\"}";
         JsonRowDataDeserializationSchema deserializationSchema =
                 new JsonRowDataDeserializationSchema(
                         rowType, InternalTypeInfo.of(rowType), false, false, TimestampFormat.SQL);
         String errorMessage = "Fail to deserialize at field: f1.";
-        try {
-            deserializationSchema.deserialize(json.getBytes());
-            fail("expecting exception message: " + errorMessage);
-        } catch (Throwable t) {
-            assertThat(t, FlinkMatchers.containsMessage(errorMessage));
-        }
+
+        assertThatThrownBy(() -> deserializationSchema.deserialize(json.getBytes()))
+                .satisfies(anyCauseMatches(RuntimeException.class, errorMessage));

Review Comment:
   ```suggestion
                   .satisfies(anyCauseMatches(errorMessage));
   ```



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/JsonRowDataSerDeSchemaTest.java:
##########
@@ -431,42 +429,41 @@ public void testDeserializationMissingField() throws Exception {
 
         Row expected = new Row(1);
         Row actual = convertToExternal(deserializationSchema.deserialize(serializedJson), dataType);
-        assertEquals(expected, actual);
+        assertThat(actual).isEqualTo(expected);
 
         // fail on missing field
         deserializationSchema =
                 new JsonRowDataDeserializationSchema(
                         schema, InternalTypeInfo.of(schema), true, false, TimestampFormat.ISO_8601);
 
         String errorMessage = "Failed to deserialize JSON '{\"id\":123123123}'.";
-        try {
-            deserializationSchema.deserialize(serializedJson);
-            fail("expecting exception message: " + errorMessage);
-        } catch (Throwable t) {
-            assertEquals(errorMessage, t.getMessage());
-        }
+
+        JsonRowDataDeserializationSchema finalDeserializationSchema = deserializationSchema;
+        assertThatThrownBy(() -> finalDeserializationSchema.deserialize(serializedJson))
+                .hasMessageContaining(errorMessage);
 
         // ignore on parse error
         deserializationSchema =
                 new JsonRowDataDeserializationSchema(
                         schema, InternalTypeInfo.of(schema), false, true, TimestampFormat.ISO_8601);
         actual = convertToExternal(deserializationSchema.deserialize(serializedJson), dataType);
-        assertEquals(expected, actual);
+        assertThat(actual).isEqualTo(expected);
 
         errorMessage =
                 "JSON format doesn't support failOnMissingField and ignoreParseErrors are both enabled.";
-        try {
-            // failOnMissingField and ignoreParseErrors both enabled
-            new JsonRowDataDeserializationSchema(
-                    schema, InternalTypeInfo.of(schema), true, true, TimestampFormat.ISO_8601);
-            Assert.fail("expecting exception message: " + errorMessage);
-        } catch (Throwable t) {
-            assertEquals(errorMessage, t.getMessage());
-        }
+        assertThatThrownBy(
+                        () ->
+                                new JsonRowDataDeserializationSchema(
+                                        schema,
+                                        InternalTypeInfo.of(schema),
+                                        true,
+                                        true,
+                                        TimestampFormat.ISO_8601))
+                .hasMessageContaining(errorMessage);

Review Comment:
   ```suggestion
                   .hasMessage(errorMessage);
   ```



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/JsonBatchFileSystemITCase.java:
##########
@@ -97,11 +117,11 @@ public void bigDataTest() throws IOException {
         }
         expected.sort(String::compareTo);
 
-        Assert.assertEquals(expected, elements);
+        assertThat(elements).isEqualTo(expected);
     }
 
     private static File generateTestData(int numRecords) throws IOException {
-        File tempDir = TEMPORARY_FOLDER.newFolder();
+        File tempDir = temporaryFolder;
 
         File root = new File(tempDir, "id=0");

Review Comment:
   ```suggestion
           File root = new File(temporaryFolder, "id=0");
   ```
   inline the variable



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/JsonBatchFileSystemITCase.java:
##########
@@ -88,7 +108,7 @@ public void bigDataTest() throws IOException {
         TableResult result = tEnv().executeSql("select * from bigdata_source");
         List<String> elements = new ArrayList<>();
         result.collect().forEachRemaining(r -> elements.add((String) r.getField(1)));
-        Assert.assertEquals(numRecords, elements.size());
+        assertThat(elements.size()).isEqualTo(numRecords);

Review Comment:
   hasSize



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/JsonBatchFileSystemITCase.java:
##########
@@ -50,11 +54,27 @@ public String[] formatProperties() {
     }
 
     @Test
-    public void testParseError() throws Exception {
-        String path = new URI(resultPath()).getPath();
+    void testParseError(@TempDir java.nio.file.Path temporaryFolder) throws Exception {
+        String sql =
+                String.format(
+                        "create table nonPartitionedTable ("
+                                + "x string,"
+                                + "y int,"
+                                + "a int,"
+                                + "b bigint"
+                                + ") with ("
+                                + "'connector' = 'filesystem',"
+                                + "'path' = '%s',"
+                                + "%s)",
+                        "file://" + temporaryFolder.toString(),
+                        String.join(",\n", formatProperties()));
+        tEnv().executeSql(sql);
+
+        String path = new URI(temporaryFolder.toString()).getPath();
         new File(path).mkdirs();
         File file = new File(path, "temp_file");
         file.createNewFile();
+

Review Comment:
   revert



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/JsonBatchFileSystemITCase.java:
##########
@@ -50,11 +54,27 @@ public String[] formatProperties() {
     }
 
     @Test
-    public void testParseError() throws Exception {
-        String path = new URI(resultPath()).getPath();
+    void testParseError(@TempDir java.nio.file.Path temporaryFolder) throws Exception {
+        String sql =

Review Comment:
   where does this come from?



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/JsonBatchFileSystemITCase.java:
##########
@@ -50,11 +54,27 @@ public String[] formatProperties() {
     }
 
     @Test
-    public void testParseError() throws Exception {
-        String path = new URI(resultPath()).getPath();
+    void testParseError(@TempDir java.nio.file.Path temporaryFolder) throws Exception {

Review Comment:
   no qualified imports



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/JsonRowDataSerDeSchemaTest.java:
##########
@@ -431,42 +429,41 @@ public void testDeserializationMissingField() throws Exception {
 
         Row expected = new Row(1);
         Row actual = convertToExternal(deserializationSchema.deserialize(serializedJson), dataType);
-        assertEquals(expected, actual);
+        assertThat(actual).isEqualTo(expected);
 
         // fail on missing field
         deserializationSchema =
                 new JsonRowDataDeserializationSchema(
                         schema, InternalTypeInfo.of(schema), true, false, TimestampFormat.ISO_8601);
 
         String errorMessage = "Failed to deserialize JSON '{\"id\":123123123}'.";
-        try {
-            deserializationSchema.deserialize(serializedJson);
-            fail("expecting exception message: " + errorMessage);
-        } catch (Throwable t) {
-            assertEquals(errorMessage, t.getMessage());
-        }
+
+        JsonRowDataDeserializationSchema finalDeserializationSchema = deserializationSchema;
+        assertThatThrownBy(() -> finalDeserializationSchema.deserialize(serializedJson))
+                .hasMessageContaining(errorMessage);

Review Comment:
   ```suggestion
                   .hasMessage(errorMessage);
   ```



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/JsonRowDataSerDeSchemaTest.java:
##########
@@ -310,7 +308,7 @@ public void testSerDeMultiRows() throws Exception {
             byte[] serializedJson = objectMapper.writeValueAsBytes(root);
             RowData rowData = deserializationSchema.deserialize(serializedJson);
             byte[] actual = serializationSchema.serialize(rowData);
-            assertEquals(new String(serializedJson), new String(actual));
+            assertThat(new String(actual)).isEqualTo(new String(serializedJson));

Review Comment:
   see above





[GitHub] [flink] BiGsuw commented on a diff in pull request #19590: [FLINK-27352][tests] [JUnit5 Migration] Module: flink-json

Posted by GitBox <gi...@apache.org>.
BiGsuw commented on code in PR #19590:
URL: https://github.com/apache/flink/pull/19590#discussion_r861966383


##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/JsonBatchFileSystemITCase.java:
##########
@@ -36,10 +36,14 @@
 import java.util.Arrays;
 import java.util.List;
 
+import static org.assertj.core.api.Assertions.assertThat;
+
 /** ITCase to test json format for {@link JsonFormatFactory}. */
-public class JsonBatchFileSystemITCase extends BatchFileSystemITCaseBase {
+class JsonBatchFileSystemITCase extends BatchFileSystemITCaseBase {

Review Comment:
   Yes, it actually behaves as expected:
   <img width="1459" alt="image" src="https://user-images.githubusercontent.com/20968409/165985869-4a2ff329-176b-44c4-96e5-10c658a1ae2f.png">
   





[GitHub] [flink] BiGsuw commented on pull request #19590: [FLINK-27352][tests] [JUnit5 Migration] Module: flink-json

Posted by GitBox <gi...@apache.org>.
BiGsuw commented on PR #19590:
URL: https://github.com/apache/flink/pull/19590#issuecomment-1113559303

   @zentol I have revised it according to your suggestions; please review it again. Thanks!




[GitHub] [flink] BiGsuw commented on pull request #19590: [FLINK-27352][tests] [JUnit5 Migration] Module: flink-json

Posted by GitBox <gi...@apache.org>.
BiGsuw commented on PR #19590:
URL: https://github.com/apache/flink/pull/19590#issuecomment-1113010844

   Could you help check it? Thanks. @zentol




[GitHub] [flink] BiGsuw commented on a diff in pull request #19590: [FLINK-27352][tests] [JUnit5 Migration] Module: flink-json

Posted by GitBox <gi...@apache.org>.
BiGsuw commented on code in PR #19590:
URL: https://github.com/apache/flink/pull/19590#discussion_r862741340


##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/JsonBatchFileSystemITCase.java:
##########
@@ -50,11 +54,27 @@ public String[] formatProperties() {
     }
 
     @Test
-    public void testParseError() throws Exception {
-        String path = new URI(resultPath()).getPath();
+    void testParseError(@TempDir java.nio.file.Path temporaryFolder) throws Exception {
+        String sql =

Review Comment:
   OK, I'm changing it back to JUnit 4 now. @RegisterExtension and @TempDir won't work there because they need JUnit 5, so I'm going back to @Rule.





[GitHub] [flink] BiGsuw commented on a diff in pull request #19590: [FLINK-27352][tests] [JUnit5 Migration] Module: flink-json

Posted by GitBox <gi...@apache.org>.
BiGsuw commented on code in PR #19590:
URL: https://github.com/apache/flink/pull/19590#discussion_r861960703


##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/JsonBatchFileSystemITCase.java:
##########
@@ -50,11 +54,27 @@ public String[] formatProperties() {
     }
 
     @Test
-    public void testParseError() throws Exception {
-        String path = new URI(resultPath()).getPath();
+    void testParseError(@TempDir java.nio.file.Path temporaryFolder) throws Exception {
+        String sql =

Review Comment:
   It comes from FileSystemITCaseBase.open.
   That is a base class that I didn't want to change on my own, so I only copied over the fragments needed for the test to run.





[GitHub] [flink] zentol commented on a diff in pull request #19590: [FLINK-27352][tests] [JUnit5 Migration] Module: flink-json

Posted by GitBox <gi...@apache.org>.
zentol commented on code in PR #19590:
URL: https://github.com/apache/flink/pull/19590#discussion_r862824854


##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/maxwell/MaxwellJsonSerDerTest.java:
##########
@@ -85,23 +83,23 @@ public void testDeserializationWithMetadata() throws Exception {
                         TimestampFormat.ISO_8601);
         final SimpleCollector collector = new SimpleCollector();
         deserializationSchema.deserialize(firstLine.getBytes(StandardCharsets.UTF_8), collector);
-        assertEquals(1, collector.list.size());
+        assertThat(collector.list).hasSize(1);
         Consumer<RowData> consumer =
                 row -> {

Review Comment:
   ```suggestion
           assertThat(collector.list.get(0))
               .satisfies(row -> {
   ```



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/maxwell/MaxwellJsonSerDerTest.java:
##########
@@ -85,23 +83,23 @@ public void testDeserializationWithMetadata() throws Exception {
                         TimestampFormat.ISO_8601);
         final SimpleCollector collector = new SimpleCollector();
         deserializationSchema.deserialize(firstLine.getBytes(StandardCharsets.UTF_8), collector);
-        assertEquals(1, collector.list.size());
+        assertThat(collector.list).hasSize(1);
         Consumer<RowData> consumer =
                 row -> {
-                    assertThat(row.getInt(0), equalTo(101));
-                    assertThat(row.getString(1).toString(), equalTo("scooter"));
-                    assertThat(row.getString(2).toString(), equalTo("Small 2-wheel scooter"));
-                    assertThat(row.getFloat(3), equalTo(3.14f));
-                    assertThat(row.getString(4).toString(), equalTo("test"));
-                    assertThat(row.getString(5).toString(), equalTo("product"));
-                    assertThat(row.getArray(6).getString(0).toString(), equalTo("id"));
-                    assertThat(row.getTimestamp(7, 3).getMillisecond(), equalTo(1596684883000L));
+                    assertThat(row.getInt(0)).isEqualTo(101);
+                    assertThat(row.getString(1).toString()).isEqualTo("scooter");
+                    assertThat(row.getString(2).toString()).isEqualTo("Small 2-wheel scooter");
+                    assertThat(row.getFloat(3)).isEqualTo(3.14f);
+                    assertThat(row.getString(4).toString()).isEqualTo("test");
+                    assertThat(row.getString(5).toString()).isEqualTo("product");
+                    assertThat(row.getArray(6).getString(0).toString()).isEqualTo("id");
+                    assertThat(row.getTimestamp(7, 3).getMillisecond()).isEqualTo(1596684883000L);
                 };
-        consumer.accept(collector.list.get(0));
+        assertThat(collector.list.get(0)).satisfies(consumer);

Review Comment:
   ```suggestion
   ```





[GitHub] [flink] zentol commented on pull request #19590: [FLINK-27352][tests] [JUnit5 Migration] Module: flink-json

Posted by GitBox <gi...@apache.org>.
zentol commented on PR #19590:
URL: https://github.com/apache/flink/pull/19590#issuecomment-1114593729

   @BiGsuw What is your jira username btw?




[GitHub] [flink] zentol commented on a diff in pull request #19590: [FLINK-27352][tests] [JUnit5 Migration] Module: flink-json

Posted by GitBox <gi...@apache.org>.
zentol commented on code in PR #19590:
URL: https://github.com/apache/flink/pull/19590#discussion_r862612817


##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/JsonBatchFileSystemITCase.java:
##########
@@ -36,10 +36,14 @@
 import java.util.Arrays;
 import java.util.List;
 
+import static org.assertj.core.api.Assertions.assertThat;
+
 /** ITCase to test json format for {@link JsonFormatFactory}. */
-public class JsonBatchFileSystemITCase extends BatchFileSystemITCaseBase {
+class JsonBatchFileSystemITCase extends BatchFileSystemITCaseBase {

Review Comment:
   "Running" and "behaving as expected" are 2 different things. For example. the BatchAbstractTestBase sets up a mini cluster against which the test should be run. Is this being honored?





[GitHub] [flink] zentol commented on a diff in pull request #19590: [FLINK-27352][tests] [JUnit5 Migration] Module: flink-json

Posted by GitBox <gi...@apache.org>.
zentol commented on code in PR #19590:
URL: https://github.com/apache/flink/pull/19590#discussion_r862821848


##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/JsonBatchFileSystemITCase.java:
##########
@@ -50,11 +54,27 @@ public String[] formatProperties() {
     }
 
     @Test
-    public void testParseError() throws Exception {
-        String path = new URI(resultPath()).getPath();
+    void testParseError(@TempDir java.nio.file.Path temporaryFolder) throws Exception {
+        String sql =

Review Comment:
   thanks!





[GitHub] [flink] zentol commented on a diff in pull request #19590: [FLINK-27352][tests] [JUnit5 Migration] Module: flink-json

Posted by GitBox <gi...@apache.org>.
zentol commented on code in PR #19590:
URL: https://github.com/apache/flink/pull/19590#discussion_r862613016


##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/JsonBatchFileSystemITCase.java:
##########
@@ -20,26 +20,31 @@
 
 import org.apache.flink.table.api.TableResult;
 import org.apache.flink.table.planner.runtime.batch.sql.BatchFileSystemITCaseBase;
-import org.apache.flink.table.utils.LegacyRowResource;
+import org.apache.flink.table.utils.LegacyRowExtension;
 import org.apache.flink.types.Row;
 import org.apache.flink.util.FileUtils;
 
-import org.junit.Assert;
-import org.junit.Rule;
-import org.junit.Test;
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.extension.RegisterExtension;
+import org.junit.jupiter.api.io.TempDir;
 
 import java.io.File;
 import java.io.IOException;
 import java.io.PrintWriter;
 import java.net.URI;
+import java.nio.file.Path;
 import java.util.ArrayList;
 import java.util.Arrays;
 import java.util.List;
 
+import static org.assertj.core.api.Assertions.assertThat;
+
 /** ITCase to test json format for {@link JsonFormatFactory}. */
-public class JsonBatchFileSystemITCase extends BatchFileSystemITCaseBase {
+class JsonBatchFileSystemITCase extends BatchFileSystemITCaseBase {
+
+    @RegisterExtension LegacyRowExtension usesLegacyRows = LegacyRowExtension.INSTANCE;

Review Comment:
   ```suggestion
       @RegisterExtension private LegacyRowExtension usesLegacyRows = LegacyRowExtension.INSTANCE;
   ```



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/ogg/OggJsonSerDeSchemaTest.java:
##########
@@ -242,7 +240,7 @@ private void testSerializationDeserialization(String resourceFile) throws Except
                         "{\"before\":{\"id\":111,\"name\":\"scooter\",\"description\":\"Big 2-wheel scooter \",\"weight\":5.18},\"after\":null,\"op_type\":\"D\"}",
                         "{\"before\":null,\"after\":{\"id\":111,\"name\":\"scooter\",\"description\":\"Big 2-wheel scooter \",\"weight\":5.17},\"op_type\":\"I\"}",
                         "{\"before\":{\"id\":111,\"name\":\"scooter\",\"description\":\"Big 2-wheel scooter \",\"weight\":5.17},\"after\":null,\"op_type\":\"D\"}");
-        assertEquals(expected, actual);
+        assertThat(actual).containsExactlyElementsOf(expected);

Review Comment:
   ```suggestion
           assertThat(expected).containsExactlyElementsOf(actual);
   ```



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/JsonRowDataSerDeSchemaTest.java:
##########
@@ -501,11 +498,11 @@ public void testSerDeSQLTimestampFormat() throws Exception {
         byte[] serializedJson = objectMapper.writeValueAsBytes(root);
         RowData rowData = deserializationSchema.deserialize(serializedJson);
         byte[] actual = serializationSchema.serialize(rowData);
-        assertEquals(new String(serializedJson), new String(actual));
+        assertThat(new String(actual)).isEqualTo(new String(serializedJson));

Review Comment:
   again, compare on byte array



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/JsonRowDataSerDeSchemaTest.java:
##########
@@ -330,12 +328,12 @@ public void testSerDeMultiRows() throws Exception {
             byte[] serializedJson = objectMapper.writeValueAsBytes(root);
             RowData rowData = deserializationSchema.deserialize(serializedJson);
             byte[] actual = serializationSchema.serialize(rowData);
-            assertEquals(new String(serializedJson), new String(actual));
+            assertThat(actual).isIn(serializedJson);

Review Comment:
   this is not correct. It allows serializedJson to contain more stuff. You should use containsExactly. Also, let's switch the assertion order.



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/debezium/DebeziumJsonSerDeSchemaTest.java:
##########
@@ -318,7 +307,7 @@ private void testDeserializationWithMetadata(
         final SimpleCollector collector = new SimpleCollector();
         deserializationSchema.deserialize(firstLine.getBytes(StandardCharsets.UTF_8), collector);
 
-        assertEquals(1, collector.list.size());
+        assertThat(collector.list.size()).isEqualTo(1);
         testConsumer.accept(collector.list.get(0));

Review Comment:
   assertThat(collector.list.get(0)).satisfies(testConsumer);



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/ogg/OggJsonSerDeSchemaTest.java:
##########
@@ -204,7 +202,7 @@ private void testSerializationDeserialization(String resourceFile) throws Except
                         "-D(111,scooter,Big 2-wheel scooter ,5.17)");
         List<String> actual =
                 collector.list.stream().map(Object::toString).collect(Collectors.toList());
-        assertEquals(expected, actual);
+        assertThat(actual).containsExactlyElementsOf(expected);

Review Comment:
   ```suggestion
           assertThat(expected).containsExactlyElementsOf(actual);
   ```



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/JsonBatchFileSystemITCase.java:
##########
@@ -50,11 +54,27 @@ public String[] formatProperties() {
     }
 
     @Test
-    public void testParseError() throws Exception {
-        String path = new URI(resultPath()).getPath();
+    void testParseError(@TempDir java.nio.file.Path temporaryFolder) throws Exception {
+        String sql =

Review Comment:
   That approach will just create a mess when the FileSystemITCaseBase has been migrated, as no one will have a clue that something was copied into the tests themselves.
   If this test doesn't work as expected without the FileSystemITCaseBase etc. being migrated, then this test should still use junit4 (but still use assertj).



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/ogg/OggJsonSerDeSchemaTest.java:
##########
@@ -94,7 +91,7 @@ public void testTombstoneMessages() throws Exception {
         SimpleCollector collector = new SimpleCollector();
         deserializationSchema.deserialize(null, collector);
         deserializationSchema.deserialize(new byte[] {}, collector);
-        assertTrue(collector.list.isEmpty());
+        assertThat(collector.list).isNullOrEmpty();

Review Comment:
   ```suggestion
           assertThat(collector.list).isEmpty();
   ```





[GitHub] [flink] BiGsuw commented on a diff in pull request #19590: [FLINK-27352][tests] [JUnit5 Migration] Module: flink-json

Posted by GitBox <gi...@apache.org>.
BiGsuw commented on code in PR #19590:
URL: https://github.com/apache/flink/pull/19590#discussion_r862746586


##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/JsonBatchFileSystemITCase.java:
##########
@@ -20,26 +20,31 @@
 
 import org.apache.flink.table.api.TableResult;
 import org.apache.flink.table.planner.runtime.batch.sql.BatchFileSystemITCaseBase;
-import org.apache.flink.table.utils.LegacyRowResource;
+import org.apache.flink.table.utils.LegacyRowExtension;
 import org.apache.flink.types.Row;
 import org.apache.flink.util.FileUtils;
 
-import org.junit.Assert;
-import org.junit.Rule;
-import org.junit.Test;
+import org.junit.jupiter.api.Test;
+import org.junit.jupiter.api.extension.RegisterExtension;
+import org.junit.jupiter.api.io.TempDir;
 
 import java.io.File;
 import java.io.IOException;
 import java.io.PrintWriter;
 import java.net.URI;
+import java.nio.file.Path;
 import java.util.ArrayList;
 import java.util.Arrays;
 import java.util.List;
 
+import static org.assertj.core.api.Assertions.assertThat;
+
 /** ITCase to test json format for {@link JsonFormatFactory}. */
-public class JsonBatchFileSystemITCase extends BatchFileSystemITCaseBase {
+class JsonBatchFileSystemITCase extends BatchFileSystemITCaseBase {
+
+    @RegisterExtension LegacyRowExtension usesLegacyRows = LegacyRowExtension.INSTANCE;

Review Comment:
   I'll take your suggestion here and keep JUnit 4; the LegacyRowExtension class will be removed in the next commit.





[GitHub] [flink] flinkbot commented on pull request #19590: [FLINK-27352][tests] [JUnit5 Migration] Module: flink-json

Posted by GitBox <gi...@apache.org>.
flinkbot commented on PR #19590:
URL: https://github.com/apache/flink/pull/19590#issuecomment-1110544425

   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "bd1c9f50f4efa4fc07afc15be8391266cf87e70b",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "bd1c9f50f4efa4fc07afc15be8391266cf87e70b",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * bd1c9f50f4efa4fc07afc15be8391266cf87e70b UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>




[GitHub] [flink] BiGsuw commented on pull request #19590: [FLINK-27352][tests] [JUnit5 Migration] Module: flink-json

Posted by GitBox <gi...@apache.org>.
BiGsuw commented on PR #19590:
URL: https://github.com/apache/flink/pull/19590#issuecomment-1114714764

   This ticket (https://issues.apache.org/jira/browse/FLINK-27352) was created by me; my Jira username is EMing Zhou. Thanks!




[GitHub] [flink] zentol commented on a diff in pull request #19590: [FLINK-27352][tests] [JUnit5 Migration] Module: flink-json

Posted by GitBox <gi...@apache.org>.
zentol commented on code in PR #19590:
URL: https://github.com/apache/flink/pull/19590#discussion_r862825359


##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/ogg/OggJsonSerDeSchemaTest.java:
##########
@@ -119,18 +116,19 @@ public void testDeserializationWithMetadata(String resourceFile) throws Exceptio
 
         final SimpleCollector collector = new SimpleCollector();
         deserializationSchema.deserialize(firstLine.getBytes(StandardCharsets.UTF_8), collector);
-        assertEquals(1, collector.list.size());
+        assertThat(collector.list).hasSize(1);
 
         Consumer<RowData> consumer =
                 row -> {
-                    assertEquals(101, row.getInt(0));
-                    assertEquals("scooter", row.getString(1).toString());
-                    assertEquals("Small 2-wheel scooter", row.getString(2).toString());
-                    assertEquals(3.140000104904175, row.getFloat(3), 1e-15);
-                    assertEquals("OGG.TBL_TEST", row.getString(4).toString());
-                    assertEquals("id", row.getArray(5).getString(0).toString());
-                    assertEquals(1589377175766L, row.getTimestamp(6, 6).getMillisecond());
-                    assertEquals(1589384406000L, row.getTimestamp(7, 6).getMillisecond());
+                    assertThat(row.getInt(0)).isEqualTo(101);
+                    assertThat(row.getString(1).toString()).isEqualTo("scooter");
+                    assertThat(row.getString(2).toString()).isEqualTo("Small 2-wheel scooter");
+                    assertThat(row.getFloat(3))
+                            .isCloseTo(3.140000104904175f, Percentage.withPercentage(1e-15));
+                    assertThat(row.getString(4).toString()).isEqualTo("OGG.TBL_TEST");
+                    assertThat(row.getArray(5).getString(0).toString()).isEqualTo("id");
+                    assertThat(row.getTimestamp(6, 6).getMillisecond()).isEqualTo(1589377175766L);
+                    assertThat(row.getTimestamp(7, 6).getMillisecond()).isEqualTo(1589384406000L);
                 };

Review Comment:
   ```suggestion
                   });
   ```



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/maxwell/MaxwellJsonSerDerTest.java:
##########
@@ -85,23 +83,23 @@ public void testDeserializationWithMetadata() throws Exception {
                         TimestampFormat.ISO_8601);
         final SimpleCollector collector = new SimpleCollector();
         deserializationSchema.deserialize(firstLine.getBytes(StandardCharsets.UTF_8), collector);
-        assertEquals(1, collector.list.size());
+        assertThat(collector.list).hasSize(1);
         Consumer<RowData> consumer =
                 row -> {
-                    assertThat(row.getInt(0), equalTo(101));
-                    assertThat(row.getString(1).toString(), equalTo("scooter"));
-                    assertThat(row.getString(2).toString(), equalTo("Small 2-wheel scooter"));
-                    assertThat(row.getFloat(3), equalTo(3.14f));
-                    assertThat(row.getString(4).toString(), equalTo("test"));
-                    assertThat(row.getString(5).toString(), equalTo("product"));
-                    assertThat(row.getArray(6).getString(0).toString(), equalTo("id"));
-                    assertThat(row.getTimestamp(7, 3).getMillisecond(), equalTo(1596684883000L));
+                    assertThat(row.getInt(0)).isEqualTo(101);
+                    assertThat(row.getString(1).toString()).isEqualTo("scooter");
+                    assertThat(row.getString(2).toString()).isEqualTo("Small 2-wheel scooter");
+                    assertThat(row.getFloat(3)).isEqualTo(3.14f);
+                    assertThat(row.getString(4).toString()).isEqualTo("test");
+                    assertThat(row.getString(5).toString()).isEqualTo("product");
+                    assertThat(row.getArray(6).getString(0).toString()).isEqualTo("id");
+                    assertThat(row.getTimestamp(7, 3).getMillisecond()).isEqualTo(1596684883000L);
                 };

Review Comment:
   ```suggestion
                   });
   ```





[GitHub] [flink] BiGsuw commented on a diff in pull request #19590: [FLINK-27352][tests] [JUnit5 Migration] Module: flink-json

Posted by GitBox <gi...@apache.org>.
BiGsuw commented on code in PR #19590:
URL: https://github.com/apache/flink/pull/19590#discussion_r862743234


##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/JsonBatchFileSystemITCase.java:
##########
@@ -36,10 +36,14 @@
 import java.util.Arrays;
 import java.util.List;
 
+import static org.assertj.core.api.Assertions.assertThat;
+
 /** ITCase to test json format for {@link JsonFormatFactory}. */
-public class JsonBatchFileSystemITCase extends BatchFileSystemITCaseBase {
+class JsonBatchFileSystemITCase extends BatchFileSystemITCaseBase {

Review Comment:
   OK, I've changed it back to JUnit 4 and taken your suggestion to use assertThat.





[GitHub] [flink] zentol commented on a diff in pull request #19590: [FLINK-27352][tests] [JUnit5 Migration] Module: flink-json

Posted by GitBox <gi...@apache.org>.
zentol commented on code in PR #19590:
URL: https://github.com/apache/flink/pull/19590#discussion_r862825207


##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/ogg/OggJsonSerDeSchemaTest.java:
##########
@@ -119,18 +116,19 @@ public void testDeserializationWithMetadata(String resourceFile) throws Exceptio
 
         final SimpleCollector collector = new SimpleCollector();
         deserializationSchema.deserialize(firstLine.getBytes(StandardCharsets.UTF_8), collector);
-        assertEquals(1, collector.list.size());
+        assertThat(collector.list).hasSize(1);
 
         Consumer<RowData> consumer =
                 row -> {

Review Comment:
   ```suggestion
           assertThat(collector.list.get(0))
               .satisfies(row -> {
   ```



##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/ogg/OggJsonSerDeSchemaTest.java:
##########
@@ -119,18 +116,19 @@ public void testDeserializationWithMetadata(String resourceFile) throws Exceptio
 
         final SimpleCollector collector = new SimpleCollector();
         deserializationSchema.deserialize(firstLine.getBytes(StandardCharsets.UTF_8), collector);
-        assertEquals(1, collector.list.size());
+        assertThat(collector.list).hasSize(1);
 
         Consumer<RowData> consumer =
                 row -> {
-                    assertEquals(101, row.getInt(0));
-                    assertEquals("scooter", row.getString(1).toString());
-                    assertEquals("Small 2-wheel scooter", row.getString(2).toString());
-                    assertEquals(3.140000104904175, row.getFloat(3), 1e-15);
-                    assertEquals("OGG.TBL_TEST", row.getString(4).toString());
-                    assertEquals("id", row.getArray(5).getString(0).toString());
-                    assertEquals(1589377175766L, row.getTimestamp(6, 6).getMillisecond());
-                    assertEquals(1589384406000L, row.getTimestamp(7, 6).getMillisecond());
+                    assertThat(row.getInt(0)).isEqualTo(101);
+                    assertThat(row.getString(1).toString()).isEqualTo("scooter");
+                    assertThat(row.getString(2).toString()).isEqualTo("Small 2-wheel scooter");
+                    assertThat(row.getFloat(3))
+                            .isCloseTo(3.140000104904175f, Percentage.withPercentage(1e-15));
+                    assertThat(row.getString(4).toString()).isEqualTo("OGG.TBL_TEST");
+                    assertThat(row.getArray(5).getString(0).toString()).isEqualTo("id");
+                    assertThat(row.getTimestamp(6, 6).getMillisecond()).isEqualTo(1589377175766L);
+                    assertThat(row.getTimestamp(7, 6).getMillisecond()).isEqualTo(1589384406000L);
                 };
         consumer.accept(collector.list.get(0));

Review Comment:
   ```suggestion
   ```





[GitHub] [flink] BiGsuw commented on a diff in pull request #19590: [FLINK-27352][tests] [JUnit5 Migration] Module: flink-json

Posted by GitBox <gi...@apache.org>.
BiGsuw commented on code in PR #19590:
URL: https://github.com/apache/flink/pull/19590#discussion_r861939399


##########
flink-formats/flink-json/src/test/java/org/apache/flink/formats/json/debezium/DebeziumJsonSerDeSchemaTest.java:
##########
@@ -318,7 +307,7 @@ private void testDeserializationWithMetadata(
         final SimpleCollector collector = new SimpleCollector();
         deserializationSchema.deserialize(firstLine.getBytes(StandardCharsets.UTF_8), collector);
 
-        assertEquals(1, collector.list.size());
+        assertThat(collector.list.size()).isEqualTo(1);
         testConsumer.accept(collector.list.get(0));

Review Comment:
   I replaced assertThat(collector.list.size()).isEqualTo(1) with assertThat(collector.list).hasSize(1). However, I'm not sure how to use the satisfies method here; could you give me an example? Thanks.
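   (For reference, the satisfies-based form the reviewer describes later in the thread, using the names from the diff above:)

   ```java
   assertThat(collector.list).hasSize(1);
   assertThat(collector.list.get(0)).satisfies(testConsumer);
   ```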


