Posted to issues@flink.apache.org by GitBox <gi...@apache.org> on 2022/11/02 12:11:46 UTC

[GitHub] [flink] snuyanzin opened a new pull request, #21221: [FLINK-29846] Upgrade ArchUnit to 1.0.0

snuyanzin opened a new pull request, #21221:
URL: https://github.com/apache/flink/pull/21221

   ## What is the purpose of the change
   The PR upgrades ArchUnit to 1.0.0
   
   There are some breaking changes mentioned at [1] and [2], especially:
   ```
   The ArchUnit types Function, Predicate and Optional have been replaced by the JDK 8 equivalents
   DescribedPredicate now extends the JDK 8 Predicate, so apply(..) has to be replaced by test(..)
   
   As mentioned in Enhancements/Core ArchRules will now by default reject evaluating if the set passed to the should-clause is empty. This will break existing rules that don't check any elements in their should-clause. You can restore the old behavior by setting the ArchUnit property archRule.failOnEmptyShould=false
   ```
   For the last one, an option allowing this to be configured per rule was added in 0.23.1 [3]; a minimal usage sketch follows the references below.
   In Flink it impacts the `INTEGRATION_TEST_ENDING_WITH_ITCASE` rule, since not every module contains inheritors of `AbstractTestBase`.
   [1] https://github.com/TNG/ArchUnit/releases/tag/v1.0.0-rc1
   [2] https://github.com/TNG/ArchUnit/releases/tag/v0.23.0
   [3] https://github.com/TNG/ArchUnit/releases/tag/v0.23.1
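   
   A minimal sketch of the per-rule opt-out (the `AbstractTestBase` class-name string and the rule shape are only illustrative here; the real rule lives in `ITCaseRules`):
   ```java
   import com.tngtech.archunit.lang.ArchRule;

   import static com.tngtech.archunit.core.domain.JavaModifier.ABSTRACT;
   import static com.tngtech.archunit.lang.syntax.ArchRuleDefinition.classes;

   class ItCaseNamingRuleSketch {
       static final ArchRule INTEGRATION_TESTS_END_WITH_ITCASE =
               classes()
                       .that().areAssignableTo("org.apache.flink.test.util.AbstractTestBase")
                       .and().doNotHaveModifier(ABSTRACT)
                       .should().haveSimpleNameEndingWith("ITCase")
                       // opt this single rule out of the new fail-on-empty-should default,
                       // since some modules contain no AbstractTestBase subclasses at all
                       .allowEmptyShould(true);
   }
   ```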
   
   ## Verifying this change
   
   This change is a trivial rework / code cleanup without any test coverage.
   
   ## Does this pull request potentially affect one of the following parts:
   
  - Dependencies (does it add or upgrade a dependency): (yes)
  - The public API, i.e., is any changed class annotated with `@Public(Evolving)`: (no)
  - The serializers: (no)
  - The runtime per-record code paths (performance sensitive): (no)
  - Anything that affects deployment or recovery: JobManager (and its components), Checkpointing, Kubernetes/Yarn, ZooKeeper: (no)
  - The S3 file system connector: (no)
   
   ## Documentation
   
     - Does this pull request introduce a new feature? ( no)
     - If yes, how is the feature documented? (not applicable)
   




[GitHub] [flink] snuyanzin commented on a diff in pull request #21221: [FLINK-29846] Upgrade ArchUnit to 1.0.0

Posted by GitBox <gi...@apache.org>.
snuyanzin commented on code in PR #21221:
URL: https://github.com/apache/flink/pull/21221#discussion_r1045750909


##########
flink-architecture-tests/flink-architecture-tests-production/archunit-violations/b8900323-6aab-4e7e-9b17-f53b3c3dca46:
##########
@@ -17,11 +17,10 @@ Method <org.apache.flink.connector.hbase.sink.HBaseSinkFunction.prepareRuntimeCo
 Method <org.apache.flink.connector.hbase.sink.HBaseSinkFunction.prepareRuntimeConfiguration()> has return type <org.apache.hadoop.conf.Configuration> in (HBaseSinkFunction.java:0)
 Method <org.apache.flink.connector.hbase.source.HBaseRowDataLookupFunction.prepareRuntimeConfiguration()> calls method <org.apache.hadoop.conf.Configuration.get(java.lang.String)> in (HBaseRowDataLookupFunction.java:147)
 Method <org.apache.flink.connector.hbase.source.HBaseRowDataLookupFunction.prepareRuntimeConfiguration()> has return type <org.apache.hadoop.conf.Configuration> in (HBaseRowDataLookupFunction.java:0)
-Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:113)
-Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:114)
+Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:101)
+Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:102)
+Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:108)
 Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> has return type <org.apache.hadoop.conf.Configuration> in (HBaseConnectorOptionsUtil.java:0)
-Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.lambda$getHBaseConfiguration$0(org.apache.hadoop.conf.Configuration, java.lang.Object, java.lang.Object)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:120)
-Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.lambda$getHBaseConfiguration$0(org.apache.hadoop.conf.Configuration, java.lang.Object, java.lang.Object)> has parameter of type <org.apache.hadoop.conf.Configuration> in (HBaseConnectorOptionsUtil.java:0)

Review Comment:
   Finally it looks like it is NOT lost.
   Currently there are 3 usages of `Configuration.set(java.lang.String, java.lang.String)` logged.
   Before this update it was logged like
   ```
   Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:113)
   Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:114)
   Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.lambda$getHBaseConfiguration$0(org.apache.hadoop.conf.Configuration, java.lang.Object, java.lang.Object)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:120)
   Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.lambda$getHBaseConfiguration$0(org.apache.hadoop.conf.Configuration, java.lang.Object, java.lang.Object)> has parameter of type <org.apache.hadoop.conf.Configuration> in (HBaseConnectorOptionsUtil.java:0)
   ```
   So it required 2 lines for the last one, since the last one is in a lambda like this:
   ```java
   properties.forEach((k, v) -> hbaseClientConf.set(k.toString(), v.toString()));
   ```
   Now ArchUnit also logs the lambda with one line,
   so there are 3 lines for 3 violations:
   ```
   Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:101)
   Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:102)
   Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:108)
   
   ```





[GitHub] [flink] zentol commented on a diff in pull request #21221: [FLINK-29846] Upgrade ArchUnit to 1.0.0

Posted by GitBox <gi...@apache.org>.
zentol commented on code in PR #21221:
URL: https://github.com/apache/flink/pull/21221#discussion_r1046023273


##########
flink-architecture-tests/flink-architecture-tests-production/archunit-violations/b8900323-6aab-4e7e-9b17-f53b3c3dca46:
##########
@@ -17,11 +17,10 @@ Method <org.apache.flink.connector.hbase.sink.HBaseSinkFunction.prepareRuntimeCo
 Method <org.apache.flink.connector.hbase.sink.HBaseSinkFunction.prepareRuntimeConfiguration()> has return type <org.apache.hadoop.conf.Configuration> in (HBaseSinkFunction.java:0)
 Method <org.apache.flink.connector.hbase.source.HBaseRowDataLookupFunction.prepareRuntimeConfiguration()> calls method <org.apache.hadoop.conf.Configuration.get(java.lang.String)> in (HBaseRowDataLookupFunction.java:147)
 Method <org.apache.flink.connector.hbase.source.HBaseRowDataLookupFunction.prepareRuntimeConfiguration()> has return type <org.apache.hadoop.conf.Configuration> in (HBaseRowDataLookupFunction.java:0)
-Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:113)
-Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:114)
+Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:101)
+Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:102)
+Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:108)
 Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> has return type <org.apache.hadoop.conf.Configuration> in (HBaseConnectorOptionsUtil.java:0)
-Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.lambda$getHBaseConfiguration$0(org.apache.hadoop.conf.Configuration, java.lang.Object, java.lang.Object)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:120)
-Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.lambda$getHBaseConfiguration$0(org.apache.hadoop.conf.Configuration, java.lang.Object, java.lang.Object)> has parameter of type <org.apache.hadoop.conf.Configuration> in (HBaseConnectorOptionsUtil.java:0)

Review Comment:
   IOW that the entry is gone shouldn't be a problem afaict.





[GitHub] [flink] snuyanzin commented on a diff in pull request #21221: [FLINK-29846] Upgrade ArchUnit to 1.0.0

Posted by GitBox <gi...@apache.org>.
snuyanzin commented on code in PR #21221:
URL: https://github.com/apache/flink/pull/21221#discussion_r1011782751


##########
flink-architecture-tests/flink-architecture-tests-base/src/main/java/org/apache/flink/architecture/common/Predicates.java:
##########
@@ -48,7 +47,9 @@ public static DescribedPredicate<JavaClass> areDirectlyAnnotatedWithAtLeastOneOf
             Class<? extends Annotation>... annotations) {
         return Arrays.stream(annotations)
                 .map(CanBeAnnotated.Predicates::annotatedWith)
-                .reduce(DescribedPredicate::or)
+                .reduce(
+                        (canBeAnnotatedDescribedPredicate, other) ->
+                                canBeAnnotatedDescribedPredicate.or(other))

Review Comment:
   Seems like it happened here:
   https://github.com/TNG/ArchUnit/commit/2065e012e26b2913257efc04fc4d4868176b459e#diff-05df2dd6f3ee640302d7633e8d17fb55cee3a3a43c3d72e41e727a0ef8037dc3R191-R209





[GitHub] [flink] zentol commented on pull request #21221: [FLINK-29846] Upgrade ArchUnit to 1.0.0

Posted by GitBox <gi...@apache.org>.
zentol commented on PR #21221:
URL: https://github.com/apache/flink/pull/21221#issuecomment-1302016120

   Looks like it finds more violations now.
   
   ```
   Nov 03 07:53:12 [ERROR] Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 0.362 s <<< FAILURE! - in org.apache.flink.architecture.rules.ConnectorRules
   Nov 03 07:53:12 [ERROR] ConnectorRules.CONNECTOR_CLASSES_ONLY_DEPEND_ON_PUBLIC_API  Time elapsed: 0.359 s  <<< ERROR!
   Nov 03 07:53:12 com.tngtech.archunit.library.freeze.StoreUpdateFailedException: Updating frozen violations is disabled (enable by configuration freeze.store.default.allowStoreUpdate=true)
   ```
   
   Unfortunately the error message is still terrible.
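   
   For reference, the flag from that error is a plain ArchUnit property; a minimal `archunit.properties` sketch (the second entry is the global opt-out quoted in the PR description's release notes, not something this PR sets):
   ```
   # let FreezingArchRule update the recorded violation store instead of failing the build
   freeze.store.default.allowStoreUpdate=true
   # restore the pre-1.0 behavior for rules whose should-clause matches no classes
   archRule.failOnEmptyShould=false
   ```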




[GitHub] [flink] snuyanzin commented on a diff in pull request #21221: [FLINK-29846] Upgrade ArchUnit to 1.0.0

Posted by GitBox <gi...@apache.org>.
snuyanzin commented on code in PR #21221:
URL: https://github.com/apache/flink/pull/21221#discussion_r1015506243


##########
flink-architecture-tests/flink-architecture-tests-production/archunit-violations/b8900323-6aab-4e7e-9b17-f53b3c3dca46:
##########
@@ -17,11 +17,10 @@ Method <org.apache.flink.connector.hbase.sink.HBaseSinkFunction.prepareRuntimeCo
 Method <org.apache.flink.connector.hbase.sink.HBaseSinkFunction.prepareRuntimeConfiguration()> has return type <org.apache.hadoop.conf.Configuration> in (HBaseSinkFunction.java:0)
 Method <org.apache.flink.connector.hbase.source.HBaseRowDataLookupFunction.prepareRuntimeConfiguration()> calls method <org.apache.hadoop.conf.Configuration.get(java.lang.String)> in (HBaseRowDataLookupFunction.java:147)
 Method <org.apache.flink.connector.hbase.source.HBaseRowDataLookupFunction.prepareRuntimeConfiguration()> has return type <org.apache.hadoop.conf.Configuration> in (HBaseRowDataLookupFunction.java:0)
-Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:113)
-Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:114)
+Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:101)
+Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:102)
+Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:108)
 Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> has return type <org.apache.hadoop.conf.Configuration> in (HBaseConnectorOptionsUtil.java:0)
-Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.lambda$getHBaseConfiguration$0(org.apache.hadoop.conf.Configuration, java.lang.Object, java.lang.Object)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:120)
-Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.lambda$getHBaseConfiguration$0(org.apache.hadoop.conf.Configuration, java.lang.Object, java.lang.Object)> has parameter of type <org.apache.hadoop.conf.Configuration> in (HBaseConnectorOptionsUtil.java:0)

Review Comment:
   This line disappears after applying this commit https://github.com/TNG/ArchUnit/commit/a404fb4b5a948011b7764f5ba0a31426bc98bfe8 done under https://github.com/TNG/ArchUnit/pull/847
   It looks like it merges the 2 lambda refs (in this case) into one.
   Within `org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil#getHBaseConfiguration`
   there are 3 `set` calls:
   ```java
       public static Configuration getHBaseConfiguration(ReadableConfig tableOptions) {
           // create default configuration from current runtime env (`hbase-site.xml` in classpath)
           // first,
           Configuration hbaseClientConf = HBaseConfigurationUtil.getHBaseConfiguration();
           hbaseClientConf.set(HConstants.ZOOKEEPER_QUORUM, tableOptions.get(ZOOKEEPER_QUORUM));
           hbaseClientConf.set(
                   HConstants.ZOOKEEPER_ZNODE_PARENT, tableOptions.get(ZOOKEEPER_ZNODE_PARENT));
           // add HBase properties
           final Properties properties =
                   getHBaseClientProperties(
                           ((org.apache.flink.configuration.Configuration) tableOptions).toMap());
           properties.forEach((k, v) -> hbaseClientConf.set(k.toString(), v.toString()));
           return hbaseClientConf;
       }
   ```
   and now there are only four lines for that
   ```
   Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:101)
   Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:102)
   Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:108)
   Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> has return type <org.apache.hadoop.conf.Configuration> in (HBaseConnectorOptionsUtil.java:0)
   ```
   
   It is still not clear why there was one more line before that... I checked other classes with lambdas, e.g. `org.apache.flink.connector.jdbc.internal.converter.OracleRowConverter`, and there is no such behavior...
   Will have a look a bit more, maybe I will find something.





[GitHub] [flink] zentol commented on a diff in pull request #21221: [FLINK-29846] Upgrade ArchUnit to 1.0.0

Posted by GitBox <gi...@apache.org>.
zentol commented on code in PR #21221:
URL: https://github.com/apache/flink/pull/21221#discussion_r1045782253


##########
flink-architecture-tests/flink-architecture-tests-production/archunit-violations/b8900323-6aab-4e7e-9b17-f53b3c3dca46:
##########
@@ -17,11 +17,10 @@ Method <org.apache.flink.connector.hbase.sink.HBaseSinkFunction.prepareRuntimeCo
 Method <org.apache.flink.connector.hbase.sink.HBaseSinkFunction.prepareRuntimeConfiguration()> has return type <org.apache.hadoop.conf.Configuration> in (HBaseSinkFunction.java:0)
 Method <org.apache.flink.connector.hbase.source.HBaseRowDataLookupFunction.prepareRuntimeConfiguration()> calls method <org.apache.hadoop.conf.Configuration.get(java.lang.String)> in (HBaseRowDataLookupFunction.java:147)
 Method <org.apache.flink.connector.hbase.source.HBaseRowDataLookupFunction.prepareRuntimeConfiguration()> has return type <org.apache.hadoop.conf.Configuration> in (HBaseRowDataLookupFunction.java:0)
-Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:113)
-Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:114)
+Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:101)
+Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:102)
+Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:108)
 Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> has return type <org.apache.hadoop.conf.Configuration> in (HBaseConnectorOptionsUtil.java:0)
-Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.lambda$getHBaseConfiguration$0(org.apache.hadoop.conf.Configuration, java.lang.Object, java.lang.Object)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:120)
-Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.lambda$getHBaseConfiguration$0(org.apache.hadoop.conf.Configuration, java.lang.Object, java.lang.Object)> has parameter of type <org.apache.hadoop.conf.Configuration> in (HBaseConnectorOptionsUtil.java:0)

Review Comment:
   AFAICT this hasn't got anything to do with "merging" errors.
   
   The lambda issue was for having a Hadoop configuration _parameter_, a different error altogether, and that is now gone.
   
   I would assume that the parameter type check is just no longer applied to lambdas, presumably because when you define such a lambda you either
   a) access said type somewhere else (to pass it as a parameter), or
   b) pass the lambda as an argument, but then you get another violation anyway.
   
   (nevermind that this "architecture" test is pretty weird since it complains about _every single usage of external libraries_).





[GitHub] [flink] zentol commented on a diff in pull request #21221: [FLINK-29846] Upgrade ArchUnit to 1.0.0

Posted by GitBox <gi...@apache.org>.
zentol commented on code in PR #21221:
URL: https://github.com/apache/flink/pull/21221#discussion_r1015150448


##########
flink-architecture-tests/flink-architecture-tests-production/archunit-violations/b8900323-6aab-4e7e-9b17-f53b3c3dca46:
##########
@@ -17,11 +17,10 @@ Method <org.apache.flink.connector.hbase.sink.HBaseSinkFunction.prepareRuntimeCo
 Method <org.apache.flink.connector.hbase.sink.HBaseSinkFunction.prepareRuntimeConfiguration()> has return type <org.apache.hadoop.conf.Configuration> in (HBaseSinkFunction.java:0)
 Method <org.apache.flink.connector.hbase.source.HBaseRowDataLookupFunction.prepareRuntimeConfiguration()> calls method <org.apache.hadoop.conf.Configuration.get(java.lang.String)> in (HBaseRowDataLookupFunction.java:147)
 Method <org.apache.flink.connector.hbase.source.HBaseRowDataLookupFunction.prepareRuntimeConfiguration()> has return type <org.apache.hadoop.conf.Configuration> in (HBaseRowDataLookupFunction.java:0)
-Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:113)
-Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:114)
+Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:101)
+Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:102)
+Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:108)
 Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.getHBaseConfiguration(org.apache.flink.configuration.ReadableConfig)> has return type <org.apache.hadoop.conf.Configuration> in (HBaseConnectorOptionsUtil.java:0)
-Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.lambda$getHBaseConfiguration$0(org.apache.hadoop.conf.Configuration, java.lang.Object, java.lang.Object)> calls method <org.apache.hadoop.conf.Configuration.set(java.lang.String, java.lang.String)> in (HBaseConnectorOptionsUtil.java:120)
-Method <org.apache.flink.connector.hbase.table.HBaseConnectorOptionsUtil.lambda$getHBaseConfiguration$0(org.apache.hadoop.conf.Configuration, java.lang.Object, java.lang.Object)> has parameter of type <org.apache.hadoop.conf.Configuration> in (HBaseConnectorOptionsUtil.java:0)

Review Comment:
   We seem to have lost this violation.





[GitHub] [flink] zentol commented on a diff in pull request #21221: [FLINK-29846] Upgrade ArchUnit to 1.0.0

Posted by GitBox <gi...@apache.org>.
zentol commented on code in PR #21221:
URL: https://github.com/apache/flink/pull/21221#discussion_r1011673954


##########
flink-architecture-tests/flink-architecture-tests-test/src/main/java/org/apache/flink/architecture/rules/ITCaseRules.java:
##########
@@ -57,6 +57,7 @@ public class ITCaseRules {
                                     .doNotHaveModifier(ABSTRACT)
                                     .should()
                                     .haveSimpleNameEndingWith("ITCase"))
+                    .allowEmptyShould(true)

Review Comment:
   Add a comment explaining why we allow this.



##########
flink-architecture-tests/flink-architecture-tests-base/src/main/java/org/apache/flink/architecture/common/Predicates.java:
##########
@@ -61,7 +62,7 @@ public static DescribedPredicate<JavaClass> containAnyFieldsInClassHierarchyThat
             DescribedPredicate<? super JavaField> predicate) {
         return new ContainAnyFieldsThatPredicate<>(
                 "fields",
-                new ChainableFunction<JavaClass, Set<JavaField>>() {
+                new Function<JavaClass, Set<JavaField>>() {

Review Comment:
   can this be a lambda function?
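   
   A sketch of how it could look (assuming the anonymous class merely returned the class's fields; its body is not visible in this hunk):
   ```java
   return new ContainAnyFieldsThatPredicate<>(
           "fields",
           JavaClass::getAllFields, // now a plain java.util.function.Function<JavaClass, Set<JavaField>>
           predicate);
   ```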



##########
flink-architecture-tests/flink-architecture-tests-base/src/main/java/org/apache/flink/architecture/common/Predicates.java:
##########
@@ -48,7 +47,9 @@ public static DescribedPredicate<JavaClass> areDirectlyAnnotatedWithAtLeastOneOf
             Class<? extends Annotation>... annotations) {
         return Arrays.stream(annotations)
                 .map(CanBeAnnotated.Predicates::annotatedWith)
-                .reduce(DescribedPredicate::or)
+                .reduce(
+                        (canBeAnnotatedDescribedPredicate, other) ->
+                                canBeAnnotatedDescribedPredicate.or(other))

Review Comment:
   Did the added static `or` method ruin this? :weary:  
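   
   If that suspicion is right, here is a self-contained illustration with plain JDK types (not the actual ArchUnit classes) of why the method reference stops compiling:
   ```java
   import java.util.function.BinaryOperator;

   class OrAmbiguityDemo {
       static class P {
           P or(P other) { return other; }                   // instance method, usable as (P, P) -> P
           static P or(P first, P second) { return second; } // newly added static counterpart
       }

       public static void main(String[] args) {
           // BinaryOperator<P> op = P::or;           // no longer compiles: the reference is ambiguous
           BinaryOperator<P> op = (a, b) -> a.or(b);  // the explicit lambda from this change resolves it
           System.out.println(op.apply(new P(), new P()) != null); // prints: true
       }
   }
   ```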





[GitHub] [flink] flinkbot commented on pull request #21221: [FLINK-29846] Upgrade ArchUnit to 1.0.0

Posted by GitBox <gi...@apache.org>.
flinkbot commented on PR #21221:
URL: https://github.com/apache/flink/pull/21221#issuecomment-1300270832

   <!--
   Meta data
   {
     "version" : 1,
     "metaDataEntries" : [ {
       "hash" : "8ab7e44f8ee7512fe77ad13ae8c079950de4c3f3",
       "status" : "UNKNOWN",
       "url" : "TBD",
       "triggerID" : "8ab7e44f8ee7512fe77ad13ae8c079950de4c3f3",
       "triggerType" : "PUSH"
     } ]
   }-->
   ## CI report:
   
   * 8ab7e44f8ee7512fe77ad13ae8c079950de4c3f3 UNKNOWN
   
   <details>
   <summary>Bot commands</summary>
     The @flinkbot bot supports the following commands:
   
    - `@flinkbot run azure` re-run the last Azure build
   </details>




[GitHub] [flink] zentol merged pull request #21221: [FLINK-29846] Upgrade ArchUnit to 1.0.0

Posted by GitBox <gi...@apache.org>.
zentol merged PR #21221:
URL: https://github.com/apache/flink/pull/21221


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@flink.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org