Posted to issues@iceberg.apache.org by GitBox <gi...@apache.org> on 2021/01/16 18:39:34 UTC

[GitHub] [iceberg] rdblue edited a comment on pull request #1893: Flink: Support filter pushdown in IcebergTableSource

rdblue edited a comment on pull request #1893:
URL: https://github.com/apache/iceberg/pull/1893#issuecomment-761613165


   @zhangjun0x01, there are still a few things to fix in the tests, mostly minor. But I also found a major problem: `TestFlinkTableSource` now takes a very long time to run. The problem is that the tests run for each file format and for 3 different catalog configurations, so each test runs 9 times, and because each test actually runs SQL, every run is slow. The whole suite takes much longer than needed; on my machine, it took 20 minutes with code coverage turned on!
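
   To make the blow-up concrete, a JUnit 4 cross-product parameterization along these lines produces 3 x 3 = 9 parameter sets, so every test method runs 9 times. This is only a sketch; the catalog names, class name, and exact parameter shape in `TestFlinkTableSource` may differ:

   ```java
   import java.util.ArrayList;
   import java.util.List;
   import org.apache.iceberg.FileFormat;
   import org.junit.runner.RunWith;
   import org.junit.runners.Parameterized;

   // Sketch only: a parameterized test class whose parameters are the cross
   // product of catalog configuration and file format.
   @RunWith(Parameterized.class)
   public class FlinkTableSourceParameterSketch {
     @Parameterized.Parameters(name = "catalog={0}, format={1}")
     public static Iterable<Object[]> parameters() {
       List<Object[]> params = new ArrayList<>();
       for (String catalog : new String[] {"testhive", "testhadoop", "testhadoop_basenamespace"}) {
         for (FileFormat format : new FileFormat[] {FileFormat.AVRO, FileFormat.ORC, FileFormat.PARQUET}) {
           params.add(new Object[] {catalog, format});
         }
       }
       return params; // 3 catalogs x 3 formats = 9 runs of every @Test method
     }

     // constructor and @Test methods omitted from this sketch
   }
   ```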
   
   The filter pushdown tests only need to run for one catalog and one file format because the purpose of those tests is to validate the assumptions of the `FlinkFilter` class with real filters from Flink SQL. The file format and catalog are orthogonal to that, so we don't need to test every combination. Can you change the parameterization to run with only Avro and a single catalog case?
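
   A minimal sketch of that narrowed parameterization, assuming the same JUnit 4 structure as the sketch above (the catalog name and method shape are illustrative, not the actual `TestFlinkTableSource` code):

   ```java
   // Sketch only: replaces the cross-product parameters() above. The filter
   // pushdown assertions exercise filter conversion, which does not depend on
   // the catalog or the on-disk file format, so one combination is enough.
   @Parameterized.Parameters(name = "catalog={0}, format={1}")
   public static Iterable<Object[]> parameters() {
     return java.util.Collections.singletonList(new Object[] {"testhadoop", FileFormat.AVRO});
   }
   ```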




