Posted to issues@flink.apache.org by "sunjincheng (JIRA)" <ji...@apache.org> on 2017/06/01 14:32:04 UTC
[jira] [Assigned] (FLINK-5270) Refactor the batch Scala-expression Table API tests
[ https://issues.apache.org/jira/browse/FLINK-5270?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
sunjincheng reassigned FLINK-5270:
----------------------------------
Assignee: sunjincheng
> Refactor the batch Scala-expression Table API tests
> ---------------------------------------------------
>
> Key: FLINK-5270
> URL: https://issues.apache.org/jira/browse/FLINK-5270
> Project: Flink
> Issue Type: Sub-task
> Components: Table API & SQL
> Affects Versions: 1.2.0
> Reporter: Fabian Hueske
> Assignee: sunjincheng
>
> Most tests of the batch Scala-expression Table API are full-blown integration tests, which are rather expensive to execute.
> Most of these tests should be converted into unit tests that validate the resulting execution plan (consisting of {{DataSetRel}} nodes) based on the {{TableTestBase}} class.
> In addition, we need a few integration tests that check the translation from the optimized {{DataSetRel}} plan to a DataSet program. These tests should extend {{TableProgramsCollectionTestBase}} (see FLINK-5268) and must cover the translation process of each {{DataSetRel}} operator.
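> A plan-validation unit test of the kind described above could look like the following sketch (the test class {{CalcTest}} is hypothetical; the {{batchTestUtil}}, {{addTable}}, {{verifyTable}}, {{unaryNode}}, {{batchTableNode}}, and {{term}} helpers are assumed from the {{TableTestBase}} utilities of this Flink version, so names may differ slightly):
>
> ```scala
> import org.apache.flink.api.scala._
> import org.apache.flink.table.api.scala._
> import org.apache.flink.table.utils.TableTestBase
> import org.apache.flink.table.utils.TableTestUtil._
> import org.junit.Test
>
> class CalcTest extends TableTestBase {
>
>   @Test
>   def testSimpleFilter(): Unit = {
>     val util = batchTestUtil()
>     val table = util.addTable[(Int, String)]("MyTable", 'a, 'b)
>
>     val result = table.filter('a > 0).select('b)
>
>     // Compare the optimized DataSetRel plan instead of executing the
>     // job, which avoids spinning up a cluster for every test case.
>     val expected = unaryNode(
>       "DataSetCalc",
>       batchTableNode(0),
>       term("select", "b"),
>       term("where", ">(a, 0)")
>     )
>     util.verifyTable(result, expected)
>   }
> }
> ```
>
> Such a test runs in milliseconds because it stops at plan construction, whereas the remaining {{TableProgramsCollectionTestBase}} integration tests still execute the translated DataSet program end to end.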
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)