Posted to dev@flink.apache.org by "Hequn Cheng (JIRA)" <ji...@apache.org> on 2018/01/23 12:51:00 UTC

[jira] [Created] (FLINK-8492) Fix unsupported exception for udtf with multi calc

Hequn Cheng created FLINK-8492:
----------------------------------

             Summary: Fix unsupported exception for udtf with multi calc
                 Key: FLINK-8492
                 URL: https://issues.apache.org/jira/browse/FLINK-8492
             Project: Flink
          Issue Type: Bug
          Components: Table API & SQL
            Reporter: Hequn Cheng
            Assignee: Hequn Cheng


In the following test, an unsupported exception is thrown because multiple Calc nodes exist between the Correlate and the TableFunctionScan.

@Test
def testCrossJoinWithMultiFilter(): Unit = {
  val t = testData(env).toTable(tEnv).as('a, 'b, 'c)
  val func0 = new TableFunc0

  val result = t
    .join(func0('c) as ('d, 'e))
    .select('c, 'd, 'e)
    // the two consecutive filters lead to multiple Calc nodes between
    // the Correlate and the TableFunctionScan
    .where('e > 10)
    .where('e > 20)
    .select('c, 'd)
    .toAppendStream[Row]

  result.addSink(new StreamITCase.StringSink[Row])
  env.execute()

  val expected = mutable.MutableList("Jack#22,Jack,22", "Anna#44,Anna,44")
  assertEquals(expected.sorted, StreamITCase.testResults.sorted)
}

I can see two options to fix this problem:
 1. Adapt the Calcite optimization rule so that consecutive Calc nodes are merged.
 2. Merge the multiple Calc nodes inside the correlate conversion rule (see the sketch below).
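
A minimal sketch of option 2, assuming the conversion rule can see the two consecutive LogicalCalc nodes; the helper name mergeCalcs is hypothetical, while RexProgramBuilder.mergePrograms and LogicalCalc.create are existing Calcite APIs:

import org.apache.calcite.rel.logical.LogicalCalc
import org.apache.calcite.rex.RexProgramBuilder

// Hypothetical helper: collapse two consecutive Calc nodes into one, so the
// correlate conversion rule only sees a single Calc between the Correlate
// and the TableFunctionScan.
def mergeCalcs(topCalc: LogicalCalc, bottomCalc: LogicalCalc): LogicalCalc = {
  val rexBuilder = topCalc.getCluster.getRexBuilder
  // Compose the two RexPrograms; the top program's expressions are rewritten
  // against the bottom program's input.
  val mergedProgram = RexProgramBuilder.mergePrograms(
    topCalc.getProgram,
    bottomCalc.getProgram,
    rexBuilder)
  LogicalCalc.create(bottomCalc.getInput, mergedProgram)
}

Applied repeatedly while walking from the Correlate down to the TableFunctionScan, this would collapse any number of stacked Calcs into one.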

I prefer the second one, not only because it is easier to implement, but also because whether or not an optimization rule is applied should not affect Flink's functionality.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)