Posted to issues@flink.apache.org by "Jingsong Lee (Jira)" <ji...@apache.org> on 2020/04/01 09:13:00 UTC

[jira] [Comment Edited] (FLINK-16860) TableException: Failed to push filter into table source! when upgrading flink to 1.9.2

    [ https://issues.apache.org/jira/browse/FLINK-16860?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17072521#comment-17072521 ] 

Jingsong Lee edited comment on FLINK-16860 at 4/1/20, 9:12 AM:
---------------------------------------------------------------

Ah... Yes, after FLINK-12399 there are bugs in {{OrcTableSource}}.

The case is that there are filters to be pushed down to the source, but the source cannot consume them. We need to distinguish empty applied filters from no pushdown at all in {{explainSource}}.
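A minimal sketch of what this fix amounts to (the class and field names here are hypothetical illustrations, not Flink's actual {{OrcTableSource}} internals): once predicate pushdown has been attempted, {{explainSource()}} must return a different string than before, even when the source consumed none of the offered predicates, so the planner's equality check does not throw "Failed to push filter into table source!".

```java
import java.util.List;

class MyOrcTableSource /* would implement FilterableTableSource<Row> in Flink */ {

    // null  -> applyPredicate() was never called (no pushdown attempted)
    // empty -> pushdown attempted, but no predicate could be consumed
    private final List<String> appliedPredicates;

    MyOrcTableSource(List<String> appliedPredicates) {
        this.appliedPredicates = appliedPredicates;
    }

    public boolean isFilterPushedDown() {
        return appliedPredicates != null;
    }

    public String explainSource() {
        if (appliedPredicates == null) {
            // Pushdown was never attempted.
            return "MyOrcTableSource";
        }
        // Printing the (possibly empty) list makes the explanation differ
        // from the no-pushdown case, which is exactly the distinction the
        // comment above says was missing.
        return "MyOrcTableSource(filter=" + appliedPredicates + ")";
    }
}
```

With this shape, a source that was offered predicates but could apply none of them still reports {{filter=[]}}, so the planner can tell that pushdown happened.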


was (Author: lzljs3620320):
Ah... Yes, after FLINK-12399 there are bugs in {{FilterableTableSource}} implementations, like orc and parquet.

The case is that there are filters to be pushed down to the source, but the source cannot consume them. We need to distinguish empty applied filters from no pushdown at all in {{explainSource}}.

> TableException: Failed to push filter into table source! when upgrading flink to 1.9.2
> --------------------------------------------------------------------------------------
>
>                 Key: FLINK-16860
>                 URL: https://issues.apache.org/jira/browse/FLINK-16860
>             Project: Flink
>          Issue Type: Bug
>          Components: Connectors / ORC, Table SQL / API
>    Affects Versions: 1.9.2, 1.10.0
>         Environment: flink 1.8.2
> flink 1.9.2
>            Reporter: Nikola
>            Priority: Major
>             Fix For: 1.9.3, 1.10.1
>
>         Attachments: flink-1.8.2.txt, flink-1.9.2.txt
>
>
> We have a batch job which currently runs on a flink cluster running 1.8.2.
>  The job runs fine. We wanted to upgrade to flink 1.10, but that yielded errors, so we started downgrading until we found that the issue was introduced in flink 1.9.2.
> The job on 1.9.2 fails with:
> {code:java}
> Caused by: org.apache.flink.table.api.TableException: Failed to push filter into table source! table source with pushdown capability must override and change explainSource() API to explain the pushdown applied!{code}
> This does not happen on flink 1.8.2. You can check the logs for the exact same job, just running on different cluster versions: [^flink-1.8.2.txt] [^flink-1.9.2.txt]
>  
> I tried to narrow it down, and it seems that this exception was added in FLINK-12399; there was a small discussion regarding the exception: [https://github.com/apache/flink/pull/8468#discussion_r329876088]
> Our code looks something like this:
>  
> {code:java}
> String tempTableName = "tempTable";
> String sql = SqlBuilder.buildSql(tempTableName);
> BatchTableEnvironment tableEnv = BatchTableEnvironment.create(env);
> OrcTableSource orcTableSource = OrcTableSource.builder()
>  .path(hdfsFolder, true)
>  .forOrcSchema(ORC.getSchema())
>  .withConfiguration(config)
>  .build();
> tableEnv.registerTableSource(tempTableName, orcTableSource);
> Table tempTable = tableEnv.sqlQuery(sql);
> return tableEnv.toDataSet(tempTable, Row.class); 
> {code}
> Where the built SQL is nothing more than
> {code:java}
> SELECT * FROM table WHERE id IN (1,2,3) AND mid IN(4,5,6){code}
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)