Posted to issues@flink.apache.org by "Timo Walther (JIRA)" <ji...@apache.org> on 2015/12/03 14:53:10 UTC
[jira] [Commented] (FLINK-2099) Add a SQL API
[ https://issues.apache.org/jira/browse/FLINK-2099?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15037796#comment-15037796 ]
Timo Walther commented on FLINK-2099:
-------------------------------------
I will soon present my SQL prototype. However, I'm looking for more complex ITCases. Do you know of any suitable benchmarks I could use for that?
It would be great if the benchmark contained input tables (e.g. as CSV files), complex SQL queries, and the results (also as CSV files). TPC-H would actually be perfect, but it does not have the right license, if I remember correctly. Do you know of an open-source alternative?
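The CSV-in / SQL / CSV-out test shape described above can be sketched outside of Flink. The following is a minimal illustration (not Flink code) using Python's sqlite3 as a stand-in engine; the table name, columns, and query are illustrative assumptions only.

```python
import csv
import io
import sqlite3

def run_case(tables, query):
    """Load CSV strings as tables, run a SQL query, return the result rows.

    In an actual ITCase, the expected result would live in a third CSV file
    and be compared against these rows.
    """
    conn = sqlite3.connect(":memory:")
    for name, csv_text in tables.items():
        rows = list(csv.reader(io.StringIO(csv_text)))
        header, data = rows[0], rows[1:]
        conn.execute(f"CREATE TABLE {name} ({', '.join(header)})")
        placeholders = ", ".join("?" for _ in header)
        conn.executemany(f"INSERT INTO {name} VALUES ({placeholders})", data)
    return conn.execute(query).fetchall()

# Hypothetical input table and query, in the benchmark shape sketched above.
orders_csv = "id,amount\n1,10\n2,30\n3,30\n"
result = run_case(
    {"orders": orders_csv},
    "SELECT amount, COUNT(*) FROM orders GROUP BY amount ORDER BY amount",
)
```

A real benchmark harness would iterate over directories of such (input CSVs, query, expected CSV) triples rather than inlining them.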
> Add a SQL API
> -------------
>
> Key: FLINK-2099
> URL: https://issues.apache.org/jira/browse/FLINK-2099
> Project: Flink
> Issue Type: New Feature
> Components: Table API
> Reporter: Timo Walther
> Assignee: Timo Walther
>
> From the mailing list:
> Fabian: Flink's Table API is pretty close to what SQL provides. IMO, the best
> approach would be to leverage that and build a SQL parser (maybe together
> with a logical optimizer) on top of the Table API. The parser (and optimizer)
> could be built using Apache Calcite, which provides exactly this.
> Since the Table API is still a fairly new component and not very feature
> rich, it might make sense to extend and strengthen it before putting
> something major on top.
> Ted: It would also be relatively simple (I think) to retarget Drill to Flink if
> Flink doesn't provide enough typing metadata to do traditional SQL.
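The layering Fabian describes (SQL parsed and translated onto the Table API) can be illustrated with a toy sketch. A real implementation would use Apache Calcite's parser and optimizer; the tiny grammar and the emitted method names below are illustrative assumptions, not Flink's actual API.

```python
import re

# Toy grammar: SELECT <cols> FROM <table> [WHERE <predicate>]
# This only demonstrates the translation idea, not real SQL parsing.
SQL = re.compile(
    r"SELECT\s+(?P<select>.+?)\s+FROM\s+(?P<table>\w+)"
    r"(?:\s+WHERE\s+(?P<where>.+))?$",
    re.IGNORECASE,
)

def to_table_api(sql):
    """Re-express a tiny SQL subset as a Table-API-style method chain."""
    m = SQL.match(sql.strip())
    if not m:
        raise ValueError("unsupported query")
    calls = [f"table({m.group('table')!r})"]
    if m.group("where"):
        calls.append(f"filter({m.group('where')!r})")
    calls.append(f"select({m.group('select')!r})")
    return ".".join(calls)

print(to_table_api("SELECT name, age FROM users WHERE age > 30"))
# table('users').filter('age > 30').select('name, age')
```

The point of the layering is that the SQL front end only produces a logical plan; everything downstream (optimization, execution) is shared with the existing Table API.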
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)