Posted to user@flink.apache.org by Soheil Pourbafrani <so...@gmail.com> on 2019/07/02 18:36:08 UTC

Can Flink infer the table column types

Hi

I want to load MySQL tables in Flink without needing to specify column
names and types (like what we can do with Apache Spark DataFrames).

Using the JDBCInputFormat, we have to pass the table field types to the
setRowTypeInfo method. I couldn't find any way to make Flink infer the
column types.

In addition, I was wondering if it's possible to do the same using a
TableSource?

thanks

Re: Can Flink infer the table column types

Posted by Dawid Wysakowicz <dw...@apache.org>.
Hi,

Unfortunately, automatic schema inference for the JDBC source is not
supported yet. There is also no JDBC TableSource yet, but you should be
able to write one yourself that reuses the JDBCInputFormat. You may take
a look at the BatchTableSource/StreamTableSource interfaces and the
corresponding methods for creating a DataSet/DataStream:
ExecutionEnvironment.createInput / StreamExecutionEnvironment.createInput.
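
A rough sketch of such a wrapper could look like this (untested; the
class name and constructor are made up for illustration, and you still
have to supply the field names and types yourself):

    import org.apache.flink.api.common.typeinfo.TypeInformation;
    import org.apache.flink.api.java.DataSet;
    import org.apache.flink.api.java.ExecutionEnvironment;
    import org.apache.flink.api.java.io.jdbc.JDBCInputFormat;
    import org.apache.flink.api.java.typeutils.RowTypeInfo;
    import org.apache.flink.table.api.TableSchema;
    import org.apache.flink.table.sources.BatchTableSource;
    import org.apache.flink.types.Row;

    // Hypothetical wrapper that exposes a JDBCInputFormat as a batch table
    // source. Field names and types still have to be provided by the caller.
    public class JdbcBatchTableSource implements BatchTableSource<Row> {

        private final JDBCInputFormat inputFormat;
        private final String[] fieldNames;
        private final TypeInformation<?>[] fieldTypes;
        private final RowTypeInfo rowTypeInfo;

        public JdbcBatchTableSource(JDBCInputFormat inputFormat,
                                    String[] fieldNames,
                                    TypeInformation<?>[] fieldTypes) {
            this.inputFormat = inputFormat;
            this.fieldNames = fieldNames;
            this.fieldTypes = fieldTypes;
            this.rowTypeInfo = new RowTypeInfo(fieldTypes, fieldNames);
        }

        @Override
        public DataSet<Row> getDataSet(ExecutionEnvironment execEnv) {
            // Reuse the existing JDBCInputFormat to produce the DataSet.
            return execEnv.createInput(inputFormat, rowTypeInfo);
        }

        @Override
        public TypeInformation<Row> getReturnType() {
            return rowTypeInfo;
        }

        @Override
        public TableSchema getTableSchema() {
            return new TableSchema(fieldNames, fieldTypes);
        }

        @Override
        public String explainSource() {
            return "JdbcBatchTableSource(" + String.join(", ", fieldNames) + ")";
        }
    }

You can then register it with BatchTableEnvironment.registerTableSource
and query it with the Table API / SQL. For streaming, the idea is the
same: implement StreamTableSource and build the DataStream with
StreamExecutionEnvironment.createInput.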

You can expect more (table) connectors in Flink versions 1.10+. They
should support automatic schema inference where possible.

Best,

Dawid
