Posted to issues@flink.apache.org by "yuemeng (JIRA)" <ji...@apache.org> on 2017/07/11 08:26:00 UTC

[jira] [Created] (FLINK-7148) Flink SQL API support DDL

yuemeng created FLINK-7148:
------------------------------

             Summary: Flink SQL API support  DDL
                 Key: FLINK-7148
                 URL: https://issues.apache.org/jira/browse/FLINK-7148
             Project: Flink
          Issue Type: Bug
          Components: Table API & SQL
            Reporter: yuemeng


Currently, Flink SQL does not support DDL operations; a user can only register a table by calling registerTableInternal on the TableEnvironment. We should support DDL statements such as CREATE TABLE and CREATE FUNCTION, for example:
{code}
CREATE TABLE kafka_source (
  id INT,
  price INT
) PROPERTIES (
  category = 'source',
  type = 'kafka',
  version = '0.9.0.1',
  separator = ',',
  topic = 'test',
  brokers = 'xxxx:9092',
  group_id = 'test'
);

CREATE TABLE db_sink (
  id INT,
  price DOUBLE
) PROPERTIES (
  category = 'sink',
  type = 'mysql',
  table_name = 'udaf_test',
  url = 'jdbc:mysql://127.0.0.1:3308/ds?useUnicode=true&characterEncoding=UTF8',
  username = 'ds_dev',
  password = 's]k51_(>R'
);

CREATE TEMPORARY FUNCTION 'AVGUDAF' AS 'com.xxxx.server.codegen.aggregate.udaf.avg.IntegerAvgUDAF';
INSERT INTO db_sink SELECT id, AVGUDAF(price) FROM kafka_source GROUP BY id;

{code}
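As a rough illustration of what handling such a statement might involve, the sketch below extracts the key/value pairs from the proposed PROPERTIES (...) clause into a map, which a table factory could then use to instantiate a source or sink. This is a hypothetical helper written for this example, not part of any existing Flink API; the class name and parsing approach are assumptions.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical helper for illustration only: pulls key = 'value' pairs
// out of the PROPERTIES clause of the proposed CREATE TABLE syntax.
// A real implementation would extend the SQL parser instead.
public class DdlPropertiesParser {

    // Matches a single key = 'value' pair; the value may contain any
    // character except a single quote.
    private static final Pattern PAIR =
        Pattern.compile("(\\w+)\\s*=\\s*'([^']*)'");

    public static Map<String, String> parseProperties(String ddl) {
        Map<String, String> props = new LinkedHashMap<>();
        int start = ddl.indexOf("PROPERTIES");
        if (start < 0) {
            return props; // no PROPERTIES clause present
        }
        // Only scan from the PROPERTIES keyword onward, so the column
        // definitions earlier in the statement are never touched.
        Matcher m = PAIR.matcher(ddl.substring(start));
        while (m.find()) {
            props.put(m.group(1), m.group(2));
        }
        return props;
    }

    public static void main(String[] args) {
        String ddl = "CREATE TABLE kafka_source (id INT, price INT) "
            + "PROPERTIES (category = 'source', type = 'kafka', topic = 'test')";
        // prints {category=source, type=kafka, topic=test}
        System.out.println(parseProperties(ddl));
    }
}
```

The extracted map would then drive dispatch, e.g. category = 'source' plus type = 'kafka' selecting a Kafka table source with the remaining entries passed through as connector configuration.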



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)