Posted to issues@flink.apache.org by "yuemeng (JIRA)" <ji...@apache.org> on 2017/07/11 08:01:00 UTC

[jira] [Created] (FLINK-7146) FLINK SQLs support DDL

yuemeng created FLINK-7146:
------------------------------

             Summary: FLINK SQLs support DDL
                 Key: FLINK-7146
                 URL: https://issues.apache.org/jira/browse/FLINK-7146
             Project: Flink
          Issue Type: Sub-task
          Components: Table API & SQL
            Reporter: yuemeng


Currently, Flink SQL does not support DDL; a table can only be registered programmatically, for example by calling registerTableInternal on the TableEnvironment. We should support DDL statements such as CREATE TABLE and CREATE FUNCTION, for example (a sketch of the programmatic registration required today follows the example below):
{code}

CREATE TABLE kafka_source (
  id INT,
  price INT
) PROPERTIES (
  category = 'source',
  type = 'kafka',
  version = '0.9.0.1',
  separator = ',',
  topic = 'test',
  brokers = 'xxxxxx:9092',
  group_id = 'test'
);

CREATE TABLE db_sink (
  id INT,
  price DOUBLE
) PROPERTIES (
  category = 'sink',
  type = 'mysql',
  table_name = 'udaf_test',
  url = 'jdbc:mysql://127.0.0.1:3308/ds?useUnicode=true&characterEncoding=UTF8',
  username = 'ds_dev',
  password = 's]k51_(>R'
);

CREATE TEMPORARY FUNCTION 'AVGUDAF' AS 'com.xxxxx.server.codegen.aggregate.udaf.avg.IntegerAvgUDAF';

INSERT INTO db_sink SELECT id, AVGUDAF(price) FROM kafka_source GROUP BY id;


{code}
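
For context, this is roughly what the same pipeline has to look like today when everything is wired up in code. This is only a minimal sketch against the Flink 1.3-era Java Table API: the topic, broker, group id, field layout, and the IntegerAvgUDAF class are taken from the example above, the CSV parsing is a hypothetical helper, and the MySQL sink is omitted because it would need a hand-written JDBC sink function.

{code}
import java.util.Properties;

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09;
import org.apache.flink.streaming.util.serialization.SimpleStringSchema;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.api.java.StreamTableEnvironment;
import org.apache.flink.types.Row;

import com.xxxxx.server.codegen.aggregate.udaf.avg.IntegerAvgUDAF; // user AggregateFunction from the example

public class ProgrammaticRegistration {
  public static void main(String[] args) throws Exception {
    StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
    StreamTableEnvironment tableEnv = TableEnvironment.getTableEnvironment(env);

    // Connector properties that the proposed PROPERTIES clause would carry declaratively.
    Properties kafkaProps = new Properties();
    kafkaProps.setProperty("bootstrap.servers", "xxxxxx:9092");
    kafkaProps.setProperty("group.id", "test");

    // Source: read the 'test' topic and split each line on ',' into (id, price).
    DataStream<Tuple2<Integer, Integer>> source = env
        .addSource(new FlinkKafkaConsumer09<>("test", new SimpleStringSchema(), kafkaProps))
        .map(new MapFunction<String, Tuple2<Integer, Integer>>() {
          @Override
          public Tuple2<Integer, Integer> map(String line) {
            String[] fields = line.split(",");
            return Tuple2.of(Integer.parseInt(fields[0]), Integer.parseInt(fields[1]));
          }
        });

    // Today the table and the UDAF have to be registered in code; the proposal moves this into DDL.
    tableEnv.registerDataStream("kafka_source", source, "id, price");
    tableEnv.registerFunction("AVGUDAF", new IntegerAvgUDAF());

    // The query itself is already plain SQL; only the table/function setup has no SQL form.
    Table result = tableEnv.sql("SELECT id, AVGUDAF(price) FROM kafka_source GROUP BY id");

    // Writing to MySQL would still require converting back to a stream and attaching a JDBC sink.
    tableEnv.toRetractStream(result, Row.class).print();

    env.execute();
  }
}
{code}

With DDL support, all of the registration boilerplate above would collapse into the CREATE TABLE and CREATE TEMPORARY FUNCTION statements, so the whole job could be expressed and submitted as SQL.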


