Posted to user@flink.apache.org by "元始(Bob Hu)" <65...@qq.com> on 2020/07/14 01:41:42 UTC

stop job problem and ddl problem

Dear all,
I have two problems now:
1. When I stop a Flink job using the command "yarn application -kill <appId>" or "echo 'stop' | ./bin/yarn-session.sh -id <appId>", a new Flink job is sometimes started automatically. How can I avoid this? Flink version: 1.11.


2. How should I write the DDL of a sink table when one of the column types is a POJO (such as JSONObject)?
I tried this:
CREATE TABLE test (name string,age int,json any)
        WITH (
          'connector' = 'print',
          'hostname' = 'localhost',
          'port' = '9999',
          'byte-delimiter' = '10',
          'format' = 'changelog-csv',
          'changelog-csv.column-delimiter' = '|'
        )

and got:

Exception in thread "main" org.apache.flink.table.api.TableException: Type is not supported: ANY
	at org.apache.flink.table.planner.calcite.FlinkTypeFactory$.toLogicalType(FlinkTypeFactory.scala:551)
	at org.apache.flink.table.planner.calcite.FlinkTypeFactory.toLogicalType(FlinkTypeFactory.scala)
	at org.apache.flink.table.planner.operations.MergeTableLikeUtil$SchemaBuilder.appendDerivedColumns(MergeTableLikeUtil.java:408)
	at org.apache.flink.table.planner.operations.MergeTableLikeUtil$SchemaBuilder.access$000(MergeTableLikeUtil.java:201)
	at org.apache.flink.table.planner.operations.MergeTableLikeUtil.mergeTables(MergeTableLikeUtil.java:143)
	at org.apache.flink.table.planner.operations.SqlCreateTableConverter.createCatalogTable(SqlCreateTableConverter.java:117)
	at org.apache.flink.table.planner.operations.SqlCreateTableConverter.convertCreateTable(SqlCreateTableConverter.java:76)
	at org.apache.flink.table.planner.operations.SqlToOperationConverter.convert(SqlToOperationConverter.java:190)
	at org.apache.flink.table.planner.delegation.ParserImpl.parse(ParserImpl.java:78)
	at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeSql(TableEnvironmentImpl.java:678)
	at userSink.TestSink.main(TestSink.java:29)

Re: stop job problem and ddl problem

Posted by Benchao Li <li...@apache.org>.
Maybe you could try out RAW [1].

[1]
https://ci.apache.org/projects/flink/flink-docs-master/dev/table/types.html#raw
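
As a rough sketch, the column might be declared like this (the class name is an assumption for illustration, and the second argument of RAW is a serializer snapshot string that Flink generates — it cannot really be written by hand, so in practice a RAW type is usually declared through the Table API, e.g. DataTypes.RAW(...), rather than in DDL):

```sql
-- hedged sketch, not a working DDL: 'com.example.JSONObject' and '...'
-- are placeholders; the snapshot string must come from Flink itself
CREATE TABLE test (
  name STRING,
  age INT,
  json RAW('com.example.JSONObject', '...')
) WITH (
  'connector' = 'print'
)
```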

元始(Bob Hu) <65...@qq.com> wrote on Tue, Jul 14, 2020 at 9:41 AM:

> Dear all,
> I have two problems now:
> 1. When I stop a Flink job using the command "yarn application -kill <appId>"
> or "echo 'stop' | ./bin/yarn-session.sh -id <appId>", a new Flink job is
> sometimes started automatically. How can I avoid this? Flink version: 1.11.
>
> 2. How should I write the DDL of a sink table when one of the column types
> is a POJO (such as JSONObject)?
> I tried this:
>
> CREATE TABLE test (name string,age int,json any)
>         WITH (
>           'connector' = 'print',
>           'hostname' = 'localhost',
>           'port' = '9999',
>           'byte-delimiter' = '10',
>           'format' = 'changelog-csv',
>           'changelog-csv.column-delimiter' = '|'
>         )
>
> and got:
>
> Exception in thread "main" org.apache.flink.table.api.TableException: Type is not supported: ANY
> 	at org.apache.flink.table.planner.calcite.FlinkTypeFactory$.toLogicalType(FlinkTypeFactory.scala:551)
> 	at org.apache.flink.table.planner.calcite.FlinkTypeFactory.toLogicalType(FlinkTypeFactory.scala)
> 	at org.apache.flink.table.planner.operations.MergeTableLikeUtil$SchemaBuilder.appendDerivedColumns(MergeTableLikeUtil.java:408)
> 	at org.apache.flink.table.planner.operations.MergeTableLikeUtil$SchemaBuilder.access$000(MergeTableLikeUtil.java:201)
> 	at org.apache.flink.table.planner.operations.MergeTableLikeUtil.mergeTables(MergeTableLikeUtil.java:143)
> 	at org.apache.flink.table.planner.operations.SqlCreateTableConverter.createCatalogTable(SqlCreateTableConverter.java:117)
> 	at org.apache.flink.table.planner.operations.SqlCreateTableConverter.convertCreateTable(SqlCreateTableConverter.java:76)
> 	at org.apache.flink.table.planner.operations.SqlToOperationConverter.convert(SqlToOperationConverter.java:190)
> 	at org.apache.flink.table.planner.delegation.ParserImpl.parse(ParserImpl.java:78)
> 	at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeSql(TableEnvironmentImpl.java:678)
> 	at userSink.TestSink.main(TestSink.java:29)
>
>

-- 

Best,
Benchao Li