Posted to user-zh@flink.apache.org by whh_960101 <wh...@163.com> on 2020/10/14 08:20:56 UTC

pyflink sql: selecting a field name that contains special characters

Hello, I am using pyflink with the code below and have run into the following problem:
source = st_env.from_path('source')  # st_env is the StreamTableEnvironment; 'source' is the Kafka source table
table = source.select("@timestamp").execute_insert('sink').get_job_client().get_job_execution_result().result()


The @timestamp field name in the JSON messages on the Kafka source is fixed, and I need to read that field for further processing. @timestamp involves both the special character @ and the keyword timestamp, so the code above fails with an SQL parsing error. How should I change this part? I searched online and tried wrapping the name in ``, '', and "", but none of them work.
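
(For reference, below is a minimal, self-contained sketch of two ways such a field might be referenced: through plain SQL with a backtick-quoted identifier, and through the Table API col() expression, which takes the column name literally rather than running it through the expression-string parser. This is an illustration built on assumptions, not a verified fix: the datagen table stands in for the real Kafka 'source' table, and it targets a recent PyFlink release; the expression DSL and EnvironmentSettings.in_streaming_mode() are not available in the 1.11 release that was current when this was posted.)

from pyflink.table import EnvironmentSettings, TableEnvironment
from pyflink.table.expressions import col

t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# Hypothetical stand-in for the Kafka source: a column whose name
# contains '@' and the keyword 'timestamp'.
t_env.execute_sql("""
    CREATE TABLE source (
        `@timestamp` STRING,
        payload      STRING
    ) WITH (
        'connector' = 'datagen',
        'number-of-rows' = '5'
    )
""")

# Option 1: plain SQL, quoting the identifier with backticks.
via_sql = t_env.sql_query("SELECT `@timestamp` FROM source")

# Option 2: Table API with col(), which refers to the column by name
# instead of parsing an expression string.
via_table_api = t_env.from_path('source').select(col('@timestamp'))

via_table_api.execute().print()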


I hope you can help with this. Thanks!