Posted to user-zh@flink.apache.org by Asahi Lee <97...@qq.com.INVALID> on 2021/07/29 13:03:41 UTC

Re: flink 1.13.1: parse error when executing Hive SQL with the Hive dialect

hi!
I verified it, and it is not an ELSE problem: the SQL below hits the same error, "Invalid table alias or column reference 'u'", even though there is no name 'u' anywhere in my SQL!
CREATE CATALOG `tempo_df_hive_default_catalog` WITH(
    'type' = 'hive',
    'default-database' = 'default'
);
USE CATALOG tempo_df_hive_default_catalog;
CREATE TABLE IF NOT EXISTS `default`.`tempo_blackhole_table` (
   f0 INT
);
insert into cosldatacenter.dw_riginfoparam
select
c.LARGE_EQUIP_ID,
c.EQUIP_CODE,
c.EQUIP_NAME,
c.ENQUEUE_DATE,
c.SHI_TOTAL_LEN,
c.SHI_TYPE_WIDTH,
c.SHI_TYPE_DEPTH,
case when b.param_cn = '月池尺寸' then a.param_value else null end as Moonpool,
case when b.param_cn = '最大风速' then a.param_value else null end as MaxWindvelocity,
case when b.param_cn = '最大波浪高度' then a.param_value else null end as MaxWaveheight,
case when b.param_cn = '气隙' then a.param_value else null end as Airgap,
case when b.param_cn = '设计最大作业水深' then a.param_value else null end as MaxOpeWaterdepth,
case when b.param_cn = '额定钻井深度' then a.param_value else null end as DrilldepthCap,
case when b.param_cn = '钻井可变载荷' then a.param_value else null end as DrillVL,
case when b.param_cn = '钻井水' then a.param_value else null end as DrillWater,
case when b.param_cn = '生活水' then a.param_value else null end as PotableWater
from cosldatacenter.ods_emp_maindata_iadc_paramvalue a
inner join cosldatacenter.ods_emp_maindata_iadc_paramdef b on a.param_id = b.param_id
inner join cosldatacenter.ods_emp_md_large_equip c on a.SUBJECT_ID=c.LARGE_EQUIP_ID;
INSERT INTO `default`.`tempo_blackhole_table` SELECT 1 ;





org.apache.hadoop.hive.ql.parse.SemanticException: Line 2:178 Invalid table alias or column reference 'u': (possible column names are: a.paramvalue_id, a.platform_id, a.equipment_id, a.param_id, a.param_value, a.remark, a.create_time, a.creator, a.update_time, a.update_person, a.record_flag, a.subject_id, a.output_unit, a.show_seq, b.param_id, b.iadc_id, b.param_code, b.param_en, b.param_cn, b.output_standard, b.output_unit, b.param_type, b.param_value, b.remark, b.create_time, b.creator, b.update_time, b.update_person, b.record_flag, c.large_equip_id, c.equip_name, c.equip_type, c.equip_function, c.equip_board, c.ship_yard, c.manufacturer_date, c.enqueue_date, c.dockrepair_date, c.scrap_date, c.enqueue_mode, c.work_for_org, c.work_in_org, c.old_age, c.create_time, c.creator, c.update_time, c.update_person, c.record_flag, c.data_timestamp, c.work_unit_id, c.work_status, c.work_location, c.work_area, c.equip_code, c.shi_main_power, c.shi_total_len, c.shi_type_width, c.shi_type_depth, c.shi_design_draft, c.shi_total_tonnage, c.shi_load_tonnage, c.remark, c.unit_classification1, c.unit_classification2)
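The thread does not show the driver program that submits these statements. A minimal sketch of one way to run such a script with Flink 1.13 is below; batch mode, and the placeholder where the long INSERT ... SELECT text would be pasted, are assumptions rather than the poster's actual code.

    import org.apache.flink.table.api.EnvironmentSettings;
    import org.apache.flink.table.api.SqlDialect;
    import org.apache.flink.table.api.TableEnvironment;

    public class HiveDialectRepro {
        public static void main(String[] args) {
            TableEnvironment tEnv = TableEnvironment.create(
                    EnvironmentSettings.newInstance().inBatchMode().build());

            // Catalog setup runs under the default dialect (nothing has been switched yet).
            tEnv.executeSql("CREATE CATALOG `tempo_df_hive_default_catalog` WITH ("
                    + "'type' = 'hive', 'default-database' = 'default')");
            tEnv.executeSql("USE CATALOG tempo_df_hive_default_catalog");

            // Switch the parser to the Hive dialect before the HiveQL statements.
            tEnv.getConfig().setSqlDialect(SqlDialect.HIVE);
            tEnv.executeSql("CREATE TABLE IF NOT EXISTS `default`.`tempo_blackhole_table` (f0 INT)");

            // Placeholder: paste the full "insert into cosldatacenter.dw_riginfoparam select ..."
            // statement from the mail above here.
            String insertSql = "...";
            tEnv.executeSql(insertSql);

            tEnv.executeSql("INSERT INTO `default`.`tempo_blackhole_table` SELECT 1");
        }
    }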




------------------ Original message ------------------
From: "user-zh" <xbjtdcq@gmail.com>;
Sent: Thursday, July 29, 2021, 3:32 PM
To: "user-zh" <user-zh@flink.apache.org>;

Subject: Re: flink 1.13.1: parse error when executing Hive SQL with the Hive dialect



That looks like a SQL syntax error. Where is the ELSE in this CASE expression?

Best regards,
Leonard


> On Jul 27, 2021, at 20:04, Asahi Lee <978466273@qq.com.INVALID> wrote:
> 
> CASE
> 		WHEN mipd.`param_cn` = '月池尺寸' THEN
> 		mipv.`param_value`
> 	END AS `Moonpool`

Re: flink 1.13.1: parse error when executing Hive SQL with the Hive dialect

Posted by Asahi Lee <97...@qq.com.INVALID>.
It is related to the Chinese characters: I parse the SQL with SqlParser.parseStmtlist(), and after SqlNode.toString() the Chinese text has been turned into Unicode escape codes, which is what causes the failure.
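For readers hitting the same thing, the round trip described above can be reproduced with Calcite's parser alone. This is only a sketch, assuming the plain Calcite SqlParser behaves like the one Flink configures; note the Calcite method is spelled parseStmtList().

    import org.apache.calcite.sql.SqlNode;
    import org.apache.calcite.sql.parser.SqlParser;

    public class ParseRoundTrip {
        public static void main(String[] args) throws Exception {
            String sql = "insert into cosldatacenter.dw_riginfoparam "
                    + "select case when b.param_cn = '月池尺寸' then a.param_value else null end as Moonpool "
                    + "from cosldatacenter.ods_emp_maindata_iadc_paramvalue a "
                    + "inner join cosldatacenter.ods_emp_maindata_iadc_paramdef b on a.param_id = b.param_id";

            SqlParser parser = SqlParser.create(sql);
            for (SqlNode stmt : parser.parseStmtList()) {
                // toString() re-serializes the AST with a default dialect; the Chinese literal
                // can come back in a Unicode-escaped form (e.g. a u&'...' literal), and feeding
                // that text to Hive's parser matches the "reference 'u'" error in this thread.
                System.out.println(stmt.toString());
            }
        }
    }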





Re: flink 1.13.1: parse error when executing Hive SQL with the Hive dialect

Posted by Rui Li <li...@gmail.com>.
I tried with 1.1.0 as well and could not reproduce it either. Could you replace the Chinese strings in your INSERT statement with English and see whether it parses then?
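One way to run that check without a cluster is to round-trip just the CASE expression through the parser and compare the re-serialized output for an English literal versus a Chinese one. The sketch below uses Calcite's SqlParser directly, which may or may not match the poster's exact code path; the table and column names are illustrative.

    import org.apache.calcite.sql.parser.SqlParser;

    public class LiteralRoundTrip {
        // Parse a single query and immediately re-serialize it.
        private static String roundTrip(String sql) throws Exception {
            return SqlParser.create(sql).parseQuery().toString();
        }

        public static void main(String[] args) throws Exception {
            // English literal: expected to come back essentially unchanged.
            System.out.println(roundTrip(
                    "select case when b.param_cn = 'moonpool' then b.param_value else null end from paramdef b"));
            // Chinese literal: if this one comes back Unicode-escaped, the problem is in the
            // re-serialization step rather than in the Hive dialect or the Hive version.
            System.out.println(roundTrip(
                    "select case when b.param_cn = '月池尺寸' then b.param_value else null end from paramdef b"));
        }
    }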

On Mon, Aug 2, 2021 at 3:05 PM Asahi Lee <97...@qq.com.invalid> wrote:

> Hive version 1.1.0


-- 
Best regards!
Rui Li

Re: flink 1.13.1: parse error when executing Hive SQL with the Hive dialect

Posted by Asahi Lee <97...@qq.com.INVALID>.
Hive version 1.1.0




------------------ Original message ------------------
From: "user-zh" <lirui.fudan@gmail.com>;
Sent: Monday, August 2, 2021, 12:23 PM
To: "user-zh" <user-zh@flink.apache.org>;

Subject: Re: flink 1.13.1: parse error when executing Hive SQL with the Hive dialect



I tried it locally and could not reproduce your problem. Which Hive version are you using?

On Fri, Jul 30, 2021 at 3:00 PM Asahi Lee <978466273@qq.com.invalid> wrote:

> CREATE TABLE `cosldatacenter.ods_emp_maindata_iadc_paramvalue`(
>   `paramvalue_id` string COMMENT '',
>   `platform_id` string COMMENT '',
>   `equipment_id` string COMMENT '',
>   `param_id` string COMMENT '',
>   `param_value` string COMMENT '',
>   `remark` string COMMENT '',
>   `create_time` string COMMENT '',
>   `creator` string COMMENT '',
>   `update_time` string COMMENT '',
>   `update_person` string COMMENT '',
>   `record_flag` double COMMENT '',
>   `subject_id` string COMMENT '',
>   `output_unit` string COMMENT '',
>   `show_seq` double COMMENT '')
> COMMENT ''
> ROW FORMAT SERDE
>   'org.apache.hadoop.hive.ql.io.orc.OrcSerde'
> WITH SERDEPROPERTIES (
>   'field.delim'=',',
>   'serialization.format'=',')
> STORED AS INPUTFORMAT
>   'org.apache.hadoop.hive.ql.io.orc.OrcInputFormat'
> OUTPUTFORMAT
>   'org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat'
> LOCATION
>   'hdfs://hadoop02:8020/user/hive/warehouse/cosldatacenter.db/ods_emp_maindata_iadc_paramvalue'
> TBLPROPERTIES (
>   'COLUMN_STATS_ACCURATE'='false',
>   'last_modified_by'='root',
>   'last_modified_time'='1621834335',
>   'numFiles'='0',
>   'numRows'='-1',
>   'rawDataSize'='-1',
>   'totalSize'='0',
>   'transient_lastDdlTime'='1621834335')
>
>
> CREATE TABLE `cosldatacenter.ods_emp_md_large_equip`(
>   `large_equip_id` string COMMENT '',
>   `equip_name` string COMMENT '',
>   `equip_type` string COMMENT '',
>   `equip_function` string COMMENT '',
>   `equip_board` string COMMENT '',
>   `ship_yard` string COMMENT '',
>   `manufacturer_date` string COMMENT '',
>   `enqueue_date` string COMMENT '',
>   `dockrepair_date` string COMMENT '',
>   `scrap_date` string COMMENT '',
>   `enqueue_mode` string COMMENT '',
>   `work_for_org` string COMMENT '',
>   `work_in_org` string COMMENT '',
>   `old_age` string COMMENT '',
>   `create_time` date COMMENT '',
>   `creator` string COMMENT '',
>   `update_time` date COMMENT '',
>   `update_person` string COMMENT '',
>   `record_flag` double COMMENT '',
>   `data_timestamp` string COMMENT '',
>   `work_unit_id` string COMMENT '',
>   `work_status` string COMMENT '',
>   `work_location` string COMMENT '',
>   `work_area` string COMMENT '',
>   `equip_code` string COMMENT '',
>   `shi_main_power` double COMMENT '',
>   `shi_total_len` double COMMENT '',
>   `shi_type_width` double COMMENT '',
>   `shi_type_depth` double COMMENT '',
>   `shi_design_draft` double COMMENT '',
>   `shi_total_tonnage` double COMMENT '',
>   `shi_load_tonnage` double COMMENT '',
>   `remark` string COMMENT '',
>   `unit_classification1` string COMMENT '',
>   `unit_classification2` string COMMENT '')
> COMMENT ''
> ROW FORMAT SERDE
>   'org.apache.hadoop.hive.ql.io.orc.OrcSerde'
> WITH SERDEPROPERTIES (
>   'field.delim'=',',
>   'serialization.format'=',')
> STORED AS INPUTFORMAT
>   'org.apache.hadoop.hive.ql.io.orc.OrcInputFormat'
> OUTPUTFORMAT
>   'org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat'
> LOCATION
>   'hdfs://hadoop02:8020/user/hive/warehouse/cosldatacenter.db/ods_emp_md_large_equip'
> TBLPROPERTIES (
>   'COLUMN_STATS_ACCURATE'='false',
>   'last_modified_by'='root',
>   'last_modified_time'='1621834338',
>   'numFiles'='0',
>   'numRows'='-1',
>   'rawDataSize'='-1',
>   'totalSize'='0',
>   'transient_lastDdlTime'='1621834338')
>
>
> CREATE TABLE `cosldatacenter.ods_emp_maindata_iadc_paramdef`(
>   `param_id` string COMMENT '',
>   `iadc_id` string COMMENT '',
>   `param_code` string COMMENT '',
>   `param_en` string COMMENT '',
>   `param_cn` string COMMENT '',
>   `output_standard` string COMMENT '',
>   `output_unit` string COMMENT '',
>   `param_type` string COMMENT '',
>   `param_value` string COMMENT '',
>   `remark` string COMMENT '',
>   `create_time` string COMMENT '',
>   `creator` string COMMENT '',
>   `update_time` string COMMENT '',
>   `update_person` string COMMENT '',
>   `record_flag` double COMMENT '')
> COMMENT ''
> ROW FORMAT SERDE
>   'org.apache.hadoop.hive.ql.io.orc.OrcSerde'
> WITH SERDEPROPERTIES (
>   'field.delim'=',',
>   'serialization.format'=',')
> STORED AS INPUTFORMAT
>   'org.apache.hadoop.hive.ql.io.orc.OrcInputFormat'
> OUTPUTFORMAT
>   'org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat'
> LOCATION
>   'hdfs://hadoop02:8020/user/hive/warehouse/cosldatacenter.db/ods_emp_maindata_iadc_paramdef'
> TBLPROPERTIES (
>   'COLUMN_STATS_ACCURATE'='false',
>   'last_modified_by'='root',
>   'last_modified_time'='1621834335',
>   'numFiles'='0',
>   'numRows'='-1',
>   'rawDataSize'='-1',
>   'totalSize'='0',
>   'transient_lastDdlTime'='1621834335')
>
>
> CREATE TABLE `cosldatacenter.dw_riginfoparam`(
>   `large_equip_id` string,
>   `equip_code` string,
>   `equip_name` string,
>   `enqueue_date` string,
>   `shi_total_len` double,
>   `shi_type_width` double,
>   `shi_type_depth` double,
>   `moonpool` string,
>   `maxwindvelocity` string,
>   `maxwaveheight` string,
>   `airgap` string,
>   `maxopewaterdepth` string,
>   `drilldepthcap` string,
>   `drillvl` string,
>   `drillwater` string,
>   `potablewater` string)
> ROW FORMAT SERDE
>   'org.apache.hadoop.hive.ql.io.orc.OrcSerde'
> WITH SERDEPROPERTIES (
>   'field.delim'=',',
>   'serialization.format'=',')
> STORED AS INPUTFORMAT
>   'org.apache.hadoop.hive.ql.io.orc.OrcInputFormat'
> OUTPUTFORMAT
>   'org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat'
> LOCATION
>   'hdfs://hadoop02:8020/user/hive/warehouse/cosldatacenter.db/dw_riginfoparam'
> TBLPROPERTIES (
>   'COLUMN_STATS_ACCURATE'='false',
>   'numFiles'='1',
>   'numRows'='-1',
>   'rawDataSize'='-1',
>   'totalSize'='1564',
>   'transient_lastDdlTime'='1627353556')
>
>
>
> ------------------ Original message ------------------
> From: "user-zh" <lirui.fudan@gmail.com>;
> Sent: Friday, July 30, 2021, 11:18 AM
> To: "user-zh" <user-zh@flink.apache.org>;
>
> Subject: Re: flink 1.13.1: parse error when executing Hive SQL with the Hive dialect
>
>
> Hi,
>
> Could you send the DDL of the tables used in your INSERT statement? Pasting the output of SHOW CREATE TABLE would be enough.
&gt; <
&gt; &amp;gt; xbjtdcq@gmail.com&amp;amp;gt;;
&gt; &amp;gt; 发送时间:&amp;amp;nbsp;2021年7月29日(星期四) 下午3:32
&gt; &amp;gt; 收件人:&amp;amp;nbsp;"user-zh"<user-zh@flink.apache.org&amp;amp;gt;;
&gt; &amp;gt;
&gt; &amp;gt; 主题:&amp;amp;nbsp;Re: flink 1.13.1 使用hive方言,执行hive sql解析报错
&gt; &amp;gt;
&gt; &amp;gt;
&gt; &amp;gt;
&gt; &amp;gt; 看起来是sql语法报错,这里面的ELSE呢?
&gt; &amp;gt;
&gt; &amp;gt; 祝好,
&gt; &amp;gt; Leonard
&gt; &amp;gt;
&gt; &amp;gt;
&gt; &amp;gt; &amp;amp;gt; 在 2021年7月27日,20:04,Asahi Lee <978466273@qq.com.INVALID&amp;amp;gt;
&gt; 写道:
&gt; &amp;gt; &amp;amp;gt;
&gt; &amp;gt; &amp;amp;gt; CASE
&gt; &amp;gt;
&gt; &amp;amp;gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;
&gt; WHEN mipd.`param_cn` = '月池尺寸' THEN
&gt; &amp;gt;
&gt; &amp;amp;gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;
&gt; mipv.`param_value`&amp;amp;amp;nbsp;
&gt; &amp;gt; &amp;amp;gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; END AS `Moonpool`
&gt;
&gt;
&gt;
&gt; --
&gt; Best regards!
&gt; Rui Li



-- 
Best regards!
Rui Li

Re: flink 1.13.1 Hive SQL parse error when using the Hive dialect

Posted by Rui Li <li...@gmail.com>.
I tried this locally and could not reproduce the problem. Which Hive version are you using?
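If you are not sure which version the connector is picking up, one way to make it explicit is to declare it when creating the catalog. A minimal sketch based on the catalog from your script; the 'hive-conf-dir' path and the version string are placeholders, not values from this thread:

CREATE CATALOG `tempo_df_hive_default_catalog` WITH (
    'type' = 'hive',
    'default-database' = 'default',
    -- placeholder: directory containing your hive-site.xml
    'hive-conf-dir' = '/path/to/hive/conf',
    -- placeholder: replace with the Hive version you actually run
    'hive-version' = '2.3.6'
);

Knowing the exact version makes it easier to try to reproduce the parse error against the same setup.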




-- 
Best regards!
Rui Li

Re: flink 1.13.1 Hive SQL parse error when using the Hive dialect

Posted by Asahi Lee <97...@qq.com.INVALID>.
CREATE TABLE `cosldatacenter.ods_emp_maindata_iadc_paramvalue`(
  `paramvalue_id` string COMMENT '',
  `platform_id` string COMMENT '',
  `equipment_id` string COMMENT '',
  `param_id` string COMMENT '',
  `param_value` string COMMENT '',
  `remark` string COMMENT '',
  `create_time` string COMMENT '',
  `creator` string COMMENT '',
  `update_time` string COMMENT '',
  `update_person` string COMMENT '',
  `record_flag` double COMMENT '',
  `subject_id` string COMMENT '',
  `output_unit` string COMMENT '',
  `show_seq` double COMMENT '')
COMMENT ''
ROW FORMAT SERDE
  'org.apache.hadoop.hive.ql.io.orc.OrcSerde'
WITH SERDEPROPERTIES (
  'field.delim'=',',
  'serialization.format'=',')
STORED AS INPUTFORMAT
  'org.apache.hadoop.hive.ql.io.orc.OrcInputFormat'
OUTPUTFORMAT
  'org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat'
LOCATION
  'hdfs://hadoop02:8020/user/hive/warehouse/cosldatacenter.db/ods_emp_maindata_iadc_paramvalue'
TBLPROPERTIES (
  'COLUMN_STATS_ACCURATE'='false',
  'last_modified_by'='root',
  'last_modified_time'='1621834335',
  'numFiles'='0',
  'numRows'='-1',
  'rawDataSize'='-1',
  'totalSize'='0',
  'transient_lastDdlTime'='1621834335')



CREATE TABLE `cosldatacenter.ods_emp_md_large_equip`(
  `large_equip_id` string COMMENT '',
  `equip_name` string COMMENT '',
  `equip_type` string COMMENT '',
  `equip_function` string COMMENT '',
  `equip_board` string COMMENT '',
  `ship_yard` string COMMENT '',
  `manufacturer_date` string COMMENT '',
  `enqueue_date` string COMMENT '',
  `dockrepair_date` string COMMENT '',
  `scrap_date` string COMMENT '',
  `enqueue_mode` string COMMENT '',
  `work_for_org` string COMMENT '',
  `work_in_org` string COMMENT '',
  `old_age` string COMMENT '',
  `create_time` date COMMENT '',
  `creator` string COMMENT '',
  `update_time` date COMMENT '',
  `update_person` string COMMENT '',
  `record_flag` double COMMENT '',
  `data_timestamp` string COMMENT '',
  `work_unit_id` string COMMENT '',
  `work_status` string COMMENT '',
  `work_location` string COMMENT '',
  `work_area` string COMMENT '',
  `equip_code` string COMMENT '',
  `shi_main_power` double COMMENT '',
  `shi_total_len` double COMMENT '',
  `shi_type_width` double COMMENT '',
  `shi_type_depth` double COMMENT '',
  `shi_design_draft` double COMMENT '',
  `shi_total_tonnage` double COMMENT '',
  `shi_load_tonnage` double COMMENT '',
  `remark` string COMMENT '',
  `unit_classification1` string COMMENT '',
  `unit_classification2` string COMMENT '')
COMMENT ''
ROW FORMAT SERDE
  'org.apache.hadoop.hive.ql.io.orc.OrcSerde'
WITH SERDEPROPERTIES (
  'field.delim'=',',
  'serialization.format'=',')
STORED AS INPUTFORMAT
  'org.apache.hadoop.hive.ql.io.orc.OrcInputFormat'
OUTPUTFORMAT
  'org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat'
LOCATION
  'hdfs://hadoop02:8020/user/hive/warehouse/cosldatacenter.db/ods_emp_md_large_equip'
TBLPROPERTIES (
  'COLUMN_STATS_ACCURATE'='false',
  'last_modified_by'='root',
  'last_modified_time'='1621834338',
  'numFiles'='0',
  'numRows'='-1',
  'rawDataSize'='-1',
  'totalSize'='0',
  'transient_lastDdlTime'='1621834338')



CREATE TABLE `cosldatacenter.ods_emp_maindata_iadc_paramdef`(
  `param_id` string COMMENT '',
  `iadc_id` string COMMENT '',
  `param_code` string COMMENT '',
  `param_en` string COMMENT '',
  `param_cn` string COMMENT '',
  `output_standard` string COMMENT '',
  `output_unit` string COMMENT '',
  `param_type` string COMMENT '',
  `param_value` string COMMENT '',
  `remark` string COMMENT '',
  `create_time` string COMMENT '',
  `creator` string COMMENT '',
  `update_time` string COMMENT '',
  `update_person` string COMMENT '',
  `record_flag` double COMMENT '')
COMMENT ''
ROW FORMAT SERDE
  'org.apache.hadoop.hive.ql.io.orc.OrcSerde'
WITH SERDEPROPERTIES (
  'field.delim'=',',
  'serialization.format'=',')
STORED AS INPUTFORMAT
  'org.apache.hadoop.hive.ql.io.orc.OrcInputFormat'
OUTPUTFORMAT
  'org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat'
LOCATION
  'hdfs://hadoop02:8020/user/hive/warehouse/cosldatacenter.db/ods_emp_maindata_iadc_paramdef'
TBLPROPERTIES (
  'COLUMN_STATS_ACCURATE'='false',
  'last_modified_by'='root',
  'last_modified_time'='1621834335',
  'numFiles'='0',
  'numRows'='-1',
  'rawDataSize'='-1',
  'totalSize'='0',
  'transient_lastDdlTime'='1621834335')



CREATE TABLE `cosldatacenter.dw_riginfoparam`(
  `large_equip_id` string,
  `equip_code` string,
  `equip_name` string,
  `enqueue_date` string,
  `shi_total_len` double,
  `shi_type_width` double,
  `shi_type_depth` double,
  `moonpool` string,
  `maxwindvelocity` string,
  `maxwaveheight` string,
  `airgap` string,
  `maxopewaterdepth` string,
  `drilldepthcap` string,
  `drillvl` string,
  `drillwater` string,
  `potablewater` string)
ROW FORMAT SERDE
  'org.apache.hadoop.hive.ql.io.orc.OrcSerde'
WITH SERDEPROPERTIES (
  'field.delim'=',',
  'serialization.format'=',')
STORED AS INPUTFORMAT
  'org.apache.hadoop.hive.ql.io.orc.OrcInputFormat'
OUTPUTFORMAT
  'org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat'
LOCATION
  'hdfs://hadoop02:8020/user/hive/warehouse/cosldatacenter.db/dw_riginfoparam'
TBLPROPERTIES (
  'COLUMN_STATS_ACCURATE'='false',
  'numFiles'='1',
  'numRows'='-1',
  'rawDataSize'='-1',
  'totalSize'='1564',
  'transient_lastDdlTime'='1627353556')








Re: flink 1.13.1 Hive SQL parse error when using the Hive dialect

Posted by Rui Li <li...@gmail.com>.
Hi,

Could you send the DDL of the tables used in your INSERT statement? Pasting the output of SHOW CREATE TABLE for each of them would be enough.
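For example, a sketch run from the Hive CLI or Beeline (which support Hive's SHOW CREATE TABLE); the table names are simply the ones referenced in your INSERT:

-- the output of each of these statements is what would help
SHOW CREATE TABLE cosldatacenter.ods_emp_maindata_iadc_paramvalue;
SHOW CREATE TABLE cosldatacenter.ods_emp_maindata_iadc_paramdef;
SHOW CREATE TABLE cosldatacenter.ods_emp_md_large_equip;
SHOW CREATE TABLE cosldatacenter.dw_riginfoparam;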




-- 
Best regards!
Rui Li