Posted to user-zh@flink.apache.org by nashcen <24...@qq.com> on 2020/09/24 08:16:21 UTC

[flink-1.11] Reading Kafka, writing Hive: the extracted partition-time value is inaccurate

The Kafka table is defined as follows:
CREATE TABLE `dc_ods`.`ods_dcpoints_prod_kafka_source` (
  `assetSpecId` STRING,
  `dcnum` STRING,
  `monitorType` STRING,
  `tagNo` STRING,
  `value` STRING,
  `updateTime` BIGINT,
  `eventTime` AS TO_TIMESTAMP(FROM_UNIXTIME(`updateTime` / 1000, 'yyyy-MM-dd HH:mm:ss')),
  WATERMARK FOR `eventTime` AS `eventTime` - INTERVAL '5' SECOND
) WITH (
  'connector' = 'kafka',
  'topic' = 'ods_dcpoints_prod',
  'properties.bootstrap.servers' = 'prod-bigdata-03:9092',
  'scan.startup.mode' = 'earliest-offset',
  'format' = 'json',
  'json.fail-on-missing-field' = 'false',
  'json.ignore-parse-errors' = 'true'
)

Looking at the data already inserted into Hive, there is a row with update_time=1600218000292 whose corresponding event_time value is
2020-09-16 01:00:00.0
<http://apache-flink.147419.n8.nabble.com/file/t817/1600935127%281%29.png> 

But querying FROM_UNIXTIME(cast(1600218000292/1000 as int),'yyyy-MM-dd HH:mm:ss') in Hive gives
2020-09-16 09:00:00
<http://apache-flink.147419.n8.nabble.com/file/t817/1600935330%281%29.png> 

The two values don't match. What is the cause?
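
For reference, the discrepancy boils down to the comparison below (a minimal sketch using the epoch value above): 1600218000292 ms truncates to 1600218000 s, which is 2020-09-16 01:00:00 in UTC and 2020-09-16 09:00:00 in UTC+8, i.e. exactly an 8-hour gap.

-- Flink SQL: what the computed column evaluates for this row
SELECT TO_TIMESTAMP(FROM_UNIXTIME(1600218000292 / 1000, 'yyyy-MM-dd HH:mm:ss'));

-- Hive: the expression used in the query above
SELECT FROM_UNIXTIME(CAST(1600218000292 / 1000 AS INT), 'yyyy-MM-dd HH:mm:ss');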






--
Sent from: http://apache-flink.147419.n8.nabble.com/

Re: [flink-1.11] Reading Kafka, writing Hive: the extracted partition-time value is inaccurate

Posted by Rui Li <li...@gmail.com>.
Sorry, I tried it locally and Flink's FROM_UNIXTIME also uses the system time zone. Could you paste the Hive table DDL and the INSERT statement? I'll give it a try.
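
Something along these lines would help (a hypothetical sketch modeled on the Flink 1.11 streaming Hive sink docs; the actual table name, partition layout, and property values in your job will differ):

SET table.sql-dialect=hive;
CREATE TABLE dc_ods.ods_dcpoints_prod_hive_sink (
  assetSpecId STRING,
  dcnum STRING,
  monitorType STRING,
  tagNo STRING,
  `value` STRING,
  updateTime BIGINT
) PARTITIONED BY (dt STRING, hr STRING) STORED AS parquet TBLPROPERTIES (
  -- how the partition value is mapped back to a timestamp for partition-time commits
  'partition.time-extractor.timestamp-pattern'='$dt $hr:00:00',
  'sink.partition-commit.trigger'='partition-time',
  'sink.partition-commit.delay'='1 h',
  'sink.partition-commit.policy.kind'='metastore,success-file'
);

SET table.sql-dialect=default;
INSERT INTO dc_ods.ods_dcpoints_prod_hive_sink
SELECT assetSpecId, dcnum, monitorType, tagNo, `value`, updateTime,
       DATE_FORMAT(eventTime, 'yyyy-MM-dd'), DATE_FORMAT(eventTime, 'HH')
FROM dc_ods.ods_dcpoints_prod_kafka_source;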

On Fri, Sep 25, 2020 at 1:58 PM Rui Li <li...@gmail.com> wrote:

> This is probably caused by a time-zone difference: Flink's FROM_UNIXTIME uses the UTC time zone, while Hive's FROM_UNIXTIME uses the system time zone.
>
> On Thu, Sep 24, 2020 at 4:16 PM nashcen <24...@qq.com> wrote:
>
>> The Kafka table is defined as follows:
>> CREATE TABLE `dc_ods`.`ods_dcpoints_prod_kafka_source` (
>>   `assetSpecId` STRING,
>>   `dcnum` STRING,
>>   `monitorType` STRING,
>>   `tagNo` STRING,
>>   `value` STRING,
>>   `updateTime` BIGINT,
>>   `eventTime` AS TO_TIMESTAMP(FROM_UNIXTIME(`updateTime` / 1000, 'yyyy-MM-dd HH:mm:ss')),
>>   WATERMARK FOR `eventTime` AS `eventTime` - INTERVAL '5' SECOND
>> ) WITH (
>>   'connector' = 'kafka',
>>   'topic' = 'ods_dcpoints_prod',
>>   'properties.bootstrap.servers' = 'prod-bigdata-03:9092',
>>   'scan.startup.mode' = 'earliest-offset',
>>   'format' = 'json',
>>   'json.fail-on-missing-field' = 'false',
>>   'json.ignore-parse-errors' = 'true'
>> )
>>
>> Looking at the data already inserted into Hive, there is a row with
>> update_time=1600218000292 whose corresponding event_time value is
>> 2020-09-16 01:00:00.0
>> <http://apache-flink.147419.n8.nabble.com/file/t817/1600935127%281%29.png>
>>
>>
>> But querying FROM_UNIXTIME(cast(1600218000292/1000 as int),'yyyy-MM-dd
>> HH:mm:ss') in Hive gives
>> 2020-09-16 09:00:00
>> <http://apache-flink.147419.n8.nabble.com/file/t817/1600935330%281%29.png>
>>
>>
>> The two values don't match. What is the cause?
>>
>>
>>
>>
>>
>>
>> --
>> Sent from: http://apache-flink.147419.n8.nabble.com/
>>
>
>
> --
> Best regards!
> Rui Li
>


-- 
Best regards!
Rui Li

Re: [flink-1.11] Reading Kafka, writing Hive: the extracted partition-time value is inaccurate

Posted by Rui Li <li...@gmail.com>.
This is probably caused by a time-zone difference: Flink's FROM_UNIXTIME uses the UTC time zone, while Hive's FROM_UNIXTIME uses the system time zone.
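
If that is the cause, one possible workaround (a sketch only, not verified against this job) is to shift the computed column explicitly so that eventTime, and therefore the extracted partition time, matches what Hive's FROM_UNIXTIME shows; the 8-hour shift below assumes the intended local time is UTC+8 (Asia/Shanghai):

  -- assumed offset of 8 hours for UTC+8; adjust to your cluster's local time zone
  `eventTime` AS TO_TIMESTAMP(FROM_UNIXTIME(`updateTime` / 1000, 'yyyy-MM-dd HH:mm:ss')) + INTERVAL '8' HOUR,

Since the gap is exactly 8 hours, it would also be worth checking whether the Flink JVMs run with a UTC default time zone (e.g. via -Duser.timezone) while the Hive query runs under UTC+8.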

On Thu, Sep 24, 2020 at 4:16 PM nashcen <24...@qq.com> wrote:

> The Kafka table is defined as follows:
> CREATE TABLE `dc_ods`.`ods_dcpoints_prod_kafka_source` (
>   `assetSpecId` STRING,
>   `dcnum` STRING,
>   `monitorType` STRING,
>   `tagNo` STRING,
>   `value` STRING,
>   `updateTime` BIGINT,
>   `eventTime` AS TO_TIMESTAMP(FROM_UNIXTIME(`updateTime` / 1000, 'yyyy-MM-dd HH:mm:ss')),
>   WATERMARK FOR `eventTime` AS `eventTime` - INTERVAL '5' SECOND
> ) WITH (
>   'connector' = 'kafka',
>   'topic' = 'ods_dcpoints_prod',
>   'properties.bootstrap.servers' = 'prod-bigdata-03:9092',
>   'scan.startup.mode' = 'earliest-offset',
>   'format' = 'json',
>   'json.fail-on-missing-field' = 'false',
>   'json.ignore-parse-errors' = 'true'
> )
>
> Looking at the data already inserted into Hive, there is a row with
> update_time=1600218000292 whose corresponding event_time value is
> 2020-09-16 01:00:00.0
> <http://apache-flink.147419.n8.nabble.com/file/t817/1600935127%281%29.png>
>
>
> But querying FROM_UNIXTIME(cast(1600218000292/1000 as int),'yyyy-MM-dd
> HH:mm:ss') in Hive gives
> 2020-09-16 09:00:00
> <http://apache-flink.147419.n8.nabble.com/file/t817/1600935330%281%29.png>
>
>
> The two values don't match. What is the cause?
>
>
>
>
>
>
> --
> Sent from: http://apache-flink.147419.n8.nabble.com/
>


-- 
Best regards!
Rui Li