Posted to issues-all@impala.apache.org by "Csaba Ringhofer (JIRA)" <ji...@apache.org> on 2018/08/23 20:57:00 UTC

[jira] [Updated] (IMPALA-7472) Consider removing TimestampValue::FromSubsecondUnixTime

     [ https://issues.apache.org/jira/browse/IMPALA-7472?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Csaba Ringhofer updated IMPALA-7472:
------------------------------------
    Description: 
TimestampValue::FromSubsecondUnixTime converts a double, interpreted as a unix time in seconds, to a TimestampValue (a rough sketch of such a conversion follows the list). Impala uses it in two cases:
1. double <-> timestamp casting
2. the aggregate function AVG on timestamps - it converts each timestamp to a double, computes the average on the doubles, and converts the average back to a timestamp.
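
For illustration, here is a minimal stand-alone sketch of what a FromSubsecondUnixTime-style conversion does, assuming the target is a plain seconds + nanoseconds pair; the SecondsNanos struct and the function body below are hypothetical, not Impala's actual implementation:

#include <cmath>
#include <cstdint>
#include <cstdio>

// Hypothetical target type: whole seconds since the unix epoch plus nanoseconds.
struct SecondsNanos {
  int64_t seconds;
  int32_t nanos;
};

// Split a double unix time (in seconds) into whole seconds and nanoseconds.
// The fractional part is already a rounded double, so the recovered
// nanoseconds inherit whatever error the double representation introduced.
SecondsNanos FromSubsecondUnixTime(double unix_time) {
  double whole;
  double frac = std::modf(unix_time, &whole);
  return SecondsNanos{static_cast<int64_t>(whole),
                      static_cast<int32_t>(std::llround(frac * 1e9))};
}

int main() {
  SecondsNanos t = FromSubsecondUnixTime(1535058420.123456789);
  // Prints something close to, but not exactly, 123456789 ns.
  std::printf("%lld s + %d ns\n", static_cast<long long>(t.seconds), t.nanos);
  return 0;
}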

My concern is that a double's precision varies with its distance from 0 (1970-01-01) - subsecond timestamps around 2018 are rounded by up to roughly a hundred nanoseconds (adjacent doubles near 1.5e9 seconds are about 238 ns apart), and these errors can add up when a large number of timestamps are averaged.
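
A self-contained check of that spacing (standard C++ only, not Impala code):

#include <cmath>
#include <cstdio>
#include <limits>

int main() {
  // A unix time around 2018-08-23, chosen with nanosecond-level intent.
  double t = 1535058420.123456789;
  // Distance to the next representable double: 2^-22 s, roughly 238 ns.
  double ulp = std::nextafter(t, std::numeric_limits<double>::infinity()) - t;
  std::printf("spacing near 2018 = %.3g s (about %.0f ns)\n", ulp, ulp * 1e9);
  return 0;
}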

I think that decimal (with nanosecond precision) could be used in both cases:
1. Decimal could be used as an intermediate stage - double <-> decimal and decimal <-> timestamp conversions have well-defined rounding rules since decimal V2.
2. Decimal could be used as the aggregation state - this would probably make AVG slower, but it would be much more precise (a rough sketch of such a state follows the list).
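
As a rough illustration of option 2, the sketch below accumulates exact nanoseconds in a 128-bit integer instead of a double. The AvgState/Update/Finalize names are made up for this example, __int128 is a GCC/Clang extension, and Impala's decimal type (with its defined rounding) would take its place in a real implementation:

#include <cstdint>
#include <cstdio>

// Illustrative aggregation state: an exact sum of unix-time nanoseconds.
struct AvgState {
  __int128 sum_nanos = 0;
  int64_t count = 0;
};

void Update(AvgState* state, int64_t seconds, int32_t nanos) {
  state->sum_nanos += static_cast<__int128>(seconds) * 1000000000 + nanos;
  ++state->count;
}

// Convert the exact average back to seconds + nanoseconds; rounding happens
// exactly once here (truncating division, for brevity).
void Finalize(const AvgState& state, int64_t* seconds, int32_t* nanos) {
  __int128 avg = state.sum_nanos / state.count;
  *seconds = static_cast<int64_t>(avg / 1000000000);
  *nanos = static_cast<int32_t>(avg % 1000000000);
}

int main() {
  AvgState state;
  Update(&state, 1535058420, 100);
  Update(&state, 1535058420, 300);
  int64_t s;
  int32_t ns;
  Finalize(&state, &s, &ns);
  std::printf("avg = %lld s + %d ns\n", static_cast<long long>(s), ns);  // exactly 200 ns
  return 0;
}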

  was:
TimestampValue::FromSubsecondUnixTime converts a double, interpreted as a unix time in seconds, to a TimestampValue. Impala uses it in two cases:
1. double <-> timestamp casting
2. the aggregate function AVG on timestamps - it converts each timestamp to a double, computes the average on the doubles, and converts the average back to a timestamp.

My concern is that a double's precision varies with its distance from 0 (1970-01-01) - subsecond timestamps around 2018 are rounded by up to roughly a hundred nanoseconds, and these errors can add up when a large number of timestamps are averaged.

I think that decimal (with nanosecond precision) could be used in both cases:
1. Decimal could be used as an intermediate stage - double <-> decimal and decimal <-> timestamp conversions have well-defined rounding rules since decimal V2.
2. Decimal could be used as the aggregation state - this would probably make AVG slower, but it would be much more precise.


> Consider removing TimestampValue::FromSubsecondUnixTime
> -------------------------------------------------------
>
>                 Key: IMPALA-7472
>                 URL: https://issues.apache.org/jira/browse/IMPALA-7472
>             Project: IMPALA
>          Issue Type: Bug
>          Components: Backend
>            Reporter: Csaba Ringhofer
>            Priority: Minor
>              Labels: timestamp
>
> TimestampValue::FromSubsecondUnixTime converts a double, interpreted as a unix time in seconds, to a TimestampValue. Impala uses it in two cases:
> 1. double <-> timestamp casting
> 2. the aggregate function AVG on timestamps - it converts each timestamp to a double, computes the average on the doubles, and converts the average back to a timestamp.
> My concern is that a double's precision varies with its distance from 0 (1970-01-01) - subsecond timestamps around 2018 are rounded by up to roughly a hundred nanoseconds, and these errors can add up when a large number of timestamps are averaged.
> I think that decimal (with nanosecond precision) could be used in both cases:
> 1. Decimal could be used as an intermediate stage - double <-> decimal and decimal <-> timestamp conversions have well-defined rounding rules since decimal V2.
> 2. Decimal could be used as the aggregation state - this would probably make AVG slower, but it would be much more precise.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-all-unsubscribe@impala.apache.org
For additional commands, e-mail: issues-all-help@impala.apache.org