Posted to dev@hive.apache.org by "Christian Zorneck (JIRA)" <ji...@apache.org> on 2016/11/29 12:56:58 UTC

[jira] [Created] (HIVE-15304) Implicit down cast should throw error

Christian Zorneck created HIVE-15304:
----------------------------------------

             Summary: Implicit down cast should throw error
                 Key: HIVE-15304
                 URL: https://issues.apache.org/jira/browse/HIVE-15304
             Project: Hive
          Issue Type: Bug
    Affects Versions: 2.0.1, 2.1.0, 2.0.0, 1.2.1, 1.1.1, 1.0.1, 1.1.0, 1.2.0, 1.0.0, 0.13.1, 0.14.0, 0.13.0
            Reporter: Christian Zorneck


Implicit down casts from a bigger numeric type to a smaller type should throw an error when a numeric overflow happens.

Example:
CREATE TABLE downcast_test (int_value INT);
-- implicit cast to BIGINT and then implicit cast to INT
INSERT INTO TABLE downcast_test SELECT 2147483647 + 1;
-- implicit cast from BIGINT to INT
INSERT INTO TABLE downcast_test SELECT 2147483648L;
-- implicit cast from DOUBLE to INT
INSERT INTO TABLE downcast_test SELECT 2.147483648E9;
SELECT * FROM downcast_test;
=>
-2147483648  -- expected int overflow value
-2147483648  -- expected int overflow value
2147483647  -- what happened here? cast from DOUBLE to INT results in INT max value

This also shows another bug with the cast from DOUBLE to INT, but that is not the issue here.
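As a side note, the differing results likely just mirror Java's narrowing-conversion rules (Hive is implemented in Java): long-to-int narrowing truncates to the low 32 bits and wraps around, while double-to-int narrowing saturates at Integer.MAX_VALUE (JLS 5.1.3). A minimal sketch reproducing the three values above:

```java
public class DowncastDemo {
    public static void main(String[] args) {
        long overflowSum = 2147483647L + 1;   // INT max + 1, as a long
        long bigLong = 2147483648L;
        double bigDouble = 2.147483648E9;

        // long -> int narrowing keeps the low 32 bits: wraps to INT min
        System.out.println((int) overflowSum);  // -2147483648
        System.out.println((int) bigLong);      // -2147483648

        // double -> int narrowing saturates: clamps to INT max
        System.out.println((int) bigDouble);    // 2147483647
    }
}
```

So the "INT max value" result is consistent with Java semantics, even though from a SQL point of view both conversions should raise an error instead of silently producing a wrong value.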
Reading the following documentation
https://cwiki.apache.org/confluence/display/Hive/LanguageManual+Types#LanguageManualTypes-AllowedImplicitConversions
implicit down casts from BIGINT and DOUBLE to INT are not allowed, so an exception is expected. But no exception is thrown; Hive silently casts the values down to INT.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)