Posted to issues@spark.apache.org by "Max Gekk (Jira)" <ji...@apache.org> on 2022/04/19 19:26:00 UTC

[jira] [Resolved] (SPARK-38929) Improve error messages for cast failures in ANSI

     [ https://issues.apache.org/jira/browse/SPARK-38929?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Max Gekk resolved SPARK-38929.
------------------------------
    Fix Version/s: 3.4.0
       Resolution: Fixed

Issue resolved by pull request 36241
[https://github.com/apache/spark/pull/36241]

> Improve error messages for cast failures in ANSI
> ------------------------------------------------
>
>                 Key: SPARK-38929
>                 URL: https://issues.apache.org/jira/browse/SPARK-38929
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.3.0
>            Reporter: Xinyi Yu
>            Assignee: Xinyi Yu
>            Priority: Major
>             Fix For: 3.4.0
>
>
> Improve several error messages for cast failures in ANSI mode.
> h2. Cast to numeric types
> {code:java}
> java.lang.NumberFormatException: invalid input syntax for type numeric: 1.0. To return NULL instead, use 'try_cast'. ...{code}
> This is confusing, since 1.0 looks numeric to an average user. The message should name the specific target type (integer in this case) and put 1.0 in single quotes. Mentioning that this is a cast from a string to an integer would be even better.
> *Proposed change*
> {code:java}
> Invalid `int` literal: '1.0'. To return NULL instead, use 'try_cast'.{code}
> h2. Cast to date types
> {code:java}
> java.time.DateTimeException: Cannot cast 2021-09- 2 to DateType.{code}
> This can be aligned with the change above.
> *Proposed change*
> {code:java}
> Invalid `date` literal: '2021-09- 2'. To return NULL instead, use 'try_cast'.{code}



--
This message was sent by Atlassian Jira
(v8.20.7#820007)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org