Posted to issues@spark.apache.org by "Dongjoon Hyun (Jira)" <ji...@apache.org> on 2022/03/12 09:02:00 UTC

[jira] [Updated] (SPARK-38534) Flaky Test: ansi/datetime-parsing-invalid.sql

     [ https://issues.apache.org/jira/browse/SPARK-38534?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun updated SPARK-38534:
----------------------------------
    Description: 
**Java 8**
{code}
$ bin/spark-shell --conf spark.sql.ansi.enabled=true
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
22/03/12 00:59:31 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Spark context Web UI available at http://172.16.0.31:4040
Spark context available as 'sc' (master = local[*], app id = local-1647075572229).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.3.0-SNAPSHOT
      /_/

Using Scala version 2.12.15 (OpenJDK 64-Bit Server VM, Java 1.8.0_322)
Type in expressions to have them evaluated.
Type :help for more information.

scala> sql("select to_timestamp('366', 'DD')").show
java.time.format.DateTimeParseException: Text '366' could not be parsed, unparsed text found at index 2. If necessary set spark.sql.ansi.enabled to false to bypass this error.
{code}

**Java 11+**
{code}
$ bin/spark-shell --conf spark.sql.ansi.enabled=true
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
22/03/12 01:00:07 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Spark context Web UI available at http://172.16.0.31:4040
Spark context available as 'sc' (master = local[*], app id = local-1647075607932).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.3.0-SNAPSHOT
      /_/

Using Scala version 2.12.15 (OpenJDK 64-Bit Server VM, Java 11.0.12)
Type in expressions to have them evaluated.
Type :help for more information.

scala> sql("select to_timestamp('366', 'DD')").show
java.time.DateTimeException: Invalid date 'DayOfYear 366' as '1970' is not a leap year. If necessary set spark.sql.ansi.enabled to false to bypass this error.
{code}
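
The messages above point at a JDK behavior difference rather than anything Spark-specific: on Java 8 the 'DD' pattern consumes only two digits (hence the unparsed text at index 2), while on Java 11+ all three digits are consumed and the failure only surfaces when day-of-year 366 is resolved against a non-leap year. Below is a minimal, JDK-only sketch meant to mirror those two messages; the hard-coded year 1970 and the manual LocalDate.ofYearDay call are assumptions for illustration, not Spark's actual parsing code path.
{code}
import java.time.LocalDate
import java.time.format.DateTimeFormatter
import java.time.temporal.ChronoField

// Sketch only: reproduce the JDK-dependent handling of a 3-digit day-of-year.
val fmt = DateTimeFormatter.ofPattern("DD")

try {
  // Java 8: 'DD' consumes exactly two digits, leaving the trailing '6'
  // unparsed -> java.time.format.DateTimeParseException at index 2.
  val parsed = fmt.parse("366")
  // Java 11+: all three digits are consumed, so parsing succeeds and the
  // failure moves to resolution: 1970 is not a leap year, so day-of-year 366
  // does not exist -> java.time.DateTimeException from LocalDate.ofYearDay.
  LocalDate.ofYearDay(1970, parsed.get(ChronoField.DAY_OF_YEAR))
} catch {
  case e: Exception => println(s"${e.getClass.getName}: ${e.getMessage}")
}
{code}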

  was:
https://github.com/apache/spark/runs/5514962942

{code}
org.scalatest.exceptions.TestFailedException: ansi/datetime-parsing-invalid.sql
Expected "java.time.[format.DateTimeParseException
Text '366' could not be parsed, unparsed text found at index 2]. If necessary set s...", but got "java.time.[DateTimeException
Invalid date 'DayOfYear 366' as '1970' is not a leap year]. If necessary set s..." Result did not match for query #8
select to_timestamp('366', 'DD')
{code}
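
Since the golden file for ansi/datetime-parsing-invalid.sql records the exact exception class and message, and both differ between Java 8 and Java 11+, query #8 can only ever match on one of the two runtimes. Purely as an illustration (this is not Spark's SQLQueryTestSuite code; the helper name and placeholder tokens are made up), a JDK-agnostic comparison could normalize both variants to a common placeholder before diffing:
{code}
// Hypothetical helper, not part of Spark: map both JDK-specific failure
// variants for query #8 onto stable placeholders before comparing the
// actual output against the golden file.
def normalizeDayOfYearError(output: String): String = {
  output
    .replaceAll("""java\.time\.format\.DateTimeParseException""", "<DATETIME_ERROR>")
    .replaceAll("""java\.time\.DateTimeException""", "<DATETIME_ERROR>")
    .replaceAll("""Text '366' could not be parsed, unparsed text found at index 2""",
      "<DAY_OF_YEAR_366_ERROR>")
    .replaceAll("""Invalid date 'DayOfYear 366' as '1970' is not a leap year""",
      "<DAY_OF_YEAR_366_ERROR>")
}
{code}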


> Flaky Test: ansi/datetime-parsing-invalid.sql
> ---------------------------------------------
>
>                 Key: SPARK-38534
>                 URL: https://issues.apache.org/jira/browse/SPARK-38534
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL, Tests
>    Affects Versions: 3.3.0
>            Reporter: Dongjoon Hyun
>            Priority: Major
>



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org