Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2020/10/31 08:31:00 UTC
[jira] [Assigned] (SPARK-33306) TimezoneID is needed when casting from Date to String
[ https://issues.apache.org/jira/browse/SPARK-33306?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-33306:
------------------------------------
Assignee: Apache Spark
> TimezoneID is needed when casting from Date to String
> -----------------------------------------------------
>
> Key: SPARK-33306
> URL: https://issues.apache.org/jira/browse/SPARK-33306
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 3.0.0
> Reporter: EdisonWang
> Assignee: Apache Spark
> Priority: Major
>
> A simple way to reproduce this is:
> ```
> spark-shell --conf spark.sql.legacy.typeCoercion.datetimeToString.enabled=true
> scala> sql("""
> select a.d1 from
> (select to_date(concat('2000-01-0', id)) as d1 from range(1, 2)) a
> join
> (select concat('2000-01-0', id) as d2 from range(1, 2)) b
> on a.d1 = b.d2
> """).show
> ```
>
> It will throw:
> ```
> java.util.NoSuchElementException: None.get
> at scala.None$.get(Option.scala:529)
> at scala.None$.get(Option.scala:527)
> at org.apache.spark.sql.catalyst.expressions.TimeZoneAwareExpression.zoneId(datetimeExpressions.scala:56)
> at org.apache.spark.sql.catalyst.expressions.TimeZoneAwareExpression.zoneId$(datetimeExpressions.scala:56)
> at org.apache.spark.sql.catalyst.expressions.CastBase.zoneId$lzycompute(Cast.scala:253)
> at org.apache.spark.sql.catalyst.expressions.CastBase.zoneId(Cast.scala:253)
> at org.apache.spark.sql.catalyst.expressions.CastBase.dateFormatter$lzycompute(Cast.scala:287)
> at org.apache.spark.sql.catalyst.expressions.CastBase.dateFormatter(Cast.scala:287)
> ```
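For context, the failure mode in the trace can be sketched in isolation. The snippet below is a simplified stand-in, not Spark's actual TimeZoneAwareExpression or CastBase: it only illustrates how an expression whose optional timezone was never resolved by the analyzer hits `None.get` lazily at evaluation time, producing the same `java.util.NoSuchElementException`.

```scala
import java.time.ZoneId

// Simplified mirror (assumption: not the real Spark trait) of a
// timezone-aware expression whose timeZoneId the analyzer may or
// may not have filled in.
trait TimeZoneAware {
  def timeZoneId: Option[String]
  // zoneId is resolved lazily; if timeZoneId is still None when it
  // is first needed, Option.get throws NoSuchElementException.
  lazy val zoneId: ZoneId = ZoneId.of(timeZoneId.get)
}

// Hypothetical stand-in for an implicit Date-to-String cast that was
// inserted by type coercion without a timezone.
case class DateToStringCast(timeZoneId: Option[String] = None)
  extends TimeZoneAware

val unresolved = DateToStringCast(timeZoneId = None)
val threw =
  try { unresolved.zoneId; false }
  catch { case _: NoSuchElementException => true }
println(s"failed without timezone: $threw")

// With the timezone resolved, the same expression works fine.
val resolved = DateToStringCast(timeZoneId = Some("UTC"))
println(s"resolved zone: ${resolved.zoneId}")
```

The sketch suggests why the fix is in analysis rather than evaluation: the coercion rule that inserts the Date-to-String cast needs to attach a timezone ID, the same way explicitly written casts get one resolved.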
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org