Posted to issues@spark.apache.org by "Sathiya Kumar (Jira)" <ji...@apache.org> on 2021/11/21 23:17:00 UTC
[jira] [Created] (SPARK-37433) TimeZoneAwareExpression throws NoSuchElementException: None.get on expr.eval()
Sathiya Kumar created SPARK-37433:
-------------------------------------
Summary: TimeZoneAwareExpression throws NoSuchElementException: None.get on expr.eval()
Key: SPARK-37433
URL: https://issues.apache.org/jira/browse/SPARK-37433
Project: Spark
Issue Type: Bug
Components: SQL
Affects Versions: 3.2.0
Reporter: Sathiya Kumar
TimeZoneAwareExpressions such as hour and date_format throw NoSuchElementException: None.get when eval() is called directly on the underlying Catalyst expression:
*hour(current_timestamp).expr.eval()*
*date_format(current_timestamp, "dd").expr.eval()*
{code:java}
java.util.NoSuchElementException: None.get
at scala.None$.get(Option.scala:529)
at scala.None$.get(Option.scala:527)
at org.apache.spark.sql.catalyst.expressions.TimeZoneAwareExpression.zoneId(datetimeExpressions.scala:53)
at org.apache.spark.sql.catalyst.expressions.TimeZoneAwareExpression.zoneId$(datetimeExpressions.scala:53)
at org.apache.spark.sql.catalyst.expressions.DateFormatClass.zoneId$lzycompute(datetimeExpressions.scala:772)
at org.apache.spark.sql.catalyst.expressions.DateFormatClass.zoneId(datetimeExpressions.scala:772)
at org.apache.spark.sql.catalyst.expressions.TimestampFormatterHelper.getFormatter(datetimeExpressions.scala:70)
at org.apache.spark.sql.catalyst.expressions.TimestampFormatterHelper.getFormatter$(datetimeExpressions.scala:67)
at org.apache.spark.sql.catalyst.expressions.DateFormatClass.getFormatter(datetimeExpressions.scala:772)
at org.apache.spark.sql.catalyst.expressions.TimestampFormatterHelper.$anonfun$formatterOption$1(datetimeExpressions.scala:64)
{code}
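The Column returned by these functions never goes through the analyzer, so the ResolveTimeZone rule never fills in timeZoneId, and the zoneId lazy val hits None.get. As a hedged workaround sketch (not an official API recommendation; the helper name withDefaultTimeZone and the choice of the JVM default zone are assumptions for illustration), the time zone can be injected by hand before calling eval():

{code:scala}
import org.apache.spark.sql.functions._
import org.apache.spark.sql.catalyst.expressions.{Expression, TimeZoneAwareExpression}

// Hypothetical helper for illustration: copy a concrete time zone into every
// TimeZoneAwareExpression in the tree whose timeZoneId is still unset.
def withDefaultTimeZone(e: Expression): Expression = e.transform {
  case tz: TimeZoneAwareExpression if tz.timeZoneId.isEmpty =>
    tz.withTimeZone(java.util.TimeZone.getDefault.getID)
}

// Fails as reported:
// date_format(current_timestamp(), "dd").expr.eval()   // NoSuchElementException: None.get

// Succeeds once a time zone is set on the expression tree:
withDefaultTimeZone(date_format(current_timestamp(), "dd").expr).eval()
{code}

Evaluating the same expression through a query (for example spark.range(1).select(date_format(current_timestamp(), "dd")).collect()) does not hit this path, because the analyzer applies ResolveTimeZone with spark.sql.session.timeZone before execution.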