Posted to issues@spark.apache.org by "Josh Rosen (JIRA)" <ji...@apache.org> on 2017/02/28 01:04:45 UTC

[jira] [Created] (SPARK-19758) Casting string to timestamp in inline table definition fails with AnalysisException

Josh Rosen created SPARK-19758:
----------------------------------

             Summary: Casting string to timestamp in inline table definition fails with AnalysisException
                 Key: SPARK-19758
                 URL: https://issues.apache.org/jira/browse/SPARK-19758
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 2.2.0
            Reporter: Josh Rosen
            Priority: Blocker


The following query runs successfully on Spark 2.1.x but fails on the current master:

{code}
sql("""CREATE TEMPORARY VIEW table_4(timestamp_col_3) AS VALUES TIMESTAMP('1991-12-06 00:00:00.0')""")
{code}

Here's the error:

{code}
scala> sql("""CREATE TEMPORARY VIEW table_4(timestamp_col_3) AS VALUES TIMESTAMP('1991-12-06 00:00:00.0')""")
org.apache.spark.sql.AnalysisException: failed to evaluate expression CAST('1991-12-06 00:00:00.0' AS TIMESTAMP): None.get; line 1 pos 50
  at org.apache.spark.sql.catalyst.analysis.package$AnalysisErrorAt.failAnalysis(package.scala:42)
  at org.apache.spark.sql.catalyst.analysis.ResolveInlineTables$$anonfun$4$$anonfun$apply$4.apply(ResolveInlineTables.scala:105)
  at org.apache.spark.sql.catalyst.analysis.ResolveInlineTables$$anonfun$4$$anonfun$apply$4.apply(ResolveInlineTables.scala:95)
  at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
  at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
  at scala.collection.immutable.List.foreach(List.scala:381)
  at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
  at scala.collection.immutable.List.map(List.scala:285)
  at org.apache.spark.sql.catalyst.analysis.ResolveInlineTables$$anonfun$4.apply(ResolveInlineTables.scala:95)
  at org.apache.spark.sql.catalyst.analysis.ResolveInlineTables$$anonfun$4.apply(ResolveInlineTables.scala:94)
  at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
  at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
  at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
  at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
  at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
  at scala.collection.AbstractTraversable.map(Traversable.scala:104)
  at org.apache.spark.sql.catalyst.analysis.ResolveInlineTables$.convert(ResolveInlineTables.scala:94)
  at org.apache.spark.sql.catalyst.analysis.ResolveInlineTables$$anonfun$apply$1.applyOrElse(ResolveInlineTables.scala:36)
  at org.apache.spark.sql.catalyst.analysis.ResolveInlineTables$$anonfun$apply$1.applyOrElse(ResolveInlineTables.scala:32)
  at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:289)
  at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:289)
  at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:70)
  at org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:288)
  at org.apache.spark.sql.catalyst.analysis.ResolveInlineTables$.apply(ResolveInlineTables.scala:32)
  at org.apache.spark.sql.catalyst.analysis.ResolveInlineTables$.apply(ResolveInlineTables.scala:31)
  at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1$$anonfun$apply$1.apply(RuleExecutor.scala:85)
  at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1$$anonfun$apply$1.apply(RuleExecutor.scala:82)
  at scala.collection.LinearSeqOptimized$class.foldLeft(LinearSeqOptimized.scala:124)
  at scala.collection.immutable.List.foldLeft(List.scala:84)
  at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1.apply(RuleExecutor.scala:82)
  at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1.apply(RuleExecutor.scala:74)
  at scala.collection.immutable.List.foreach(List.scala:381)
  at org.apache.spark.sql.catalyst.rules.RuleExecutor.execute(RuleExecutor.scala:74)
  at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:65)
  at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:63)
  at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:51)
  at org.apache.spark.sql.execution.command.CreateViewCommand.run(views.scala:128)
  at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:58)
  at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:56)
  at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:67)
  at org.apache.spark.sql.Dataset.<init>(Dataset.scala:182)
  at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:67)
  at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:588)
  ... 48 elided
{code}

It appears that this bug was introduced by SPARK-18936. /cc [~ueshin]
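Until a fix lands, a possible workaround (an untested sketch, not part of the original report) is to define the view through a SELECT with an explicit CAST instead of an inline VALUES list, which avoids the inline-table evaluation path in ResolveInlineTables:

{code}
// Hypothetical workaround sketch: build the same one-row view without an
// inline VALUES clause, so the ResolveInlineTables rule is not exercised.
sql("""CREATE TEMPORARY VIEW table_4(timestamp_col_3) AS
       SELECT CAST('1991-12-06 00:00:00.0' AS TIMESTAMP)""")
{code}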


