Posted to issues@spark.apache.org by "Bo Meng (JIRA)" <ji...@apache.org> on 2017/03/29 19:09:41 UTC
[jira] [Commented] (SPARK-20145) "SELECT * FROM range(1)" works, but "SELECT * FROM RANGE(1)" doesn't
[ https://issues.apache.org/jira/browse/SPARK-20145?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15947726#comment-15947726 ]
Bo Meng commented on SPARK-20145:
---------------------------------
From the current code, I can see that builtinFunctions uses an exact-match lookup (the key "range" is stored in all lowercase), so an uppercase RANGE is not found.
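As an illustration only (a minimal sketch, not the actual Spark source; TvfLookupSketch, resolve, and the registry contents are hypothetical), lowercasing the requested name before the map lookup is one way such a registry could be made case-insensitive:

    import java.util.Locale

    object TvfLookupSketch {
      // Hypothetical registry keyed by lowercase names, mirroring the observation
      // that builtinFunctions stores "range" in all lowercase.
      private val builtinFunctions: Map[String, Long => Seq[Long]] =
        Map("range" -> ((n: Long) => 0L until n))

      // Normalizing the name before the lookup makes resolution case-insensitive,
      // so both range(1) and RANGE(1) resolve to the same function.
      def resolve(name: String): Option[Long => Seq[Long]] =
        builtinFunctions.get(name.toLowerCase(Locale.ROOT))

      def main(args: Array[String]): Unit = {
        println(resolve("range").isDefined) // true
        println(resolve("RANGE").isDefined) // true, thanks to the normalized key
      }
    }

With an exact-match lookup (no toLowerCase), the second call would return None, which matches the AnalysisException reported below.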
> "SELECT * FROM range(1)" works, but "SELECT * FROM RANGE(1)" doesn't
> --------------------------------------------------------------------
>
> Key: SPARK-20145
> URL: https://issues.apache.org/jira/browse/SPARK-20145
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.2.0
> Reporter: Juliusz Sompolski
>
> Executed at clean tip of the master branch, with all default settings:
> scala> spark.sql("SELECT * FROM range(1)")
> res1: org.apache.spark.sql.DataFrame = [id: bigint]
> scala> spark.sql("SELECT * FROM RANGE(1)")
> org.apache.spark.sql.AnalysisException: could not resolve `RANGE` to a table-valued function; line 1 pos 14
> at org.apache.spark.sql.catalyst.analysis.package$AnalysisErrorAt.failAnalysis(package.scala:42)
> at org.apache.spark.sql.catalyst.analysis.ResolveTableValuedFunctions$$anonfun$apply$1.applyOrElse(ResolveTableValuedFunctions.scala:126)
> at org.apache.spark.sql.catalyst.analysis.ResolveTableValuedFunctions$$anonfun$apply$1.applyOrElse(ResolveTableValuedFunctions.scala:106)
> at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan$$anonfun$resolveOperators$1.apply(LogicalPlan.scala:62)
> ...
> I believe it should be case insensitive?
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)