Posted to issues@spark.apache.org by "Brandon Krieger (JIRA)" <ji...@apache.org> on 2018/06/07 20:18:00 UTC
[jira] [Commented] (SPARK-24488) Analyzer throws when generator is aliased multiple times
[ https://issues.apache.org/jira/browse/SPARK-24488?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16505199#comment-16505199 ]
Brandon Krieger commented on SPARK-24488:
-----------------------------------------
Opened https://github.com/apache/spark/pull/21508 to address this issue.
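
For reference, the core idea (trimming non-top-level aliases, per the description below) might look roughly like the following. This is a minimal sketch against Catalyst's Expression API, not the actual PR code, and trimNestedAliases is an illustrative name:

{code:java}
import org.apache.spark.sql.catalyst.expressions.{Alias, Expression}

// Collapse a chain of aliases down to the outermost one, so that
// explode(arr) AS `first` AS `second` becomes explode(arr) AS `second`,
// which the existing single-alias special case already accepts.
def trimNestedAliases(e: Expression): Expression = e match {
  case outer: Alias => outer.child match {
    // withNewChildren preserves the outer alias's name and exprId
    // while dropping the redundant inner alias.
    case inner: Alias => trimNestedAliases(outer.withNewChildren(Seq(inner.child)))
    case _ => outer
  }
  case other => other
}
{code}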
> Analyzer throws when generator is aliased multiple times
> --------------------------------------------------------
>
> Key: SPARK-24488
> URL: https://issues.apache.org/jira/browse/SPARK-24488
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.3.0
> Reporter: Brandon Krieger
> Priority: Minor
>
> Currently, the Analyzer throws an exception if you try to nest a generator. However, it special-cases a generator "nested" in an alias and allows that. If you alias a generator twice, the special case does not catch it, so an exception is thrown:
>
> {code:java}
> scala> import org.apache.spark.sql.functions
> scala> Seq(("a", "b"))
> .toDF("col1","col2")
> .select(functions.array('col1,'col2).as("arr"))
> .select(functions.explode('arr).as("first").as("second"))
> .collect()
> org.apache.spark.sql.AnalysisException: Generators are not supported when it's nested in expressions, but got: explode(arr) AS `first`;
> at org.apache.spark.sql.catalyst.analysis.Analyzer$ExtractGenerator$$anonfun$apply$23.applyOrElse(Analyzer.scala:1604)
> at org.apache.spark.sql.catalyst.analysis.Analyzer$ExtractGenerator$$anonfun$apply$23.applyOrElse(Analyzer.scala:1601)
> {code}
>
> In reality, aliasing twice is fine, so we can fix this by trimming non-top-level aliases.
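
For comparison, the single-alias form hits the special case and works; the output below is what I'd expect from this repro in spark-shell:

{code:java}
scala> import org.apache.spark.sql.functions
scala> Seq(("a", "b"))
.toDF("col1","col2")
.select(functions.array('col1,'col2).as("arr"))
.select(functions.explode('arr).as("first"))
.collect()
res0: Array[org.apache.spark.sql.Row] = Array([a], [b])
{code}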
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)