Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2020/07/19 02:13:00 UTC
[jira] [Resolved] (SPARK-32344) Unevaluable expr is set to FIRST/LAST ignoreNullsExpr in distinct aggregates
[ https://issues.apache.org/jira/browse/SPARK-32344?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon resolved SPARK-32344.
----------------------------------
Fix Version/s: 3.1.0, 3.0.1
Resolution: Fixed
Issue resolved by pull request 29143
[https://github.com/apache/spark/pull/29143]
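For context, here is a minimal, self-contained sketch of the failure mode and the shape of the fix; these are illustrative stand-ins, not the actual Spark classes or the patch from the pull request above. First/Last recovered its ignoreNulls flag by lazily evaluating a stored expression, which breaks once the planner has replaced that foldable literal with an attribute reference; extracting a plain Boolean while the expression is still a literal sidesteps the problem.

{code}
// Minimal model of the bug; NOT the real Spark classes, all names are
// illustrative.
sealed trait Expr { def eval(): Any }
case class Literal(value: Any) extends Expr { def eval(): Any = value }
// After the distinct-aggregate rewrite, a literal child can be replaced by
// an attribute reference, which cannot be evaluated without an input row.
case class AttributeRef(name: String) extends Expr {
  def eval(): Any =
    throw new UnsupportedOperationException(s"Cannot evaluate expression: $name")
}

// Buggy shape: the flag is recovered by eval() only when update expressions
// are built, long after the planner may have rewritten the child.
case class FirstLazy(ignoreNullsExpr: Expr) {
  def ignoreNulls: Boolean = ignoreNullsExpr.eval().asInstanceOf[Boolean]
}

// Safer shape: resolve the flag once, while it is still a foldable literal,
// and store a plain Boolean that later rewrites cannot disturb.
case class FirstEager(ignoreNulls: Boolean)

object Repro extends App {
  try FirstLazy(AttributeRef("false#37")).ignoreNulls
  catch { case e: UnsupportedOperationException => println(e.getMessage) }
  println(FirstEager(ignoreNulls = false).ignoreNulls) // false
}
{code}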
> Unevaluable expr is set to FIRST/LAST ignoreNullsExpr in distinct aggregates
> ----------------------------------------------------------------------------
>
> Key: SPARK-32344
> URL: https://issues.apache.org/jira/browse/SPARK-32344
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.4.6, 3.0.0
> Reporter: Takeshi Yamamuro
> Assignee: Takeshi Yamamuro
> Priority: Minor
> Fix For: 3.0.1, 3.1.0
>
>
> {code}
> scala> sql("SELECT FIRST(DISTINCT v) FROM VALUES 1, 2, 3 t(v)").show()
> ...
> Caused by: java.lang.UnsupportedOperationException: Cannot evaluate expression: false#37
> at org.apache.spark.sql.catalyst.expressions.Unevaluable$class.eval(Expression.scala:258)
> at org.apache.spark.sql.catalyst.expressions.AttributeReference.eval(namedExpressions.scala:226)
> at org.apache.spark.sql.catalyst.expressions.aggregate.First.ignoreNulls(First.scala:68)
> at org.apache.spark.sql.catalyst.expressions.aggregate.First.updateExpressions$lzycompute(First.scala:82)
> at org.apache.spark.sql.catalyst.expressions.aggregate.First.updateExpressions(First.scala:81)
> at org.apache.spark.sql.execution.aggregate.HashAggregateExec$$anonfun$15.apply(HashAggregateExec.scala:268)
> {code}
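The false#37 in the trace above is the tell: the planner's rewrite for distinct aggregates replaces the aggregate's child expressions with output attributes of an Expand node, and the foldable ignoreNullsExpr apparently gets swept up with the rest. A simplified, illustrative sketch of that substitution (again, not the real RewriteDistinctAggregates code):

{code}
object RewriteSketch {
  sealed trait E
  case class Lit(v: Any) extends E
  case class Attr(name: String) extends E
  case class FirstAgg(child: E, ignoreNullsExpr: E)

  // Simplified: every child, foldable or not, is mapped to an attribute.
  def rewriteDistinct(agg: FirstAgg): FirstAgg = {
    def toAttr(e: E): E = e match {
      case Lit(v) => Attr(s"$v#37") // the literal false becomes false#37
      case other  => other
    }
    FirstAgg(toAttr(agg.child), toAttr(agg.ignoreNullsExpr))
  }

  def main(args: Array[String]): Unit =
    // Prints FirstAgg(Attr(v#1),Attr(false#37)): the flag is now an
    // unevaluable attribute, so deriving ignoreNulls from it throws.
    println(rewriteDistinct(FirstAgg(Attr("v#1"), Lit(false))))
}
{code}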
--
This message was sent by Atlassian Jira
(v8.3.4#803005)