Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2022/05/17 13:30:00 UTC
[jira] [Assigned] (SPARK-39167) Throw an exception w/ an error class for multiple rows from a subquery used as an expression
[ https://issues.apache.org/jira/browse/SPARK-39167?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-39167:
------------------------------------
Assignee: (was: Apache Spark)
> Throw an exception w/ an error class for multiple rows from a subquery used as an expression
> --------------------------------------------------------------------------------------------
>
> Key: SPARK-39167
> URL: https://issues.apache.org/jira/browse/SPARK-39167
> Project: Spark
> Issue Type: Sub-task
> Components: SQL
> Affects Versions: 3.4.0
> Reporter: Max Gekk
> Priority: Major
>
> Users can trigger an illegal state exception with the following SQL statement:
> {code:sql}
> > select (select a from (select 1 as a union all select 2 as a) t) as b
> {code}
> {code:java}
> Caused by: java.lang.IllegalStateException: more than one row returned by a subquery used as an expression:
> Subquery subquery#242, [id=#100]
> +- AdaptiveSparkPlan isFinalPlan=true
>    +- == Final Plan ==
>       Union
>       :- *(1) Project [1 AS a#240]
>       :  +- *(1) Scan OneRowRelation[]
>       +- *(2) Project [2 AS a#241]
>          +- *(2) Scan OneRowRelation[]
>    +- == Initial Plan ==
>       Union
>       :- Project [1 AS a#240]
>       :  +- Scan OneRowRelation[]
>       +- Project [2 AS a#241]
>          +- Scan OneRowRelation[]
> at org.apache.spark.sql.execution.ScalarSubquery.updateResult(subquery.scala:83)
> {code}
> But such exceptions are not supposed to be visible to users. We need to introduce an error class (or reuse an existing one) and replace the IllegalStateException.
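The change the ticket asks for can be sketched as follows. This is a hedged illustration only, not Spark's actual implementation (the real fix lives in Scala in `subquery.scala` and Spark's error-class framework): a bare IllegalStateException is replaced by an exception that carries a machine-readable error class. The class `ErrorClassException` and the error-class name `MULTI_VALUE_SUBQUERY_ERROR` below are hypothetical placeholders.

```java
import java.util.List;
import java.util.Optional;

// Hypothetical exception type that carries an error class alongside the
// human-readable message, mirroring what SPARK-39167 proposes.
class ErrorClassException extends RuntimeException {
    final String errorClass;

    ErrorClassException(String errorClass, String message) {
        super("[" + errorClass + "] " + message);
        this.errorClass = errorClass;
    }
}

final class ScalarSubqueryCheck {
    // A scalar subquery used as an expression must return at most one row.
    // Instead of throwing IllegalStateException, throw an exception tagged
    // with an error class so the failure is a proper user-facing error.
    static <T> Optional<T> singleResult(List<T> rows) {
        if (rows.size() > 1) {
            throw new ErrorClassException(
                "MULTI_VALUE_SUBQUERY_ERROR",
                "more than one row returned by a subquery used as an expression"
                    + " (got " + rows.size() + " rows)");
        }
        return rows.isEmpty() ? Optional.empty() : Optional.of(rows.get(0));
    }
}
```

With this shape, the union subquery from the example above (two rows) would surface a `[MULTI_VALUE_SUBQUERY_ERROR] ...` message rather than an internal IllegalStateException.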
--
This message was sent by Atlassian Jira
(v8.20.7#820007)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org