Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2022/08/18 09:09:00 UTC
[jira] [Commented] (SPARK-40137) Combines limits after projection
[ https://issues.apache.org/jira/browse/SPARK-40137?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17581248#comment-17581248 ]
Apache Spark commented on SPARK-40137:
--------------------------------------
User 'ConeyLiu' has created a pull request for this issue:
https://github.com/apache/spark/pull/37565
> Combines limits after projection
> --------------------------------
>
> Key: SPARK-40137
> URL: https://issues.apache.org/jira/browse/SPARK-40137
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Affects Versions: 3.2.2
> Reporter: Xianyang Liu
> Priority: Major
>
> `Dataset.show` adds an extra `Limit` and `Projection` on top of the given logical plan. If the `Dataset` already ends in a limit, this introduces an extra shuffle phase. So we should combine the limits after the projection (a sketch of such a rule follows the example plans below).
> For example:
> ```scala
> spark.sql("select * from spark.store_sales limit 10").show()
> ```
> Before:
> ```
> == Physical Plan ==
> AdaptiveSparkPlan (12)
> +- == Final Plan ==
>    * Project (7)
>    +- * GlobalLimit (6)
>       +- ShuffleQueryStage (5), Statistics(sizeInBytes=185.6 KiB, rowCount=990)
>          +- Exchange (4)
>             +- * LocalLimit (3)
>                +- * ColumnarToRow (2)
>                   +- Scan parquet spark_catalog.spark.store_sales (1)
> +- == Initial Plan ==
>    Project (11)
>    +- GlobalLimit (10)
>       +- Exchange (9)
>          +- LocalLimit (8)
>             +- Scan parquet spark_catalog.spark.store_sales (1)
> ```
> After:
> ```
> == Physical Plan ==
> CollectLimit (4)
> +- * Project (3)
>    +- * ColumnarToRow (2)
>       +- Scan parquet spark_catalog.spark.store_sales (1)
> ```
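> A minimal sketch of the idea, assuming a Catalyst-style optimizer rule. The rule name `CombineLimitsAfterProject` and the exact pattern are illustrative only, not the actual change in the linked pull request:
> ```scala
> import org.apache.spark.sql.catalyst.expressions.{IntegerLiteral, Literal}
> import org.apache.spark.sql.catalyst.plans.logical.{Limit, LogicalPlan, Project}
> import org.apache.spark.sql.catalyst.rules.Rule
>
> // Illustrative rule: rewrite Limit(n1, Project(p, Limit(n2, child)))
> // into Project(p, Limit(min(n1, n2), child)), so the plan keeps a
> // single limit instead of stacking a second one (and its shuffle)
> // on top of the projection. A production rule would also check that
> // the project list is deterministic before pushing the limit through.
> object CombineLimitsAfterProject extends Rule[LogicalPlan] {
>   override def apply(plan: LogicalPlan): LogicalPlan = plan transformDown {
>     case Limit(IntegerLiteral(outer),
>                Project(projectList, Limit(IntegerLiteral(inner), child))) =>
>       Project(projectList, Limit(Literal(math.min(outer, inner)), child))
>   }
> }
> ```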