Posted to issues@spark.apache.org by "Kevin Zhang (JIRA)" <ji...@apache.org> on 2018/06/22 04:10:00 UTC
[jira] [Commented] (SPARK-20295) when spark.sql.adaptive.enabled
is enabled, it conflicts with Exchange Reuse
[ https://issues.apache.org/jira/browse/SPARK-20295?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16519994#comment-16519994 ]
Kevin Zhang commented on SPARK-20295:
-------------------------------------
Still hit this bug in spark 2.3.0
> when spark.sql.adaptive.enabled is enabled, it conflicts with Exchange Reuse
> ------------------------------------------------------------------------------
>
> Key: SPARK-20295
> URL: https://issues.apache.org/jira/browse/SPARK-20295
> Project: Spark
> Issue Type: Bug
> Components: Shuffle, SQL
> Affects Versions: 2.1.0
> Reporter: Ruhui Wang
> Priority: Major
>
> When running tpcds-q95 with spark.sql.adaptive.enabled = true, the physical plan is initially:
> Sort
> +- Exchange(coordinator id: 1)
>    +- Project ***
>       :- Sort **
>       :  +- Exchange(coordinator id: 2)
>       :     +- Project ***
>       +- Sort
>          +- Exchange(coordinator id: 3)
> When spark.sql.exchange.reuse is enabled, the physical plan becomes:
> Sort
> +- Exchange(coordinator id: 1)
>    +- Project ***
>       :- Sort **
>       :  +- Exchange(coordinator id: 2)
>       :     +- Project ***
>       +- Sort
>          +- ReusedExchange Exchange(coordinator id: 2)
> With spark.sql.adaptive.enabled = true, the call stack is ShuffleExchange#doExecute --> postShuffleRDD --> doEstimationIfNecessary. In doEstimationIfNecessary, the assertion assert(exchanges.length == numExchanges) fails: the left side contains only one element (presumably because the ReusedExchange does not register with the coordinator), while the right side equals 2.
> Is this a bug in the interaction between spark.sql.adaptive.enabled and exchange reuse?
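For illustration, the bookkeeping mismatch described above can be modeled in a few lines of Scala. The Coordinator class and method names below are simplified stand-ins for the behavior described in the stack trace, not Spark's actual ExchangeCoordinator API: the coordinator is told to expect two exchanges, but only the original Exchange registers itself, so the assertion fails when estimation runs.

```scala
import scala.collection.mutable.ArrayBuffer
import scala.util.Try

// Hypothetical sketch of the coordinator bookkeeping described in the report.
// A coordinator created for N consumers expects N exchanges to register
// before it can do its shuffle-size estimation.
class Coordinator(numExchanges: Int) {
  private val exchanges = ArrayBuffer.empty[String]

  def registerExchange(id: String): Unit = exchanges += id

  // Mirrors the failing check: assert(exchanges.length == numExchanges)
  def doEstimationIfNecessary(): Unit =
    assert(exchanges.length == numExchanges,
      s"registered ${exchanges.length} exchange(s), expected $numExchanges")
}

// Two plan branches share coordinator 2, so it expects two registrations.
val coordinator = new Coordinator(numExchanges = 2)
coordinator.registerExchange("Exchange(coordinator id: 2)")
// The ReusedExchange branch never registers, so estimation fails:
assert(Try(coordinator.doEstimationIfNecessary()).isFailure)
```

This is only a model of the symptom: exchange reuse replaces one branch with a ReusedExchange node, leaving the shared coordinator waiting for a registration that never arrives.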
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)