Posted to issues@spark.apache.org by "Nicolas Poggi (JIRA)" <ji...@apache.org> on 2018/02/12 18:36:00 UTC

[jira] [Comment Edited] (SPARK-23310) Perf regression introduced by SPARK-21113

    [ https://issues.apache.org/jira/browse/SPARK-23310?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16361071#comment-16361071 ] 

Nicolas Poggi edited comment on SPARK-23310 at 2/12/18 6:35 PM:
----------------------------------------------------------------

Q72 of TPC-DS is also affected, with around a 30% regression at scale factor 1000. [~juliuszsompolski] SPARK-23366 also fixes it.


was (Author: npoggi):
Q72 of TPC-DS is also affected around 30% at scale factor 1000. [~juliuszsompolski] SPARK-23355 also fixes it.

> Perf regression introduced by SPARK-21113
> -----------------------------------------
>
>                 Key: SPARK-23310
>                 URL: https://issues.apache.org/jira/browse/SPARK-23310
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.3.0
>            Reporter: Yin Huai
>            Assignee: Sital Kedia
>            Priority: Blocker
>             Fix For: 2.3.0
>
>
> While running all TPC-DS queries with SF set to 1000, we noticed that Q95 (https://github.com/databricks/spark-sql-perf/blob/master/src/main/resources/tpcds_2_4/q95.sql) has a noticeable regression (11%). After looking into it, we found that the regression was introduced by SPARK-21113. Specifically, ReadAheadInputStream gets lock congestion. After setting spark.unsafe.sorter.spill.read.ahead.enabled to false, the regression disappeared and the overall performance of all TPC-DS queries improved.
>  
> I am proposing that we set spark.unsafe.sorter.spill.read.ahead.enabled to false by default for Spark 2.3 and re-enable it after addressing the lock congestion issue. 
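The proposed workaround can be applied per job without code changes. A minimal sketch, assuming a standard spark-submit invocation (the class and JAR names here are placeholders, not from the original report):

```shell
# Disable the read-ahead spill input stream added by SPARK-21113 to avoid
# the lock congestion described above. Class/JAR names are hypothetical.
spark-submit \
  --class com.example.TpcdsRunner \
  --conf spark.unsafe.sorter.spill.read.ahead.enabled=false \
  tpcds-runner.jar
```

The same key can instead be set once for all jobs in conf/spark-defaults.conf, which is effectively what changing the default in Spark 2.3 would do.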



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
