Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2021/04/21 06:59:00 UTC
[jira] [Commented] (SPARK-35168) mapred.reduce.tasks should be shuffle.partitions not adaptive.coalescePartitions.initialPartitionNum
[ https://issues.apache.org/jira/browse/SPARK-35168?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17326301#comment-17326301 ]
Apache Spark commented on SPARK-35168:
--------------------------------------
User 'yaooqinn' has created a pull request for this issue:
https://github.com/apache/spark/pull/32265
> mapred.reduce.tasks should be shuffle.partitions not adaptive.coalescePartitions.initialPartitionNum
> ----------------------------------------------------------------------------------------------------
>
> Key: SPARK-35168
> URL: https://issues.apache.org/jira/browse/SPARK-35168
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 3.0.2, 3.1.1, 3.2.0
> Reporter: Kent Yao
> Priority: Minor
>
> {code:java}
> spark-sql> set spark.sql.adaptive.coalescePartitions.initialPartitionNum=1;
> spark.sql.adaptive.coalescePartitions.initialPartitionNum 1
> Time taken: 2.18 seconds, Fetched 1 row(s)
> spark-sql> set mapred.reduce.tasks;
> 21/04/21 14:27:11 WARN SetCommand: Property mapred.reduce.tasks is deprecated, showing spark.sql.shuffle.partitions instead.
> spark.sql.shuffle.partitions 1
> Time taken: 0.03 seconds, Fetched 1 row(s)
> spark-sql> set spark.sql.shuffle.partitions;
> spark.sql.shuffle.partitions 200
> Time taken: 0.024 seconds, Fetched 1 row(s)
> spark-sql> set mapred.reduce.tasks=2;
> 21/04/21 14:31:52 WARN SetCommand: Property mapred.reduce.tasks is deprecated, automatically converted to spark.sql.shuffle.partitions instead.
> spark.sql.shuffle.partitions 2
> Time taken: 0.017 seconds, Fetched 1 row(s)
> spark-sql> set mapred.reduce.tasks;
> 21/04/21 14:31:55 WARN SetCommand: Property mapred.reduce.tasks is deprecated, showing spark.sql.shuffle.partitions instead.
> spark.sql.shuffle.partitions 1
> Time taken: 0.017 seconds, Fetched 1 row(s)
> spark-sql>
> {code}
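
The transcript above shows the bug: after setting spark.sql.adaptive.coalescePartitions.initialPartitionNum=1, querying the deprecated mapred.reduce.tasks reports 1, even though spark.sql.shuffle.partitions is still 200. The following is a minimal sketch of the intended aliasing behavior only; it is not Spark's actual SetCommand implementation, and the class and method names are illustrative assumptions.

```java
import java.util.HashMap;
import java.util.Map;

// Hedged sketch of the key aliasing described in SPARK-35168; NOT Spark's
// real SetCommand code. It illustrates that the deprecated key
// mapred.reduce.tasks should always map to spark.sql.shuffle.partitions.
public class MapredReduceTasksSketch {
    static final String SHUFFLE_PARTITIONS = "spark.sql.shuffle.partitions";

    // Reading the deprecated key should show spark.sql.shuffle.partitions
    // (default 200), never adaptive.coalescePartitions.initialPartitionNum.
    static String showDeprecated(Map<String, String> conf) {
        return conf.getOrDefault(SHUFFLE_PARTITIONS, "200");
    }

    // Setting the deprecated key is converted to setting
    // spark.sql.shuffle.partitions, matching the SetCommand warning.
    static void setDeprecated(Map<String, String> conf, String value) {
        conf.put(SHUFFLE_PARTITIONS, value);
    }
}
```

Under this sketch, setting initialPartitionNum alone leaves the deprecated key's reported value at the shuffle-partitions default, which is the behavior the issue title asks for.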
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org