Posted to issues@spark.apache.org by "yiming.xu (Jira)" <ji...@apache.org> on 2019/09/29 06:37:00 UTC

[jira] [Created] (SPARK-29284) df.distinct.count throws NoSuchElementException when adaptive execution is enabled

yiming.xu created SPARK-29284:
---------------------------------

             Summary: df.distinct.count throws NoSuchElementException when adaptive execution is enabled
                 Key: SPARK-29284
                 URL: https://issues.apache.org/jira/browse/SPARK-29284
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 2.4.4
            Reporter: yiming.xu


Reproduction case:

  spark.sql("SET spark.sql.adaptive.enabled=true")
  spark.sql("SET spark.sql.shuffle.partitions=1")
  val result = spark.range(0).distinct().count
  // or: spark.table("<an empty table>").distinct.count

Both throw java.util.NoSuchElementException: next on empty iterator.
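
A minimal self-contained reproduction sketch (the SparkSession setup here is illustrative and not part of the original report; in spark-shell the snippet above can be pasted as-is):

  import org.apache.spark.sql.SparkSession

  object Spark29284Repro {
    def main(args: Array[String]): Unit = {
      val spark = SparkSession.builder()
        .master("local[2]")
        .appName("SPARK-29284 reproduction")
        .getOrCreate()

      // Enable adaptive execution so the ExchangeCoordinator estimates
      // post-shuffle partitions for the shuffle introduced by distinct().
      spark.sql("SET spark.sql.adaptive.enabled=true")
      spark.sql("SET spark.sql.shuffle.partitions=1")

      // spark.range(0) is an empty Dataset, so the shuffle has no map output;
      // on 2.4.4 this count() fails with NoSuchElementException.
      val result = spark.range(0).distinct().count()
      println(result)

      spark.stop()
    }
  }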

If the current stage has 0 partitions, org.apache.spark.sql.execution.exchange.ExchangeCoordinator#doEstimationIfNecessary produces an empty partitionStartIndices (see https://issues.apache.org/jira/browse/SPARK-22144), so the RDD of the next stage ends up with zero partitions.
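
For reference, the reported message is the standard Scala one raised by calling next() on an empty iterator, which is what effectively happens downstream when no partition start indices are produced; a trivial illustration (not the actual Spark call site):

  // Raises java.util.NoSuchElementException: next on empty iterator
  Iterator.empty.next()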

To solve the problem, I will change partitionStartIndices to Array(0) when the parent RDD has 0 partitions.
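
A sketch of that guard, assuming it sits where doEstimationIfNecessary computes the indices (the names parentNumPartitions and estimatedPartitionStartIndices are illustrative, not the actual 2.4.4 identifiers):

  // Fall back to a single post-shuffle partition starting at index 0 when the
  // parent stage produced no partitions, instead of an empty index array that
  // leaves the next stage's RDD with zero partitions.
  val partitionStartIndices: Array[Int] =
    if (parentNumPartitions == 0) Array(0)
    else estimatedPartitionStartIndices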





--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org