Posted to issues@spark.apache.org by "XiDuo You (Jira)" <ji...@apache.org> on 2022/02/11 04:52:00 UTC

[jira] [Created] (SPARK-38182) Fix NoSuchElementException if pushed filter does not contain any references

XiDuo You created SPARK-38182:
---------------------------------

             Summary: Fix NoSuchElementException if pushed filter does not contain any references
                 Key: SPARK-38182
                 URL: https://issues.apache.org/jira/browse/SPARK-38182
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 3.3.0
            Reporter: XiDuo You


To reproduce:

{code:sql}
CREATE TABLE pt (c1 int) USING PARQUET PARTITIONED BY (p string);

set spark.sql.optimizer.excludedRules=org.apache.spark.sql.catalyst.optimizer.BooleanSimplification;

SELECT * FROM pt WHERE p = 'a' AND 2 > 1;
{code}
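
With BooleanSimplification excluded, the always-true conjunct {{2 > 1}} is not folded away, so it survives optimization and is pushed down as a partition filter even though it references no columns. A minimal sketch of why that breaks (a hypothetical illustration built on catalyst internals, not the Spark source itself):

{code:scala}
import org.apache.spark.sql.catalyst.expressions.{GreaterThan, Literal}

// "2 > 1" as a catalyst expression: it compares two literals, so the set
// of columns it references is empty.
val literalOnlyFilter = GreaterThan(Literal(2), Literal(1))
assert(literalOnlyFilter.references.isEmpty)

// Unconditionally taking .head of the empty AttributeSet throws the
// "next on empty iterator" NoSuchElementException shown below.
literalOnlyFilter.references.head
{code}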

and the error message:

{code:java}
java.util.NoSuchElementException: next on empty iterator
	at scala.collection.Iterator$$anon$2.next(Iterator.scala:41)
	at scala.collection.Iterator$$anon$2.next(Iterator.scala:39)
	at scala.collection.mutable.LinkedHashSet$$anon$1.next(LinkedHashSet.scala:89)
	at scala.collection.IterableLike.head(IterableLike.scala:109)
	at scala.collection.IterableLike.head$(IterableLike.scala:108)
	at org.apache.spark.sql.catalyst.expressions.AttributeSet.head(AttributeSet.scala:69)
	at org.apache.spark.sql.execution.datasources.PartitioningAwareFileIndex.$anonfun$listFiles$3(PartitioningAwareFileIndex.scala:85)
	at scala.Option.map(Option.scala:230)
	at org.apache.spark.sql.execution.datasources.PartitioningAwareFileIndex.listFiles(PartitioningAwareFileIndex.scala:84)
	at org.apache.spark.sql.execution.FileSourceScanExec.selectedPartitions$lzycompute(DataSourceScanExec.scala:249)
{code}
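
A defensive shape for the fix, sketched here under the assumption that partition pruning should simply tolerate reference-free filters (the helper names are hypothetical, not the actual patch):

{code:scala}
import org.apache.spark.sql.catalyst.expressions.{Attribute, Expression}

// Hypothetical helper: the partition column a pushed filter touches, if
// any. headOption is total, so a literal-only filter such as "2 > 1"
// yields None instead of throwing.
def referencedPartitionColumn(filter: Expression): Option[Attribute] =
  filter.references.headOption

// Equivalently, reference-free filters can be dropped before pruning,
// since a predicate that reads no columns cannot select partitions.
def prunableFilters(filters: Seq[Expression]): Seq[Expression] =
  filters.filter(_.references.nonEmpty)
{code}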
