Posted to issues@spark.apache.org by "Sean R. Owen (Jira)" <ji...@apache.org> on 2022/08/15 13:59:00 UTC

[jira] [Resolved] (SPARK-40058) Avoid filter twice in HadoopFSUtils

     [ https://issues.apache.org/jira/browse/SPARK-40058?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean R. Owen resolved SPARK-40058.
----------------------------------
    Fix Version/s: 3.4.0
       Resolution: Fixed

Resolved by https://github.com/apache/spark/pull/37498

> Avoid filter twice in HadoopFSUtils
> -----------------------------------
>
>                 Key: SPARK-40058
>                 URL: https://issues.apache.org/jira/browse/SPARK-40058
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 3.4.0
>            Reporter: ZiyueGuan
>            Priority: Minor
>             Fix For: 3.4.0
>
>
> In HadoopFSUtils, listLeafFiles applies the path filter more than once per entry across its recursive calls. This wastes time when the filter logic is heavy. It would be good to refactor this so the filter runs only once.
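
The pattern described above can be sketched as follows. This is a hypothetical illustration, not Spark's actual HadoopFSUtils code: the names `pathFilter`, `listLeafFilesTwice`, and `listLeafFilesOnce` are invented for this example, and plain `java.io.File` stands in for Hadoop's `FileSystem`/`PathFilter` API.

```scala
import java.io.File

// Stand-in for a potentially expensive path filter.
def pathFilter(f: File): Boolean = !f.getName.startsWith("_")

// Before: the caller filters its children, and then filters the results of
// each recursive call again, so pathFilter can run more than once per entry.
def listLeafFilesTwice(dir: File): Seq[File] = {
  val entries = Option(dir.listFiles).getOrElse(Array.empty[File]).toSeq.filter(pathFilter)
  entries.flatMap { e =>
    if (e.isDirectory) listLeafFilesTwice(e).filter(pathFilter) // redundant second pass
    else Seq(e)
  }
}

// After: each entry is tested exactly once, at the point it is discovered.
def listLeafFilesOnce(dir: File): Seq[File] = {
  val entries = Option(dir.listFiles).getOrElse(Array.empty[File]).toSeq
  entries.flatMap { e =>
    if (e.isDirectory) listLeafFilesOnce(e)
    else if (pathFilter(e)) Seq(e)
    else Nil
  }
}
```

Both versions return the same leaf files; the refactored one simply avoids re-evaluating the filter on results that were already accepted.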



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org