Posted to reviews@spark.apache.org by ScrapCodes <gi...@git.apache.org> on 2018/08/23 10:02:39 UTC

[GitHub] spark pull request #17745: [SPARK-17159][Streaming] optimise check for new f...

Github user ScrapCodes commented on a diff in the pull request:

    https://github.com/apache/spark/pull/17745#discussion_r212251757
  
    --- Diff: streaming/src/main/scala/org/apache/spark/streaming/dstream/FileInputDStream.scala ---
    @@ -196,29 +191,29 @@ class FileInputDStream[K, V, F <: NewInputFormat[K, V]](
           logDebug(s"Getting new files for time $currentTime, " +
             s"ignoring files older than $modTimeIgnoreThreshold")
     
    -      val newFileFilter = new PathFilter {
    -        def accept(path: Path): Boolean = isNewFile(path, currentTime, modTimeIgnoreThreshold)
    -      }
    -      val directoryFilter = new PathFilter {
    -        override def accept(path: Path): Boolean = fs.getFileStatus(path).isDirectory
    -      }
    -      val directories = fs.globStatus(directoryPath, directoryFilter).map(_.getPath)
    +      val directories = Option(fs.globStatus(directoryPath)).getOrElse(Array.empty[FileStatus])
    --- End diff ---
    
    With this approach, we may fetch a very large list of files and only afterwards filter out the directories. If the fetched list is too large, that can be a problem.
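    
    To make the tradeoff concrete, here is a minimal sketch of the two strategies, assuming the standard Hadoop FileSystem API; the method names and the standalone helpers are hypothetical, written to mirror the diff rather than quote FileInputDStream exactly:
    
        import org.apache.hadoop.fs.{FileStatus, FileSystem, Path, PathFilter}
    
        // Strategy A (code removed by the diff): pass a PathFilter to globStatus,
        // so non-directories are dropped as matches are filtered.
        def listDirectoriesFiltered(fs: FileSystem, directoryPath: Path): Array[Path] = {
          val directoryFilter = new PathFilter {
            override def accept(path: Path): Boolean = fs.getFileStatus(path).isDirectory
          }
          fs.globStatus(directoryPath, directoryFilter).map(_.getPath)
        }
    
        // Strategy B (code added by the diff): fetch every glob match first, then
        // filter client-side. The intermediate array holds every matched path,
        // which is the concern raised above when the listing is very large.
        def listDirectoriesUnfiltered(fs: FileSystem, directoryPath: Path): Array[Path] = {
          Option(fs.globStatus(directoryPath))
            .getOrElse(Array.empty[FileStatus])
            .filter(_.isDirectory)
            .map(_.getPath)
        }
    
    Note that Strategy B does avoid the per-path getFileStatus RPC of Strategy A by reading isDirectory off the FileStatus objects globStatus already returned; the question is whether that saving outweighs materializing the full match list.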


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org