Posted to issues@spark.apache.org by "Chris Kanich (JIRA)" <ji...@apache.org> on 2017/02/01 04:21:51 UTC

[jira] [Created] (SPARK-19417) spark.files.overwrite is ignored

Chris Kanich created SPARK-19417:
------------------------------------

             Summary: spark.files.overwrite is ignored
                 Key: SPARK-19417
                 URL: https://issues.apache.org/jira/browse/SPARK-19417
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 2.1.0
            Reporter: Chris Kanich


Even with spark.files.overwrite set to true, I have not been able to get Spark to pick up a changed file: after modifying the file on the driver node and calling addFile again, the executors still see the old contents. Here's a failing test.

{code}

  test("can overwrite files when spark.files.overwrite is true") {
    val dir = Utils.createTempDir()
    val file = new File(dir, "file")
    try {
      Files.write("one", file, StandardCharsets.UTF_8)
      sc = new SparkContext(new SparkConf().setAppName("test").setMaster("local-cluster[1,1,1024]")
         .set("spark.files.overwrite", "true"))
      sc.addFile(file.getAbsolutePath)
      def getAddedFileContents(): String = {
        sc.parallelize(Seq(0)).map { _ =>
          scala.io.Source.fromFile(SparkFiles.get("file")).mkString
        }.first()
      }
      assert(getAddedFileContents() === "one")
      Files.write("two", file, StandardCharsets.UTF_8)
      sc.addFile(file.getAbsolutePath)
      assert(getAddedFileContents() === "onetwo")
    } finally {
      Utils.deleteRecursively(dir)
      sc.stop()
    }
  }

{code}
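If the root cause is that SparkContext.addFile treats an already-registered path as a no-op (so the executors never see a newer timestamp and never re-fetch), then shipping the changed contents under a new file name should work. The following is a minimal workaround sketch, not a fix, reusing the scaffolding from the test above; the "file-v2" name is purely illustrative.

{code}
      // Hypothetical workaround: copy the changed file to a fresh name so
      // that addFile sees a path it has not registered before.
      Files.append("two", file, StandardCharsets.UTF_8)
      val renamed = new File(dir, "file-v2")
      Files.copy(file, renamed)  // Guava: copy `file` to `renamed`
      sc.addFile(renamed.getAbsolutePath)

      val contents = sc.parallelize(Seq(0)).map { _ =>
        scala.io.Source.fromFile(SparkFiles.get("file-v2")).mkString
      }.first()
      assert(contents === "onetwo")  // the new name is fetched fresh
{code}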


