Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2017/11/22 13:42:00 UTC
[jira] [Resolved] (SPARK-19417) spark.files.overwrite is ignored
[ https://issues.apache.org/jira/browse/SPARK-19417?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen resolved SPARK-19417.
-------------------------------
Resolution: Won't Fix
I think this behavior is on purpose, as these resources are effectively immutable. Letting them change might cause other odd behavior. You can find other ways to broadcast mutable state.
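To illustrate the alternative suggested above, mutable state can be shipped to executors by creating a fresh broadcast variable whenever the value changes and unpersisting the old one. This is only a sketch, not code from the issue: it assumes a live SparkContext `sc`, and the variable names are invented here.

{code}
// Sketch: re-broadcast updated state instead of re-adding a file.
var state = sc.broadcast("one")

// When the driver-side value changes:
state.unpersist()          // drop stale copies cached on the executors
state = sc.broadcast("two")

// Capture a stable reference before using it in a closure, so the
// task serializes the broadcast handle rather than the mutable var.
val bc = state
val seen = sc.parallelize(Seq(0)).map(_ => bc.value).first()
// Tasks submitted after the re-broadcast observe the new value.
{code}

Unlike files shipped with addFile, each broadcast is a distinct immutable value, so this avoids relying on overwrite semantics entirely.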
> spark.files.overwrite is ignored
> --------------------------------
>
> Key: SPARK-19417
> URL: https://issues.apache.org/jira/browse/SPARK-19417
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 2.1.0
> Reporter: Chris Kanich
>
> I have not been able to get Spark to overwrite a file after changing it on the driver node, calling addFile again, and then reading it on the executors. Here's a failing test.
> {code}
> // Imports needed by this snippet (it lives in a Spark test suite):
> // import java.io.File
> // import java.nio.charset.StandardCharsets
> // import com.google.common.io.Files
> // import org.apache.spark.util.Utils
> test("can overwrite files when spark.files.overwrite is true") {
>   val dir = Utils.createTempDir()
>   val file = new File(dir, "file")
>   try {
>     Files.write("one", file, StandardCharsets.UTF_8)
>     sc = new SparkContext(new SparkConf().setAppName("test").setMaster("local-cluster[1,1,1024]")
>       .set("spark.files.overwrite", "true"))
>     sc.addFile(file.getAbsolutePath)
>     def getAddedFileContents(): String = {
>       sc.parallelize(Seq(0)).map { _ =>
>         scala.io.Source.fromFile(SparkFiles.get("file")).mkString
>       }.first()
>     }
>     assert(getAddedFileContents() === "one")
>     // Guava's Files.write replaces the file's contents, so it now holds "two".
>     Files.write("two", file, StandardCharsets.UTF_8)
>     sc.addFile(file.getAbsolutePath)
>     // Fails: the executors still serve the original copy of the file.
>     assert(getAddedFileContents() === "two")
>   } finally {
>     Utils.deleteRecursively(dir)
>     sc.stop()
>   }
> }
> {code}
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org