Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2021/03/03 04:27:00 UTC
[jira] [Assigned] (SPARK-34225) Jars or file paths which contain spaces are generating FileNotFoundException exception
[ https://issues.apache.org/jira/browse/SPARK-34225?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-34225:
------------------------------------
Assignee: Apache Spark
> Jars or file paths which contain spaces are generating FileNotFoundException exception
> --------------------------------------------------------------------------------------
>
> Key: SPARK-34225
> URL: https://issues.apache.org/jira/browse/SPARK-34225
> Project: Spark
> Issue Type: Bug
> Components: Spark Core, Spark Submit, Windows
> Affects Versions: 3.0.1
> Reporter: Lucian Timar
> Assignee: Apache Spark
> Priority: Major
>
> Whenever I try to use jars or files whose paths contain spaces, a FileNotFoundException is thrown.
> Just run a spark-shell command with a space in one of the paths, as below:
> {code:java}
> c:>spark-shell --files "c:\Program Files\...\myjar.jar"
> or
> c:>spark-shell --jars "c:\Program Files\...\myjar.jar"
> or
> c:>spark-shell --conf spark.jars="c:\Program Files\...\myjar.jar"
> any combination produces the same exception:
> java.io.FileNotFoundException: Jar c:\Program%20Files\........ not found
> at org.apache.spark.SparkContext.addLocalJarFile$1(SparkContext.scala:1833)
> at org.apache.spark.SparkContext.addJar(SparkContext.scala:1887)
> at org.apache.spark.SparkContext.$anonfun$new$11(SparkContext.scala:490)
> at org.apache.spark.SparkContext.$anonfun$new$11$adapted(SparkContext.scala:490)
> at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
> at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
> at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
> at org.apache.spark.SparkContext.<init>(SparkContext.scala:490)
> at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2574)
> at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:934)
> at scala.Option.getOrElse(Option.scala:189)
> at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:928)
> at org.apache.spark.repl.Main$.createSparkSession(Main.scala:106)
> {code}
> I have noticed that the following code in SparkContext.addJar is causing the issue:
> {code:java}
> def addJar(path: String): Unit = {
>   ...
>   } else {
>     val uri = new Path(path).toUri
> {code}
> The path, as a string, is
> "file:///C:/Program%20Files/Nokia/....jar"
> but this call percent-encodes it a second time, generating the URI
> "file:///C:/Program%2520Files/Nokia/....jar"
> (the "%" of "%20" is itself escaped to "%25"), which results in an invalid file name.
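[Editor's note, not part of the original report: the double encoding can be reproduced with plain java.net.URI, which Hadoop's Path uses internally to build its URI. This is a minimal sketch reusing the placeholder jar name from the commands above.]

```scala
import java.net.URI

// The multi-argument java.net.URI constructors always quote the '%'
// character, so a path that is already percent-encoded gets encoded
// a second time.
val alreadyEncoded = "/C:/Program%20Files/myjar.jar"
val doubleEncoded = new URI("file", null, alreadyEncoded, null).toString
// "%20" has become "%2520": after decoding, the file name contains a
// literal "%20" and no longer matches the file on disk.
println(doubleEncoded)
```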
> Using
> {code:java}
> val uri = Utils.resolveURI(path){code}
> seems to resolve the issue.
> Until a fix is provided, are there any workarounds to overcome this issue?
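[Editor's note, not part of the original report: as far as one can tell from the Spark source, Utils.resolveURI parses a string that already carries a scheme with the single-argument java.net.URI constructor, which is why it avoids the problem. That constructor treats existing "%XX" escapes as already quoted rather than re-quoting the '%', as this sketch shows.]

```scala
import java.net.URI

// The single-argument java.net.URI constructor parses existing "%XX"
// escapes instead of re-quoting '%', so the path decodes back to the
// real file name on disk.
val parsed = new URI("file:///C:/Program%20Files/myjar.jar")
println(parsed.getPath)
```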
>
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org