Posted to commits@spark.apache.org by sr...@apache.org on 2022/06/02 12:44:30 UTC
[spark] branch master updated: [SPARK-38807][CORE] Fix the startup error of spark shell on Windows
This is an automated email from the ASF dual-hosted git repository.
srowen pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
new a760975083e [SPARK-38807][CORE] Fix the startup error of spark shell on Windows
a760975083e is described below
commit a760975083ea0696e8fd834ecfe3fb877b7f7449
Author: Ming Li <11...@qq.com>
AuthorDate: Thu Jun 2 07:44:17 2022 -0500
[SPARK-38807][CORE] Fix the startup error of spark shell on Windows
### What changes were proposed in this pull request?
The File.getCanonicalPath method returns a path with a drive letter on Windows. The RpcEnvFileServer.validateDirectoryUri method uses File.getCanonicalPath to process the baseUri, which causes the baseUri to violate the URI validation rules. For example, `/classes` is processed into `F:\classes`. This causes the SparkContext to fail to start on Windows.
This PR modifies the RpcEnvFileServer.validateDirectoryUri method, replacing `new File(baseUri).getCanonicalPath` with `new URI(baseUri).normalize().getPath`, which works correctly on Windows.
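The difference between the two calls can be sketched as follows. This is a minimal illustration, not Spark code; the printed canonical path is platform-dependent (the drive letter shown in the comment is just an example):

```scala
import java.io.File
import java.net.URI

object UriNormalizeDemo {
  def main(args: Array[String]): Unit = {
    val baseUri = "/classes"

    // File.getCanonicalPath resolves against the local filesystem.
    // On Windows this prepends the current drive, e.g. "F:\classes",
    // which is not a valid URI path component.
    println(new File(baseUri).getCanonicalPath)

    // URI.normalize().getPath only normalizes the path syntactically
    // (collapsing "." and ".." segments), so it stays "/classes"
    // on every platform.
    println(new URI(baseUri).normalize().getPath)
  }
}
```

Note that `normalize()` still handles relative segments, so `new URI("/a/../classes").normalize().getPath` also yields `/classes`.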
### Why are the changes needed?
Fixes the startup error of the Spark shell on Windows.
[[SPARK-35691](https://issues.apache.org/jira/browse/SPARK-35691)] introduced this regression.
### Does this PR introduce any user-facing change?
No
### How was this patch tested?
CI
Closes #36447 from 1104056452/master.
Lead-authored-by: Ming Li <11...@qq.com>
Co-authored-by: ming li <11...@qq.com>
Signed-off-by: Sean Owen <sr...@gmail.com>
---
core/src/main/scala/org/apache/spark/rpc/RpcEnv.scala | 3 ++-
1 file changed, 2 insertions(+), 1 deletion(-)
diff --git a/core/src/main/scala/org/apache/spark/rpc/RpcEnv.scala b/core/src/main/scala/org/apache/spark/rpc/RpcEnv.scala
index bf19190c021..82d3a28894b 100644
--- a/core/src/main/scala/org/apache/spark/rpc/RpcEnv.scala
+++ b/core/src/main/scala/org/apache/spark/rpc/RpcEnv.scala
@@ -18,6 +18,7 @@
package org.apache.spark.rpc
import java.io.File
+import java.net.URI
import java.nio.channels.ReadableByteChannel
import scala.concurrent.Future
@@ -187,7 +188,7 @@ private[spark] trait RpcEnvFileServer {
/** Validates and normalizes the base URI for directories. */
protected def validateDirectoryUri(baseUri: String): String = {
- val baseCanonicalUri = new File(baseUri).getCanonicalPath
+ val baseCanonicalUri = new URI(baseUri).normalize().getPath
val fixedBaseUri = "/" + baseCanonicalUri.stripPrefix("/").stripSuffix("/")
require(fixedBaseUri != "/files" && fixedBaseUri != "/jars",
"Directory URI cannot be /files nor /jars.")
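The patched validation logic can be exercised in isolation with a small standalone sketch. This is adapted from the diff above, not the actual `RpcEnvFileServer` trait, and the assumption that the method returns the fixed URI follows from its "validates and normalizes" doc comment:

```scala
import java.net.URI

object ValidateDirectoryUriSketch {
  // Standalone copy of the fixed validation logic for illustration.
  def validateDirectoryUri(baseUri: String): String = {
    val baseCanonicalUri = new URI(baseUri).normalize().getPath
    // Ensure exactly one leading slash and no trailing slash.
    val fixedBaseUri = "/" + baseCanonicalUri.stripPrefix("/").stripSuffix("/")
    require(fixedBaseUri != "/files" && fixedBaseUri != "/jars",
      "Directory URI cannot be /files nor /jars.")
    fixedBaseUri
  }

  def main(args: Array[String]): Unit = {
    println(validateDirectoryUri("/classes"))   // "/classes"
    println(validateDirectoryUri("classes/"))   // "/classes"
  }
}
```

With the old `new File(baseUri).getCanonicalPath` version, the same call on Windows would produce a drive-letter path and fail the downstream URI checks, which is the regression this commit removes.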