Posted to commits@spark.apache.org by do...@apache.org on 2023/06/26 05:58:01 UTC
[spark] branch master updated: [SPARK-44153][CORE][UI][FOLLOWUP] Use JAVA_HOME and stderr
This is an automated email from the ASF dual-hosted git repository.
dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
new 646388ee5f3 [SPARK-44153][CORE][UI][FOLLOWUP] Use JAVA_HOME and stderr
646388ee5f3 is described below
commit 646388ee5f3fb4da9f92586ff39352a0eac9c839
Author: Dongjoon Hyun <do...@apache.org>
AuthorDate: Sun Jun 25 22:57:50 2023 -0700
[SPARK-44153][CORE][UI][FOLLOWUP] Use JAVA_HOME and stderr
### What changes were proposed in this pull request?
This is a follow-up of https://github.com/apache/spark/pull/41709 to address the review comments.
### Why are the changes needed?
1. Use the `JAVA_HOME`-prefixed `jmap` so that the `jmap` binary matches the version of the running JVM.
2. Reuse the existing stderr instead of merging `stderr` into `stdout` via `redirectErrorStream`.
3. Use `tryWithResource` so the reader is always closed.
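The three points above can be sketched in isolation. This is a minimal, self-contained approximation, not Spark's actual `Utils` object: `tryWithResource` here is a stand-in for Spark's helper of the same name, and `readNonEmptyLines` mirrors the patched read loop against an in-memory reader instead of a real `jmap` process.

```scala
import java.io.{BufferedReader, StringReader}
import java.nio.file.Paths
import scala.collection.mutable.ArrayBuffer

object HistogramSketch {
  // Stand-in for Spark's Utils.tryWithResource: run `f` on the resource
  // and close it even if `f` throws.
  def tryWithResource[R <: AutoCloseable, T](createResource: => R)(f: R => T): T = {
    val resource = createResource
    try f(resource) finally resource.close()
  }

  // Point 1: resolve jmap under java.home so the tool version matches the JVM.
  def jmapPath: String =
    Paths.get(System.getProperty("java.home"), "bin", "jmap").toString

  // Points 2 and 3: read only stdout (stderr is left alone) and close the
  // reader via tryWithResource. This mirrors the loop in the patch.
  def readNonEmptyLines(reader: BufferedReader): Array[String] = {
    val rows = ArrayBuffer.empty[String]
    tryWithResource(reader) { r =>
      var line = ""
      while (line != null) {
        if (line.nonEmpty) rows += line
        line = r.readLine()
      }
    }
    rows.toArray
  }

  def main(args: Array[String]): Unit = {
    val lines = readNonEmptyLines(new BufferedReader(new StringReader("a\n\nb\n")))
    println(lines.mkString(","))  // a,b
  }
}
```

In the real code the reader wraps `p.getInputStream()` from the `ProcessBuilder`; because `redirectErrorStream` is no longer set, that stream carries only `jmap`'s stdout, and its stderr goes to the parent's stderr as before.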
### Does this PR introduce _any_ user-facing change?
No.
### How was this patch tested?
Manual review.
Closes #41731 from dongjoon-hyun/SPARK-44153-2.
Authored-by: Dongjoon Hyun <do...@apache.org>
Signed-off-by: Dongjoon Hyun <do...@apache.org>
---
core/src/main/scala/org/apache/spark/util/Utils.scala | 15 ++++++++-------
1 file changed, 8 insertions(+), 7 deletions(-)
diff --git a/core/src/main/scala/org/apache/spark/util/Utils.scala b/core/src/main/scala/org/apache/spark/util/Utils.scala
index ee74eacb84f..ada0cffd2b0 100644
--- a/core/src/main/scala/org/apache/spark/util/Utils.scala
+++ b/core/src/main/scala/org/apache/spark/util/Utils.scala
@@ -2291,15 +2291,16 @@ private[spark] object Utils extends Logging with SparkClassUtils {
   def getHeapHistogram(): Array[String] = {
     // From Java 9+, we can use 'ProcessHandle.current().pid()'
     val pid = getProcessName().split("@").head
-    val builder = new ProcessBuilder("jmap", "-histo:live", pid)
-    builder.redirectErrorStream(true)
+    val jmap = System.getProperty("java.home") + "/bin/jmap"
+    val builder = new ProcessBuilder(jmap, "-histo:live", pid)
     val p = builder.start()
-    val r = new BufferedReader(new InputStreamReader(p.getInputStream()))
     val rows = ArrayBuffer.empty[String]
-    var line = ""
-    while (line != null) {
-      if (line.nonEmpty) rows += line
-      line = r.readLine()
+    Utils.tryWithResource(new BufferedReader(new InputStreamReader(p.getInputStream()))) { r =>
+      var line = ""
+      while (line != null) {
+        if (line.nonEmpty) rows += line
+        line = r.readLine()
+      }
     }
     rows.toArray
   }
---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org