Posted to commits@spark.apache.org by gu...@apache.org on 2021/01/20 00:40:20 UTC
[spark] branch branch-3.0 updated: [SPARK-34115][CORE] Check SPARK_TESTING as lazy val to avoid slowdown
This is an automated email from the ASF dual-hosted git repository.
gurwls223 pushed a commit to branch branch-3.0
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/branch-3.0 by this push:
new b5b1da9 [SPARK-34115][CORE] Check SPARK_TESTING as lazy val to avoid slowdown
b5b1da9 is described below
commit b5b1da961e65227c1fbc59544390f31def1d15f1
Author: Norbert Schultz <no...@reactivecore.de>
AuthorDate: Wed Jan 20 09:39:13 2021 +0900
[SPARK-34115][CORE] Check SPARK_TESTING as lazy val to avoid slowdown
### What changes were proposed in this pull request?
Check SPARK_TESTING as a lazy val to avoid a slowdown when there are many environment variables.
### Why are the changes needed?
If there are many environment variables, sys.env is very slow. Since Utils.isTesting is called very often during DataFrame optimization, this can slow down query evaluation significantly.
An example that triggers the problem can be found in the bug ticket: https://issues.apache.org/jira/browse/SPARK-34115
### Does this PR introduce _any_ user-facing change?
No
### How was this patch tested?
With the example provided in the ticket.
Closes #31244 from nob13/bug/34115.
Lead-authored-by: Norbert Schultz <no...@reactivecore.de>
Co-authored-by: Norbert Schultz <no...@gmail.com>
Signed-off-by: HyukjinKwon <gu...@apache.org>
(cherry picked from commit c3d8352ca1a59ce5cc37840919c0e799f5150efa)
Signed-off-by: HyukjinKwon <gu...@apache.org>
---
core/src/main/scala/org/apache/spark/util/Utils.scala | 4 +++-
1 file changed, 3 insertions(+), 1 deletion(-)
diff --git a/core/src/main/scala/org/apache/spark/util/Utils.scala b/core/src/main/scala/org/apache/spark/util/Utils.scala
index c7db212..867cd19 100644
--- a/core/src/main/scala/org/apache/spark/util/Utils.scala
+++ b/core/src/main/scala/org/apache/spark/util/Utils.scala
@@ -1879,7 +1879,9 @@ private[spark] object Utils extends Logging {
* Indicates whether Spark is currently running unit tests.
*/
def isTesting: Boolean = {
- sys.env.contains("SPARK_TESTING") || sys.props.contains(IS_TESTING.key)
+ // Scala's `sys.env` creates a ton of garbage by constructing Scala immutable maps, so
+ // we directly use the Java APIs instead.
+ System.getenv("SPARK_TESTING") != null || System.getProperty(IS_TESTING.key) != null
}
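The trade-off the patch exploits can be sketched standalone. The following is a minimal illustration (not Spark code; the object and method names here are hypothetical): Scala's `sys.env` converts the whole process environment into an immutable Scala Map on access, whereas `System.getenv(name)` is a single lookup into the JVM's existing environment map with no extra allocation. Both checks must always agree on whether a variable is set.

```scala
// Sketch contrasting the two lookup styles swapped by the patch.
object IsTestingSketch {
  private val Key = "SPARK_TESTING"

  // Old style: sys.env wraps/converts the full environment into a
  // Scala immutable Map before the contains() lookup runs.
  def viaSysEnv: Boolean = sys.env.contains(Key)

  // New style: direct Java API, one hash lookup, no map construction.
  def viaGetenv: Boolean = System.getenv(Key) != null

  def main(args: Array[String]): Unit = {
    // The two checks are semantically equivalent, set or unset.
    assert(viaSysEnv == viaGetenv)
    println(s"$Key set: $viaGetenv")
  }
}
```

On a hot path called once per optimizer rule invocation, avoiding the per-call map construction is what removes the slowdown reported in SPARK-34115; caching the result in a `lazy val` (as the title suggests) would go further still, at the cost of not observing later changes to the system property.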
/**
---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org