Posted to reviews@spark.apache.org by cloud-fan <gi...@git.apache.org> on 2017/10/06 01:54:46 UTC
[GitHub] spark pull request #17357: [SPARK-20025][CORE] Ignore SPARK_LOCAL* env, whil...
Github user cloud-fan commented on a diff in the pull request:
https://github.com/apache/spark/pull/17357#discussion_r143096475
--- Diff: core/src/main/scala/org/apache/spark/deploy/worker/DriverWrapper.scala ---
@@ -23,14 +23,15 @@ import org.apache.commons.lang3.StringUtils
import org.apache.spark.{SecurityManager, SparkConf}
import org.apache.spark.deploy.{DependencyUtils, SparkHadoopUtil, SparkSubmit}
+import org.apache.spark.internal.Logging
import org.apache.spark.rpc.RpcEnv
import org.apache.spark.util.{ChildFirstURLClassLoader, MutableURLClassLoader, Utils}
/**
* Utility object for launching driver programs such that they share fate with the Worker process.
* This is used in standalone cluster mode only.
*/
-object DriverWrapper {
+object DriverWrapper extends Logging {
--- End diff --
This `DriverWrapper` actually runs in the same JVM as the driver, so extending `Logging` here initializes the log4j instance earlier than before. Will this be a problem?
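
A minimal, self-contained sketch of the ordering concern (not Spark's actual code; class names, the log message, and the use of `java.util.logging` as a stand-in for log4j are all illustrative assumptions): once the wrapper object mixes in a Logging-style trait and logs, the logging backend is initialized before the user's driver main class gets a chance to configure it.

    import java.util.logging.Logger  // stand-in for log4j to keep the sketch dependency-free

    trait SketchLogging {
      // Lazily created on first use, mirroring how the first log call
      // triggers initialization of the logging backend.
      @transient private lazy val log: Logger = Logger.getLogger(getClass.getName)
      protected def logInfo(msg: => String): Unit = log.info(msg)
    }

    object SketchDriverWrapper extends SketchLogging {
      def main(args: Array[String]): Unit = {
        // Logging here initializes the backend *before* the user's driver code runs.
        logInfo("Preparing user jars")  // hypothetical message

        // The user's main class would then be loaded and invoked reflectively;
        // any logging configuration it performs comes second.
        val userMainClass = "com.example.UserDriver"  // hypothetical user class
        // Class.forName(userMainClass).getMethod("main", classOf[Array[String]]).invoke(null, args)
      }
    }

Whether this matters in practice depends on whether user drivers rely on configuring log4j themselves before any framework code touches it.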
---