Posted to dev@zeppelin.apache.org by GitBox <gi...@apache.org> on 2021/06/29 03:55:14 UTC

[GitHub] [zeppelin] zjffdu commented on a change in pull request #4147: [ZEPPELIN-5417] Unable to set conda env in pyspark

zjffdu commented on a change in pull request #4147:
URL: https://github.com/apache/zeppelin/pull/4147#discussion_r660261687



##########
File path: spark/interpreter/src/main/java/org/apache/zeppelin/spark/PySparkInterpreter.java
##########
@@ -171,11 +171,12 @@ public void setInterpreterContextInPython() {
   // spark.pyspark.driver.python > spark.pyspark.python > PYSPARK_DRIVER_PYTHON > PYSPARK_PYTHON
   @Override
   protected String getPythonExec() {
-    if (!StringUtils.isBlank(getProperty("spark.pyspark.driver.python", ""))) {
-      return properties.getProperty("spark.pyspark.driver.python");
+    SparkConf sparkConf = getSparkConf();
+    if (StringUtils.isNotBlank(sparkConf.get("spark.pyspark.driver.python", ""))) {
+      return sparkConf.get("spark.pyspark.driver.python");
     }
-    if (!StringUtils.isBlank(getProperty("spark.pyspark.python", ""))) {
-      return properties.getProperty("spark.pyspark.python");
+    if (StringUtils.isNotBlank(sparkConf.get("spark.pyspark.python", ""))) {
+      return sparkConf.get("spark.pyspark.python");
     }
     if (System.getenv("PYSPARK_PYTHON") != null) {
       return System.getenv("PYSPARK_PYTHON");

Review comment:
       Thanks for the careful review. This was indeed a bug; it is fixed now.
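       For readers following along, below is a minimal, self-contained sketch (not the actual Zeppelin code) of the resolution order described in the comment above the hunk: spark.pyspark.driver.python > spark.pyspark.python > PYSPARK_DRIVER_PYTHON > PYSPARK_PYTHON. The Map-based config lookup, the class and method names, and the final "python" fallback are assumptions for illustration only.

       import java.util.Map;

       public class PythonExecResolver {

         // Returns the first non-blank candidate in precedence order,
         // falling back to "python" if nothing is configured.
         public static String resolve(Map<String, String> sparkConf, Map<String, String> env) {
           String[] candidates = {
               sparkConf.get("spark.pyspark.driver.python"),
               sparkConf.get("spark.pyspark.python"),
               env.get("PYSPARK_DRIVER_PYTHON"),
               env.get("PYSPARK_PYTHON")
           };
           for (String candidate : candidates) {
             if (candidate != null && !candidate.trim().isEmpty()) {
               return candidate;
             }
           }
           return "python";
         }

         public static void main(String[] args) {
           // Hypothetical conda environment path used purely as an example value.
           Map<String, String> conf = Map.of("spark.pyspark.python", "/opt/conda/envs/py38/bin/python");
           // Prints the configured conda python path, since the Spark conf entry
           // takes precedence over the environment variables.
           System.out.println(resolve(conf, System.getenv()));
         }
       }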



