Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2021/12/15 04:46:35 UTC

[GitHub] [spark] PerilousApricot commented on a change in pull request #34903: [SPARK-37650][PYTHON] Tell spark-env.sh the python interpreter

PerilousApricot commented on a change in pull request #34903:
URL: https://github.com/apache/spark/pull/34903#discussion_r769237821



##########
File path: docs/configuration.md
##########
@@ -3029,7 +3029,10 @@ can be found on the pages for each mode:
 Certain Spark settings can be configured through environment variables, which are read from the
 `conf/spark-env.sh` script in the directory where Spark is installed (or `conf/spark-env.cmd` on
 Windows). In Standalone and Mesos modes, this file can give machine specific information such as
-hostnames. It is also sourced when running local Spark applications or submission scripts.
+hostnames. It is also sourced when running local Spark applications or submission scripts. For
+PySpark applications, the environment variable `_PYSPARK_DRIVER_SYS_EXECUTABLE` will be set to
+the Python interpreter's `sys.executable`, allowing further customization based on the
+user's virtual environment.

Review comment:
done @ e747a2f
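
For context, a minimal sketch of how a `conf/spark-env.sh` might use this
variable once the change lands. It assumes, per the proposed doc text, that
`_PYSPARK_DRIVER_SYS_EXECUTABLE` is exported before the script is sourced;
`PYSPARK_PYTHON` is Spark's existing environment variable for selecting the
Python interpreter, so this simply forwards the driver's interpreter to it:

    # conf/spark-env.sh (illustrative sketch, not the PR's implementation)
    # If the driver exported the interpreter it is running under, reuse
    # that same interpreter (e.g. one inside a virtual environment).
    if [ -n "${_PYSPARK_DRIVER_SYS_EXECUTABLE:-}" ]; then
      export PYSPARK_PYTHON="${_PYSPARK_DRIVER_SYS_EXECUTABLE}"
    fi

Without something like this, executors may fall back to a system Python that
lacks the packages installed in the driver's virtual environment.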




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


