Posted to reviews@spark.apache.org by "jzhuge (via GitHub)" <gi...@apache.org> on 2023/03/01 02:20:22 UTC

[GitHub] [spark] jzhuge commented on a diff in pull request #40212: [SPARK-42613][CORE][PYTHON][YARN] PythonRunner should set OMP_NUM_THREADS to task cpus instead of executor cores by default

jzhuge commented on code in PR #40212:
URL: https://github.com/apache/spark/pull/40212#discussion_r1121037803


##########
core/src/main/scala/org/apache/spark/api/python/PythonRunner.scala:
##########
@@ -140,7 +140,7 @@ private[spark] abstract class BasePythonRunner[IN, OUT](
       // SPARK-28843: limit the OpenMP thread pool to the number of cores assigned to this executor
       // this avoids high memory consumption with pandas/numpy because of a large OpenMP thread pool
       // see https://github.com/numpy/numpy/issues/10455
-      execCoresProp.foreach(envVars.put("OMP_NUM_THREADS", _))
+      envVars.put("OMP_NUM_THREADS", conf.get("spark.task.cpus", "1"))

Review Comment:
   Thanks for the context!
   Good to go for me.
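
For context, the one-line change in the diff above swaps the source of `OMP_NUM_THREADS`: instead of the executor-core count, it now reads `spark.task.cpus` (defaulting to `"1"`). A hypothetical Python sketch of that fallback logic (not Spark's actual code; the helper name is illustrative):

```python
# Hypothetical sketch of the fallback logic from the diff above.
# After SPARK-42613, OMP_NUM_THREADS is taken from spark.task.cpus
# (default "1") rather than from the executor-core count.
def omp_num_threads(conf: dict) -> str:
    # conf.get mirrors Scala's conf.get("spark.task.cpus", "1")
    return conf.get("spark.task.cpus", "1")

# An executor with 8 cores but the default 1 CPU per task now gives each
# Python worker 1 OpenMP thread instead of 8, shrinking the thread pool
# that pandas/numpy spin up (see numpy issue 10455 referenced above).
print(omp_num_threads({"spark.executor.cores": "8"}))  # 1
print(omp_num_threads({"spark.task.cpus": "2"}))       # 2
```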



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
For additional commands, e-mail: reviews-help@spark.apache.org