Posted to reviews@spark.apache.org by "WeichenXu123 (via GitHub)" <gi...@apache.org> on 2023/02/28 09:18:13 UTC

[GitHub] [spark] WeichenXu123 commented on a diff in pull request #40212: [SPARK-42613][CORE][PYTHON][YARN] PythonRunner should set OMP_NUM_THREADS to task cpus instead of executor cores by default

WeichenXu123 commented on code in PR #40212:
URL: https://github.com/apache/spark/pull/40212#discussion_r1119771653


##########
core/src/main/scala/org/apache/spark/api/python/PythonRunner.scala:
##########
@@ -140,7 +140,7 @@ private[spark] abstract class BasePythonRunner[IN, OUT](
       // SPARK-28843: limit the OpenMP thread pool to the number of cores assigned to this executor
       // this avoids high memory consumption with pandas/numpy because of a large OpenMP thread pool
       // see https://github.com/numpy/numpy/issues/10455
-      execCoresProp.foreach(envVars.put("OMP_NUM_THREADS", _))
+      envVars.put("OMP_NUM_THREADS", conf.get("spark.task.cpus", "1"))

Review Comment:
   I think we should remove the code here,
   because we already set this at https://github.com/apache/spark/blob/74410ca2f1318177e558f1e719e0cac0f0196807/core/src/main/scala/org/apache/spark/SparkContext.scala#L551
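   For context, here is a minimal sketch of the kind of SparkContext-level default the comment refers to. It is an illustration only: the exact condition, field names, and surrounding code at the linked SparkContext.scala line are assumptions, not a quote of that line.
   
   ```scala
   import org.apache.spark.SparkConf
   import scala.collection.mutable
   
   // Sketch only (assumed shape, not the exact code at the linked line):
   // default OMP_NUM_THREADS from spark.task.cpus on the driver side, so that
   // setting it again per task in PythonRunner would be redundant.
   val conf = new SparkConf()
   val executorEnvs = mutable.HashMap[String, String]()
   if (!conf.contains("spark.executorEnv.OMP_NUM_THREADS")) {
     // Limit OpenMP thread pools (pandas/numpy) to the CPUs allotted per task.
     executorEnvs("OMP_NUM_THREADS") = conf.get("spark.task.cpus", "1")
   }
   ```
   
   If a default along these lines is already applied when executor environments are built, the PythonRunner line in this diff would duplicate it.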



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org


---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org