Posted to issues@spark.apache.org by "WuZeyi (Jira)" <ji...@apache.org> on 2019/12/09 03:17:00 UTC

[jira] [Commented] (SPARK-29553) This problem is about using native BLAS to improve ML/MLlib performance

    [ https://issues.apache.org/jira/browse/SPARK-29553?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16991098#comment-16991098 ] 

WuZeyi commented on SPARK-29553:
--------------------------------

[~srowen] Sir, I used the command "export OPENBLAS_NUM_THREADS=1" to set the environment variable at the OS level on the machine where my executor is launched, but it does not take effect either.

I propose a concrete change to the docs, PTAL.

https://github.com/apache/spark/pull/26801
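
For anyone hitting the same issue, here is a minimal sketch of the two usual ways to pass the variable to executors through the documented spark.executorEnv.* properties (the value of 1 and the OPENBLAS_NUM_THREADS/MKL_NUM_THREADS variables follow this ticket; adjust for your own BLAS setup):

    # In conf/spark-defaults.conf on the submitting machine
    spark.executorEnv.OPENBLAS_NUM_THREADS  1
    spark.executorEnv.MKL_NUM_THREADS       1

    # Or per application on the spark-submit command line
    spark-submit \
      --conf spark.executorEnv.OPENBLAS_NUM_THREADS=1 \
      --conf spark.executorEnv.MKL_NUM_THREADS=1 \
      ...

Setting the variables this way propagates them into each executor's environment, whereas an export in a shell on a worker node is not necessarily inherited by the containers that YARN launches.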

> This problem is about using native BLAS to improve ML/MLlib performance
> ------------------------------------------------------------------------
>
>                 Key: SPARK-29553
>                 URL: https://issues.apache.org/jira/browse/SPARK-29553
>             Project: Spark
>          Issue Type: Improvement
>          Components: ML, MLlib
>    Affects Versions: 2.3.0, 2.4.4
>            Reporter: WuZeyi
>            Priority: Minor
>              Labels: performance
>         Attachments: image-2019-11-19-16-11-43-130.png, image-2019-11-19-16-13-30-723.png, image-2019-11-21-17-08-15-797.png
>
>
> I use native BLAS to improve ML/MLlib performance on YARN.
> The spark-env.sh template, which was modified by SPARK-21305, says that I should set OPENBLAS_NUM_THREADS=1 to disable multi-threading of OpenBLAS, but it does not take effect.
> I modified spark.conf to set spark.executorEnv.OPENBLAS_NUM_THREADS=1, and the performance improved.
> I think the same applies to MKL_NUM_THREADS.


