Posted to issues@spark.apache.org by "Thomas Graves (Jira)" <ji...@apache.org> on 2020/12/08 20:27:00 UTC

[jira] [Created] (SPARK-33715) Error about cores being limiting resource confusing

Thomas Graves created SPARK-33715:
-------------------------------------

             Summary: Error about cores being limiting resource confusing
                 Key: SPARK-33715
                 URL: https://issues.apache.org/jira/browse/SPARK-33715
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 3.0.1
            Reporter: Thomas Graves


If you misconfigure your resources so that cores are not the limiting resource, we print the following error:

20/12/08 20:23:37 ERROR Main: Failed to initialize Spark session.
java.lang.IllegalArgumentException: The number of slots on an executor has to be limited by the number of cores, otherwise you waste resources and dynamic allocation doesn't work properly. Your configuration has core/task cpu slots = 8 and gpu = 1. Please adjust your configuration so that all resources require same number of executor slots.

I received reports that this was confusing to users, specifically the sentence "Your configuration has core/task cpu slots = 8 and gpu = 1", so I think we can improve that message.
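
For context, the numbers in that sentence come from how many concurrent task slots each resource allows per executor (the executor amount divided by the per-task amount). As a rough sketch of the arithmetic for the reproduce command below:

    cpu slots per executor = executor cores / spark.task.cpus = 8 / 1 = 8
    gpu slots per executor = spark.executor.resource.gpu.amount / spark.task.resource.gpu.amount = 1 / 1 = 1

Because the gpu only allows 1 concurrent task while the cores allow 8, cores is not the limiting resource and 7 of the 8 cores per executor would sit idle, which is what the check is guarding against.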

Note this only affects 3.0.0 and 3.0.1; 3.1.0 changed this functionality.

To reproduce, just run Spark 3.0.1 with something like:

$SPARK_HOME/bin/spark-shell --master yarn --executor-cores 8 --conf spark.executor.resource.gpu.amount=1 --conf spark.task.resource.gpu.amount=1
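
Purely as an illustration (assuming the same cluster setup), one way to make the slot counts line up is to raise the executor gpu amount so it also yields 8 slots:

$SPARK_HOME/bin/spark-shell --master yarn --executor-cores 8 --conf spark.executor.resource.gpu.amount=8 --conf spark.task.resource.gpu.amount=1

Alternatively, setting spark.task.cpus=8 (or --executor-cores 1) makes both resources allow a single task per executor.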


