Posted to issues@spark.apache.org by "Igor Calabria (Jira)" <ji...@apache.org> on 2022/01/27 14:53:00 UTC

[jira] [Created] (SPARK-38044) Spark dynamic allocation ignores pending jobs

Igor Calabria created SPARK-38044:
-------------------------------------

             Summary: Spark dynamic allocation ignores pending jobs
                 Key: SPARK-38044
                 URL: https://issues.apache.org/jira/browse/SPARK-38044
             Project: Spark
          Issue Type: Bug
          Components: Kubernetes
    Affects Versions: 3.2.0
            Reporter: Igor Calabria


When running Spark with dynamic allocation on Kubernetes, everything seems to work fine for a single job: if there are pending tasks, executors are requested according to the configuration as expected. The problem happens when there are pending jobs; whatever is queued for execution appears to be ignored by the dynamic allocation system. We're hitting this corner case because, in our situation, several jobs with just one (or just a few) tasks are submitted in parallel. The behavior we're seeing is that only the first executor is added, regardless of how many pending tasks there are in other jobs in the same context. I'm strapped for time right now, but please let me know if you need some code to reproduce this or if the explanation isn't clear enough.
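
For reference, a minimal driver along the following lines should exercise the scenario described above. This is only a sketch based on the description in the report; the object name, configuration values, executor limits, and sleep duration are illustrative assumptions, not taken from the ticket.

    import java.util.concurrent.Executors

    import scala.concurrent.{Await, ExecutionContext, Future}
    import scala.concurrent.duration.Duration

    import org.apache.spark.sql.SparkSession

    // Submits ten single-task jobs in parallel on a dynamic-allocation-enabled
    // context, so you can observe how many executors are actually requested.
    object ParallelSmallJobsRepro {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("dynamic-allocation-pending-jobs-repro")
          // Dynamic allocation with shuffle tracking (no external shuffle
          // service on Kubernetes); the limits below are illustrative.
          .config("spark.dynamicAllocation.enabled", "true")
          .config("spark.dynamicAllocation.shuffleTracking.enabled", "true")
          .config("spark.dynamicAllocation.minExecutors", "0")
          .config("spark.dynamicAllocation.maxExecutors", "20")
          .getOrCreate()
        val sc = spark.sparkContext

        // A dedicated thread pool so all ten actions are submitted concurrently
        // from the driver, i.e. ten independent jobs are pending at once.
        implicit val ec: ExecutionContext =
          ExecutionContext.fromExecutorService(Executors.newFixedThreadPool(10))

        val jobs = (1 to 10).map { i =>
          Future {
            // One partition per job, and each task sleeps long enough that the
            // only way to run the jobs concurrently is to add more executors.
            sc.parallelize(Seq(i), numSlices = 1)
              .map { x => Thread.sleep(60000L); x }
              .count()
          }
        }
        Await.result(Future.sequence(jobs), Duration.Inf)
        spark.stop()
      }
    }

With a setup like this, one would expect dynamic allocation to eventually request enough executors to run all ten pending tasks; the behavior described in the report is that only the first executor is added.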



--
This message was sent by Atlassian Jira
(v8.20.1#820001)
