Posted to issues@spark.apache.org by "Thomas Graves (Jira)" <ji...@apache.org> on 2019/09/20 16:13:00 UTC

[jira] [Updated] (SPARK-29151) Support fraction resources for task resource scheduling

     [ https://issues.apache.org/jira/browse/SPARK-29151?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Thomas Graves updated SPARK-29151:
----------------------------------
    Summary: Support fraction resources for task resource scheduling  (was: Support fraction resources for resource scheduling)

> Support fraction resources for task resource scheduling
> -------------------------------------------------------
>
>                 Key: SPARK-29151
>                 URL: https://issues.apache.org/jira/browse/SPARK-29151
>             Project: Spark
>          Issue Type: Story
>          Components: Scheduler
>    Affects Versions: 3.0.0
>            Reporter: Thomas Graves
>            Priority: Major
>
> The current resource scheduling code for GPU/FPGA, etc. only supports integer amounts, so you can only schedule whole resources.  There are cases where you may want to share a resource and schedule multiple tasks to run on the same resource (e.g. a GPU).  It would be nice to support fractional resources, so that a task could, for instance, request 1/4 of a GPU.  I think we only want to support fractional amounts when the amount is < 1.  Otherwise you run into requests like 2 1/8 GPUs, which doesn't really make sense to me and makes assigning addresses very complicated.
> We need to think about implementation details; for instance, using a float can be troublesome here due to floating point math precision issues (a rough sketch of one workaround follows below).
> Another thing to consider, depending on the implementation, is limiting the precision - e.g. only going down to tenths, hundredths, thousandths, etc.
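> A rough, hypothetical sketch (not an actual Spark API; names are illustrative) of one way to sidestep the floating point issue: convert the fractional amount, e.g. as it would arrive via a spark.task.resource.gpu.amount style config, into a whole number of task "slots" per resource address, so the scheduler keeps doing integer bookkeeping:
> {code:scala}
> object FractionalResourceSketch {
>   // Hypothetical helper: turn a fractional task amount (0 < amount <= 1)
>   // into the number of tasks that may share one resource address.
>   def slotsPerAddress(taskAmount: Double): Int = {
>     require(taskAmount > 0.0 && taskAmount <= 1.0,
>       s"fractional amounts only make sense in (0, 1], got $taskAmount")
>     // Round rather than truncate so 0.1 (1 / 0.1 = 9.999...) still maps
>     // to 10 slots despite floating point representation error.
>     math.round(1.0 / taskAmount).toInt
>   }
>
>   def main(args: Array[String]): Unit = {
>     println(slotsPerAddress(0.25)) // 4 tasks can share one GPU address
>     println(slotsPerAddress(0.1))  // 10, not 9
>   }
> }
> {code}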



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org