Posted to issues@spark.apache.org by "Thomas Graves (Jira)" <ji...@apache.org> on 2019/09/05 15:18:00 UTC
[jira] [Commented] (SPARK-27492) GPU scheduling - High level user documentation
[ https://issues.apache.org/jira/browse/SPARK-27492?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16923528#comment-16923528 ]
Thomas Graves commented on SPARK-27492:
---------------------------------------
I think we talked about this a bit in the PRs for the main changes. I think it's better to leave it as "resource", since CPU/memory could really be rolled into this as well; they are only special because they are pre-existing. This could also be extended to any resource type that isn't necessarily an accelerator. And if stage-level scheduling (https://issues.apache.org/jira/browse/SPARK-27495) gets approved, everything will go through a single API there and just be referred to as a resource.
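To illustrate why "resource" is the generic term, here is a sketch of how a user would request GPUs through the generic resource configs in Spark 3.0; the master URL, script path, and application name are placeholders, and "gpu" is just one resource name (the same pattern works for, e.g., "fpga"):

```shell
# Sketch of a spark-submit using the generic resource configs.
# /path/to/getGpusResources.sh and my_app.py are hypothetical placeholders.
spark-submit \
  --master spark://master:7077 \
  --conf spark.executor.resource.gpu.amount=2 \
  --conf spark.task.resource.gpu.amount=1 \
  --conf spark.executor.resource.gpu.discoveryScript=/path/to/getGpusResources.sh \
  my_app.py
```

Nothing in the config names above is accelerator-specific beyond the resource name itself, which is the point of keeping the API generic.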
> GPU scheduling - High level user documentation
> ----------------------------------------------
>
> Key: SPARK-27492
> URL: https://issues.apache.org/jira/browse/SPARK-27492
> Project: Spark
> Issue Type: Story
> Components: Documentation
> Affects Versions: 3.0.0
> Reporter: Thomas Graves
> Assignee: Thomas Graves
> Priority: Major
>
> For the SPIP - Accelerator-aware task scheduling for Spark, https://issues.apache.org/jira/browse/SPARK-24615 - add some high-level user documentation about how the pieces of this feature work together, and point to things like the example discovery script, etc.
>
> - make sure to document the discovery script and what permissions are needed and any security implications
> - Document the standalone / local-cluster mode limitation of supporting only a single resource file or discovery script, so coordination is required for it to work correctly.
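As an editorial aside on the discovery script mentioned above: Spark expects such a script to print a single JSON object naming the resource and listing its addresses. The following is a minimal sketch of that contract, not Spark's bundled example; it assumes the GPU indices are supplied via a hypothetical GPU_INDICES environment variable (defaulting to "0 1"), where a real script would query the hardware, e.g. via nvidia-smi:

```shell
#!/usr/bin/env bash
# Hypothetical GPU discovery script (a sketch of the JSON output contract).
# GPU_INDICES is an illustrative stand-in for real hardware enumeration.
ADDRESSES=""
for i in ${GPU_INDICES:-0 1}; do
  # Build a comma-separated list of quoted address strings.
  ADDRESSES="${ADDRESSES:+$ADDRESSES,}\"$i\""
done
echo "{\"name\": \"gpu\", \"addresses\": [$ADDRESSES]}"
```

Because the script runs with the permissions of the Spark worker/executor process, anything it executes (and anyone able to modify it) inherits those permissions, which is presumably why the security implications are called out above.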
--
This message was sent by Atlassian Jira
(v8.3.2#803003)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org