Posted to issues@spark.apache.org by "Josh Rosen (JIRA)" <ji...@apache.org> on 2015/04/07 08:38:12 UTC

[jira] [Resolved] (SPARK-6716) Change SparkContext.DRIVER_IDENTIFIER from '<driver>' to 'driver'

     [ https://issues.apache.org/jira/browse/SPARK-6716?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Josh Rosen resolved SPARK-6716.
-------------------------------
       Resolution: Fixed
    Fix Version/s: 1.4.0

Issue resolved by pull request 5372
[https://github.com/apache/spark/pull/5372]

> Change SparkContext.DRIVER_IDENTIFIER from '<driver>' to 'driver'
> -----------------------------------------------------------------
>
>                 Key: SPARK-6716
>                 URL: https://issues.apache.org/jira/browse/SPARK-6716
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>            Reporter: Josh Rosen
>            Assignee: Josh Rosen
>             Fix For: 1.4.0
>
>
> Currently, the driver's executorId is set to {{<driver>}}.  This choice of ID was present in older Spark versions, but it has started to cause problems now that executorIds are used in more contexts, such as Ganglia metric names or driver thread-dump links in the web UI.  The angle brackets must be escaped when embedding this ID in XML or as part of URLs, and this has led to multiple problems:
> - https://issues.apache.org/jira/browse/SPARK-6484
> - https://issues.apache.org/jira/browse/SPARK-4313
> The simplest solution seems to be to change this ID to something that does not contain any special characters, such as {{driver}}.
> I'm not sure whether we can perform this change in a patch release, since this ID may be considered a stable API by metrics users, but it's probably okay to do this in a major release as long as we document it in the release notes.
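
The escaping problem described above can be illustrated with a minimal sketch (not taken from the patch itself; the `quote` call here just demonstrates why the old ID needs percent-encoding in URLs while the new one does not):

```python
from urllib.parse import quote

# The old driver executorId contains angle brackets, which are not
# URL-safe and must be percent-encoded when embedded in links such as
# the web UI's thread-dump URLs.
old_id = "<driver>"   # value before SPARK-6716
new_id = "driver"     # value after SPARK-6716

print(quote(old_id))  # %3Cdriver%3E -- needs escaping everywhere
print(quote(new_id))  # driver -- passes through unchanged
```

The same brackets would also need escaping as `&lt;`/`&gt;` when the ID appears in XML, e.g. in Ganglia metric configuration.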



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org