Posted to issues@spark.apache.org by "Sharad (JIRA)" <ji...@apache.org> on 2016/11/30 18:19:58 UTC

[jira] [Commented] (SPARK-13061) Error in Spark REST API application info for job names containing spaces

    [ https://issues.apache.org/jira/browse/SPARK-13061?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15709325#comment-15709325 ] 

Sharad commented on SPARK-13061:
--------------------------------

As Devraj mentioned, the correct URL for accessing job details has the form /applications/[app-id]/jobs/[job-id].
You should not use the application name in the REST call; use the application ID instead.
This issue should be closed.
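
For reference, a minimal sketch of that usage, assuming Python 3 and the proxy base URL from the report quoted below; picking the first listed application is arbitrary and purely illustrative:

    import json
    import urllib.request

    # Base URL taken from the report below; adjust for your cluster.
    BASE = "http://spark.mysite.com:20888/proxy/application_1447676402999_1254/api/v1"

    # 1. List applications and read the opaque "id" field.
    with urllib.request.urlopen(BASE + "/applications/") as resp:
        apps = json.load(resp)
    app_id = apps[0]["id"]  # use the id, never the human-readable "name"

    # 2. Fetch job details via /applications/[app-id]/jobs
    #    (append /[job-id] to address a single job).
    with urllib.request.urlopen(BASE + "/applications/" + app_id + "/jobs") as resp:
        jobs = json.load(resp)
    for job in jobs:
        print(job["jobId"], job["status"])

Application IDs issued by YARN (e.g. application_1447676402999_1254) contain no spaces, so they are safe to place in the URL path.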

> Error in Spark REST API application info for job names containing spaces
> ----------------------------------------------------------------------
>
>                 Key: SPARK-13061
>                 URL: https://issues.apache.org/jira/browse/SPARK-13061
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.5.2
>            Reporter: Avihoo Mamka
>            Priority: Trivial
>              Labels: rest_api, spark
>
> When accessing the Spark REST API with an application ID to get job-specific status, an application whose name contains whitespace has the space encoded to '%20', and the REST API therefore returns an 'unknown app' error.
> For example:
> http://spark.mysite.com:20888/proxy/application_1447676402999_1254/api/v1/applications/ returns:
> [ {
>   "id" : "Spark shell",
>   "name" : "Spark shell",
>   "attempts" : [ {
>     "startTime" : "2016-01-28T09:20:58.526GMT",
>     "endTime" : "1969-12-31T23:59:59.999GMT",
>     "sparkUser" : "",
>     "completed" : false
>   } ]
> } ]
> and then when accessing:
> http://spark.mysite.com:20888/proxy/application_1447676402999_1254/api/v1/applications/Spark shell/
> the result returned is:
> unknown app: Spark%20shell
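
The failure mode is visible in the error text itself: the HTTP client percent-encodes the space before sending the request, and the literal "unknown app: Spark%20shell" suggests the server matches the path segment without decoding it. A one-line illustration of the encoding step, using only the Python standard library:

    import urllib.parse

    # A space in the application name becomes %20 in the URL path,
    # so the server sees "Spark%20shell" rather than "Spark shell".
    print(urllib.parse.quote("Spark shell"))  # -> Spark%20shell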


