Posted to issues@spark.apache.org by "Kevin Chen (JIRA)" <ji...@apache.org> on 2015/09/17 19:01:04 UTC
[jira] [Commented] (SPARK-10565) New /api/v1/[path] APIs don't contain as much information as original /json API
[ https://issues.apache.org/jira/browse/SPARK-10565?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14803224#comment-14803224 ]
Kevin Chen commented on SPARK-10565:
------------------------------------
To summarize the discussion so far from a separate thread on dev@spark.apache.org:
We plan to add the remaining information to the v1 API without incrementing the version number, because the change is purely additive: it only introduces new endpoints or new fields on existing endpoints.
Mark Hamstra has also requested endpoints that return results by jobGroup (cf. SparkContext#setJobGroup) rather than for a single job at a time.
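Until a jobGroup endpoint exists, a client has to fetch every job and filter on the jobGroup field itself. A minimal sketch of that workaround, using a made-up response body (the field names follow the v1 JobData shape, but the endpoint path, values, and helper are illustrative assumptions, not the actual API):

```python
import json

# Hypothetical response from GET /api/v1/applications/[app-id]/jobs.
# Field names mirror the v1 JobData shape; values are invented for illustration.
jobs_json = """
[
  {"jobId": 0, "name": "count at App.scala:10", "jobGroup": "etl", "status": "SUCCEEDED"},
  {"jobId": 1, "name": "collect at App.scala:20", "jobGroup": "etl", "status": "RUNNING"},
  {"jobId": 2, "name": "save at App.scala:30", "jobGroup": "reports", "status": "SUCCEEDED"}
]
"""

def jobs_in_group(jobs, group):
    """Client-side filtering that a dedicated jobGroup endpoint would make unnecessary."""
    return [j for j in jobs if j.get("jobGroup") == group]

jobs = json.loads(jobs_json)
print([j["jobId"] for j in jobs_in_group(jobs, "etl")])  # -> [0, 1]
```

A server-side endpoint would avoid transferring every job just to select one group.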
> New /api/v1/[path] APIs don't contain as much information as original /json API
> --------------------------------------------------------------------------------
>
> Key: SPARK-10565
> URL: https://issues.apache.org/jira/browse/SPARK-10565
> Project: Spark
> Issue Type: Improvement
> Components: Input/Output, Java API
> Affects Versions: 1.5.0
> Reporter: Kevin Chen
> Original Estimate: 1h
> Remaining Estimate: 1h
>
> [SPARK-3454] introduced official json APIs at /api/v1/[path] for data that originally appeared only on the web UI. However, they do not expose all the information available on the web UI or on the previous unofficial endpoint at /json.
> For example, the APIs at /api/v1/[path] do not show the number of cores or the amount of memory per slave for each application. These are stored in ApplicationInfo.desc.maxCores and ApplicationInfo.desc.memoryPerSlave, respectively. This information would be useful to expose.
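Because the planned change is additive, older servers will simply omit the new fields, so a client should read them defensively. A small sketch of what an extended application response might look like and how a client could consume it; the JSON body, field names ("maxCores", "memoryPerSlave"), and helper are assumptions modeled on ApplicationInfo.desc, not the actual API:

```python
import json

# Hypothetical extended /api/v1/applications/[app-id] response once the
# ApplicationDescription fields are exposed. Values are invented.
app_json = """
{
  "id": "app-20150917-0001",
  "name": "example",
  "maxCores": 8,
  "memoryPerSlave": 4096
}
"""

def resource_summary(app):
    # The change is additive and unversioned, so older servers may omit
    # these fields; default to None rather than raising KeyError.
    return {
        "maxCores": app.get("maxCores"),
        "memoryPerSlaveMB": app.get("memoryPerSlave"),
    }

app = json.loads(app_json)
print(resource_summary(app))  # -> {'maxCores': 8, 'memoryPerSlaveMB': 4096}
```

Defaulting missing fields to None keeps one client working against both old and new servers, which is the point of adding fields without bumping the version.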
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org