Posted to issues@spark.apache.org by "Felix Cheung (JIRA)" <ji...@apache.org> on 2016/06/21 08:34:57 UTC
[jira] [Comment Edited] (SPARK-16090) Improve method grouping in SparkR generated docs
[ https://issues.apache.org/jira/browse/SPARK-16090?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15341367#comment-15341367 ]
Felix Cheung edited comment on SPARK-16090 at 6/21/16 8:34 AM:
---------------------------------------------------------------
OK, I reviewed all 200+ generated HTML pages, and of those I would suggest splitting the following:
gapply
glm
predict
statfunctions
describe/summary
write.ml
(these are the only ones with different parameter names or types)
I would also suggest combining these into one Rd/HTML page:
print.*
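For illustration, combining the print methods into a single Rd/HTML page can be done by giving each roxygen2 block the same @rdname tag; roxygen2 then merges them into one print.Rd. The class and function names below are a hypothetical minimal sketch, not SparkR's actual code:

```r
#' Print methods
#'
#' Print a compact description of the object.
#'
#' @param x object to print
#' @param ... further arguments passed to or from other methods
#' @rdname print
#' @name print
NULL

#' @rdname print
#' @export
print.structType <- function(x, ...) {
  cat("StructType\n")
  invisible(x)
}

#' @rdname print
#' @export
print.structField <- function(x, ...) {
  cat("StructField\n")
  invisible(x)
}
```

Because both blocks share @rdname print, running roxygen2 produces one man/print.Rd file, which renders as a single HTML page listing both methods.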
> Improve method grouping in SparkR generated docs
> ------------------------------------------------
>
> Key: SPARK-16090
> URL: https://issues.apache.org/jira/browse/SPARK-16090
> Project: Spark
> Issue Type: Improvement
> Components: Documentation, SparkR
> Affects Versions: 2.0.0
> Reporter: Xiangrui Meng
> Priority: Critical
>
> This JIRA follows the discussion on https://github.com/apache/spark/pull/13109 to improve method grouping in SparkR generated docs. Having one method per doc page is not an R convention. However, having many methods per doc page would hurt the readability. So a proper grouping would help. Since we use roxygen2 instead of writing Rd files directly, we should consider smaller groups to avoid confusion.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org