Posted to issues@spark.apache.org by "Matthew Farrellee (JIRA)" <ji...@apache.org> on 2014/09/16 17:51:34 UTC

[jira] [Commented] (SPARK-3508) annotate the Spark configs to indicate which ones are meant for the end user

    [ https://issues.apache.org/jira/browse/SPARK-3508?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14135631#comment-14135631 ] 

Matthew Farrellee commented on SPARK-3508:
------------------------------------------

Documented == public is a good metric. To handle the case of committers not knowing what should be public, specifically calling out newly documented config params at release time provides an opportunity for extra review.

+1 to treating config as API
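
For what it's worth, here is a minimal sketch of how "documented == public" could be mechanized. Everything below is hypothetical (the ConfigEntry helper and the internal key are made up to illustrate the idea, not actual Spark internals):

    // Hypothetical sketch only -- not actual Spark internals.
    // Illustrates "documented == public": an entry is considered part of
    // the public surface exactly when it carries user-facing docs.
    case class ConfigEntry(key: String, default: String, doc: Option[String]) {
      def isPublic: Boolean = doc.isDefined
    }

    object Configs {
      val entries: Seq[ConfigEntry] = Seq(
        ConfigEntry("spark.executor.memory", "1g",
          Some("Amount of memory to use per executor process.")),
        // no doc string => internal; key below is invented for illustration
        ConfigEntry("spark.shuffle.someInternalKnob", "true", None)
      )

      // At release time, diff publicKeys against the previous release to
      // flag newly documented (i.e. newly public) keys for extra review.
      def publicKeys: Seq[String] = entries.filter(_.isPublic).map(_.key)
    }

Diffing publicKeys between releases is exactly the "extra review" hook mentioned above: any key that newly appears in the list gets eyeballed before it becomes part of the public surface.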

> annotate the Spark configs to indicate which ones are meant for the end user
> ----------------------------------------------------------------------------
>
>                 Key: SPARK-3508
>                 URL: https://issues.apache.org/jira/browse/SPARK-3508
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 1.1.0
>            Reporter: Thomas Graves
>
> Spark has lots of configs floating around. To me, configs are like APIs, and we should make it clear which ones are meant for the end user and which ones are only used internally. We should decide on exactly how we want to do this.
> I've seen users look at the code, start using a config that was meant to be internal, and file a JIRA to document it. Since there are many committers, it's easy for someone who doesn't have the history with that config to think we simply forgot to document it, and then it becomes public.
> Perhaps we need to name internal configs specially (spark.internal.), or we need to annotate them, or something else.
> Thoughts?
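
To make the two options in the issue concrete, a rough sketch of each (the annotation name and the config key below are invented for illustration; this is not existing Spark API):

    // Hypothetical sketch only -- names are illustrative.
    import scala.annotation.StaticAnnotation

    // Option 1: a compile-time marker that flags a config key as an
    // implementation detail, so reviewers and doc tooling can tell it
    // was never meant to be user-facing.
    class internalConfig extends StaticAnnotation

    object ShuffleConfigs {
      @internalConfig
      val SOME_INTERNAL_BUFFER_KEY: String = "spark.shuffle.someInternalBuffer"
    }

    object ConfigNaming {
      // Option 2: the naming-convention route -- treat anything under
      // "spark.internal." as off-limits to end users.
      def isInternal(key: String): Boolean = key.startsWith("spark.internal.")
    }

The naming convention is discoverable without tooling (the key itself warns the user), while the annotation keeps existing key names stable but only helps people reading the code.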



