Posted to issues@spark.apache.org by "Vladimir Feinberg (JIRA)" <ji...@apache.org> on 2016/06/15 19:19:09 UTC
[jira] [Updated] (SPARK-15972) GroupedData varargs arguments misnamed
[ https://issues.apache.org/jira/browse/SPARK-15972?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Vladimir Feinberg updated SPARK-15972:
--------------------------------------
Description:
Simple aggregation functions that take column names as the varargs argument {{cols}} show up in documentation with the argument {{args}}, even though their docstrings refer to {{cols}}.
The discrepancy is caused by the {{df_varargs_api}} decorator, which wraps each function in a temporary function whose varargs parameter is named {{args}} instead of {{cols}}, producing the confusing documentation.
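The mechanism can be illustrated with a minimal sketch of the decorator pattern described above (this is a simplified stand-in, not the actual Spark source): documentation tools read the wrapper's signature via introspection, so the wrapper's {{args}} parameter name is what appears next to a docstring that talks about {{cols}}.

```python
import inspect

def df_varargs_api(f):
    # Wrapper whose varargs parameter is named *args, not *cols.
    # This is the signature that doc tools pick up via introspection.
    def _api(self, *args):
        return f.__name__, args
    _api.__name__ = f.__name__
    _api.__doc__ = f.__doc__  # docstring is copied, signature is not
    return _api

class GroupedData:
    @df_varargs_api
    def mean(self, *cols):
        """Computes the mean for each numeric column in ``cols``."""

# The advertised signature mismatches the docstring's ``cols``:
print(inspect.signature(GroupedData.mean))  # (self, *args)
```

Renaming the wrapper's parameter to {{cols}} (or preserving the wrapped signature) would make the generated documentation consistent.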
was:
Simple aggregation functions which take column names [[cols]] as varargs arguments show up in documentation with the argument [[args]], but their documentation refers to [[cols]].
The discrepancy is caused by an annotation, [[df_varargs_api]], which produces a temporary function with arguments [[args]] instead of [[cols]], creating the confusing documentation.
> GroupedData varargs arguments misnamed
> --------------------------------------
>
> Key: SPARK-15972
> URL: https://issues.apache.org/jira/browse/SPARK-15972
> Project: Spark
> Issue Type: Improvement
> Components: PySpark
> Affects Versions: 2.0.0, 2.1.0
> Reporter: Vladimir Feinberg
> Priority: Trivial
>
> Simple aggregation functions that take column names as the varargs argument {{cols}} show up in documentation with the argument {{args}}, even though their docstrings refer to {{cols}}.
> The discrepancy is caused by the {{df_varargs_api}} decorator, which wraps each function in a temporary function whose varargs parameter is named {{args}} instead of {{cols}}, producing the confusing documentation.
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)