Posted to issues@spark.apache.org by "Maxim Gekk (Jira)" <ji...@apache.org> on 2020/09/21 19:24:00 UTC

[jira] [Commented] (SPARK-32306) `approx_percentile` in Spark SQL gives incorrect results

    [ https://issues.apache.org/jira/browse/SPARK-32306?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17199615#comment-17199615 ] 

Maxim Gekk commented on SPARK-32306:
------------------------------------

[~seanmalory] Which result do you expect? 8?
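
Note: `percentile_approx` returns a value that actually occurs in the input rather than interpolating between values, so for {5, 8} the only candidates are 5 and 8, whereas Spark's exact `percentile` function interpolates to 6.5. A minimal sketch contrasting the two, assuming a local SparkSession:

{code:python}
import pyspark.sql.functions as psf
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([('bar', 5), ('bar', 8)], ['name', 'val'])

# Exact percentile interpolates: the median of {5, 8} is 6.5.
exact = psf.expr('percentile(val, 0.5)')

# Approximate percentile returns an element of the input,
# so the median here can only ever come out as 5 or 8.
approx = psf.expr('percentile_approx(val, 0.5, 2147483647)')

df.groupBy('name').agg(exact.alias('exact'), approx.alias('approx')).show()
{code}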

> `approx_percentile` in Spark SQL gives incorrect results
> --------------------------------------------------------
>
>                 Key: SPARK-32306
>                 URL: https://issues.apache.org/jira/browse/SPARK-32306
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark, SQL
>    Affects Versions: 2.4.4
>            Reporter: Sean Malory
>            Priority: Major
>
> The `approx_percentile` function in Spark SQL does not give the correct result. I'm not sure how incorrect it is; it may just be a boundary issue. From the docs:
> {quote}The accuracy parameter (default: 10000) is a positive numeric literal which controls approximation accuracy at the cost of memory. Higher value of accuracy yields better accuracy, 1.0/accuracy is the relative error of the approximation.
> {quote}
> This is not true. Here is a minimal example in `pyspark` where, essentially, the median of 5 and 8 is calculated as 5:
> {code:python}
> import pyspark.sql.functions as psf
> from pyspark.sql import SparkSession
>
> spark = SparkSession.builder.getOrCreate()
> df = spark.createDataFrame(
>     [('bar', 5), ('bar', 8)], ['name', 'val']
> )
> # accuracy = 2147483647 (INT_MAX); per the docs the relative error
> # should then be 1.0/2147483647, i.e. effectively zero.
> median = psf.expr('percentile_approx(val, 0.5, 2147483647)')
> df.groupBy('name').agg(median.alias('median')).show()    # gives the median as 5
> {code}
> I've tested this with Spark v2.4.4 and pyspark v2.4.5, although I suspect this is an issue with the underlying algorithm.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org