Posted to issues@spark.apache.org by "Sameer Agarwal (JIRA)" <ji...@apache.org> on 2018/01/08 20:52:03 UTC

[jira] [Updated] (SPARK-16011) SQL metrics include duplicated attempts

     [ https://issues.apache.org/jira/browse/SPARK-16011?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sameer Agarwal updated SPARK-16011:
-----------------------------------
    Target Version/s: 2.4.0  (was: 2.3.0)

> SQL metrics include duplicated attempts
> ---------------------------------------
>
>                 Key: SPARK-16011
>                 URL: https://issues.apache.org/jira/browse/SPARK-16011
>             Project: Spark
>          Issue Type: New Feature
>          Components: Spark Core, SQL
>            Reporter: Davies Liu
>            Assignee: Wenchen Fan
>
> When I ran a simple scan and aggregate query, the number of rows reported for the scan could differ from run to run. The actual scan result is correct, but the SQL metrics are wrong (they should not include duplicated attempts). This is a regression since 1.6.
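
For reference, a minimal sketch (Scala) of the kind of scan-and-aggregate job described above. The input path and column name are hypothetical, and this only reproduces the shape of the workload, not the metric discrepancy itself:

    import org.apache.spark.sql.SparkSession

    object ScanAggregateMetricsExample {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("scan-aggregate-metrics")
          .getOrCreate()

        // Simple scan + aggregate. The "number of output rows" metric shown
        // for the scan node in the SQL tab is the value reported to vary
        // across runs when duplicated task attempts are also counted.
        val df = spark.read.parquet("/tmp/events")     // hypothetical input path
        val counts = df.groupBy("event_type").count()  // hypothetical column
        counts.collect()

        spark.stop()
      }
    }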



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org