Posted to issues@spark.apache.org by "Reynold Xin (JIRA)" <ji...@apache.org> on 2016/07/15 18:08:20 UTC

[jira] [Comment Edited] (SPARK-16011) SQL metrics include duplicated attempts

    [ https://issues.apache.org/jira/browse/SPARK-16011?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15379817#comment-15379817 ] 

Reynold Xin edited comment on SPARK-16011 at 7/15/16 6:07 PM:
--------------------------------------------------------------

I just retargeted this. It's actually intentional, although I think for metrics we should report both (one value including duplicated attempts and the other not).
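
A hypothetical sketch of that "report both" idea, not the actual fix: a SparkListener that tallies one SQL metric two ways, once over every finished task attempt and once over successful attempts only. The metric name and the use of TaskInfo.accumulables here are assumptions for illustration.

{code:scala}
import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd}

// Tally a named SQL metric (e.g. "number of output rows") two ways:
// one total over all task attempts, one over successful attempts only.
class DualMetricTally(metricName: String) extends SparkListener {
  // Listener events arrive on a single bus thread, so plain vars suffice.
  var includingAttempts: Long = 0L   // what the SQL tab shows per this report
  var successfulOnly: Long = 0L      // counts each partition's work once

  override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
    val update = taskEnd.taskInfo.accumulables
      .find(_.name.contains(metricName))
      .flatMap(_.update)
      .collect { case v: Long => v }
      .getOrElse(0L)

    includingAttempts += update
    // Duplicated (failed or killed speculative) attempts are excluded here.
    if (taskEnd.taskInfo.successful) successfulOnly += update
  }
}
{code}

Registered via sc.addSparkListener, the two counters would show both values side by side after a query, which is the distinction the comment is suggesting the metrics UI should expose.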


was (Author: rxin):
I just retargeted this. It's actually intentional.


> SQL metrics include duplicated attempts
> ---------------------------------------
>
>                 Key: SPARK-16011
>                 URL: https://issues.apache.org/jira/browse/SPARK-16011
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core, SQL
>            Reporter: Davies Liu
>            Assignee: Wenchen Fan
>            Priority: Blocker
>
> When I ran a simple scan-and-aggregate query, the number of rows reported for the scan could differ from run to run. The actual scanned result is correct, but the SQL metrics are wrong (they should not include duplicated attempts). This is a regression since 1.6.
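
A minimal reproduction sketch of the scenario described above, assuming a Spark 2.x shell where {{spark}} is the SparkSession. The query result stays correct; the discrepancy only shows up in the leaf node's "number of output rows" SQL metric when task attempts are retried or speculated.

{code:scala}
// Simple scan + aggregate: the answer is always correct.
val df = spark.range(0L, 1000000L, 1L, numPartitions = 200)
df.groupBy().count().show()   // always prints 1000000

// After a run that had retried or speculative task attempts, compare the
// leaf (scan) node's "number of output rows" in the SQL tab of the UI
// against the true count; with this bug the metric can be larger.
{code}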



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org