Posted to issues@spark.apache.org by "Wenchen Fan (JIRA)" <ji...@apache.org> on 2018/09/11 14:32:00 UTC

[jira] [Commented] (SPARK-16011) SQL metrics include duplicated attempts

    [ https://issues.apache.org/jira/browse/SPARK-16011?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16610705#comment-16610705 ] 

Wenchen Fan commented on SPARK-16011:
-------------------------------------

Since the behavior is intentional, I'm closing this ticket. We can create a new ticket if we want to support SQL metrics without duplicated attempts.

> SQL metrics include duplicated attempts
> ---------------------------------------
>
>                 Key: SPARK-16011
>                 URL: https://issues.apache.org/jira/browse/SPARK-16011
>             Project: Spark
>          Issue Type: New Feature
>          Components: Spark Core, SQL
>            Reporter: Davies Liu
>            Assignee: Wenchen Fan
>            Priority: Major
>
> When I ran a simple scan-and-aggregate query, the number of rows reported for the scan could differ from run to run. The actual scan result is correct, but the SQL metrics are wrong (they should not include duplicated attempts); this is a regression since 1.6.
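
For context, a minimal sketch (not taken from the ticket) of the kind of scan-and-aggregate query where this shows up. The Parquet path and application name are illustrative, and enabling speculation is just one way to provoke duplicated task attempts; stage retries have the same effect.

    // Sketch only: illustrates why the scan's SQL metric can vary across runs
    // while the query result stays correct.
    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.count

    object DuplicatedAttemptMetrics {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("sql-metrics-duplicated-attempts")
          // Speculative execution re-launches slow tasks; rows processed by the
          // extra attempts are also added to the SQL metric accumulators.
          .config("spark.speculation", "true")
          .getOrCreate()

        // Simple scan + aggregate; the returned count itself is always correct.
        val df = spark.read.parquet("/tmp/events")   // illustrative path
        val result = df.agg(count("*")).collect()

        // The "number of output rows" metric for the scan node in the SQL tab
        // of the web UI sums over all task attempts, so with retries or
        // speculation it can exceed the count returned below and can differ
        // from run to run.
        println(s"rows counted by the query: ${result(0).getLong(0)}")

        spark.stop()
      }
    }
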



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org