Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2022/12/18 14:41:00 UTC

[jira] [Commented] (SPARK-40708) Auto update table statistics based on write metrics

    [ https://issues.apache.org/jira/browse/SPARK-40708?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17649017#comment-17649017 ] 

Apache Spark commented on SPARK-40708:
--------------------------------------

User 'jackylee-ch' has created a pull request for this issue:
https://github.com/apache/spark/pull/39114

> Auto update table statistics based on write metrics
> ---------------------------------------------------
>
>                 Key: SPARK-40708
>                 URL: https://issues.apache.org/jira/browse/SPARK-40708
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.4.0
>            Reporter: Yuming Wang
>            Priority: Major
>
> {code:scala}
>   // Derive write statistics from the physical write metrics.
>   def getWriteStats(mode: SaveMode, metrics: Map[String, SQLMetric]): Option[WriteStats] = {
>     val numBytes = metrics.get(NUM_OUTPUT_BYTES_KEY).map(_.value).map(BigInt(_))
>     val numRows = metrics.get(NUM_OUTPUT_ROWS_KEY).map(_.value).map(BigInt(_))
>     // Without a byte count there are no usable stats; a row count alone is not enough.
>     numBytes.map(WriteStats(mode, _, numRows))
>   }
>
>   // Update table statistics according to the save mode.
>   val stat = wroteStats.get
>   stat.mode match {
>     case SaveMode.Overwrite | SaveMode.ErrorIfExists =>
>       // The write replaced the table contents, so the write metrics are the new stats.
>       catalog.alterTableStats(table.identifier,
>         Some(CatalogStatistics(stat.numBytes, stat.numRows)))
>     case _ if table.stats.nonEmpty => // SaveMode.Append
>       // Appended data makes the existing stats stale; invalidate them.
>       catalog.alterTableStats(table.identifier, None)
>     case _ => // SaveMode.Ignore: do nothing
>   }
> {code}
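>
> For context, a minimal self-contained sketch of the intended flow: after a commit, the driver reads the write metrics, derives a WriteStats, and only then touches the catalog. The Metric stub (standing in for Spark's SQLMetric), the metric key names, and the use of a plain string for the save mode are illustrative assumptions, not Spark's actual API:
> {code:scala}
> object WriteStatsSketch extends App {
>   // Illustrative stand-ins; not Spark's real SQLMetric or WriteStats classes.
>   final case class Metric(value: Long)
>   final case class WriteStats(mode: String, numBytes: BigInt, numRows: Option[BigInt])
>
>   val NUM_OUTPUT_BYTES_KEY = "numOutputBytes" // assumed metric key name
>   val NUM_OUTPUT_ROWS_KEY = "numOutputRows"   // assumed metric key name
>
>   def getWriteStats(mode: String, metrics: Map[String, Metric]): Option[WriteStats] = {
>     val numBytes = metrics.get(NUM_OUTPUT_BYTES_KEY).map(m => BigInt(m.value))
>     val numRows = metrics.get(NUM_OUTPUT_ROWS_KEY).map(m => BigInt(m.value))
>     // No byte count means no usable stats, so the whole result is None.
>     numBytes.map(WriteStats(mode, _, numRows))
>   }
>
>   // An overwrite that produced 1 MiB across 1000 rows yields exact new stats:
>   val metrics = Map(
>     NUM_OUTPUT_BYTES_KEY -> Metric(1L << 20),
>     NUM_OUTPUT_ROWS_KEY -> Metric(1000L))
>   assert(getWriteStats("Overwrite", metrics) ==
>     Some(WriteStats("Overwrite", BigInt(1L << 20), Some(BigInt(1000)))))
>
>   // A write whose tracker reported no byte metric produces no stats at all:
>   assert(getWriteStats("Append", Map.empty).isEmpty)
> }
> {code}
> Keying the result off the byte count mirrors the snippet above: CatalogStatistics requires sizeInBytes, while the row count is optional.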



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org