Posted to commits@spark.apache.org by we...@apache.org on 2022/08/16 08:18:16 UTC
[spark] branch master updated (0503e10b819 -> 37c421aa5c7)
This is an automated email from the ASF dual-hosted git repository.
wenchen pushed a change to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
from 0503e10b819 [SPARK-40067][SQL] Use Table#name() instead of Scan#name() to populate the table name in the BatchScan node in SparkUI
add 37c421aa5c7 [SPARK-40013][SQL] DS V2 expressions should have the default `toString`
No new revisions were added by this update.
Summary of changes:
.../org/apache/spark/sql/connector/expressions/Cast.java | 10 ++--------
.../org/apache/spark/sql/connector/expressions/Extract.java | 12 ++----------
.../sql/connector/expressions/GeneralScalarExpression.java | 11 ++---------
.../sql/connector/expressions/UserDefinedScalarFunc.java | 11 ++---------
.../spark/sql/connector/expressions/aggregate/Avg.java | 12 ++----------
.../spark/sql/connector/expressions/aggregate/Count.java | 12 ++----------
.../sql/connector/expressions/aggregate/CountStar.java | 6 ++----
.../expressions/aggregate/GeneralAggregateFunc.java | 10 ++--------
.../spark/sql/connector/expressions/aggregate/Max.java | 6 ++----
.../spark/sql/connector/expressions/aggregate/Min.java | 6 ++----
.../spark/sql/connector/expressions/aggregate/Sum.java | 12 ++----------
.../expressions/aggregate/UserDefinedAggregateFunc.java | 10 ++--------
...{SupportsMetadata.scala => ExpressionWithToString.scala} | 13 ++++++-------
.../spark/sql/internal/connector/ToStringSQLBuilder.scala | 2 +-
14 files changed, 31 insertions(+), 102 deletions(-)
copy sql/catalyst/src/main/scala/org/apache/spark/sql/internal/connector/{SupportsMetadata.scala => ExpressionWithToString.scala} (77%)
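The diffstat above suggests the change factors the per-class `toString` bodies of the DS V2 expression classes (Cast, Extract, Avg, Count, Sum, ...) into a shared base, `ExpressionWithToString`, backed by `ToStringSQLBuilder`. As a rough, hypothetical sketch of that pattern (not the actual Spark source; all class and method names below are stand-ins), a common base class can render any expression once, so subclasses stop hand-rolling their own `toString`:

```java
// Hypothetical sketch of the "default toString" pattern: one base class
// renders NAME(child, child, ...) for every expression, standing in for
// Spark's real ToStringSQLBuilder, which is more elaborate.
import java.util.Arrays;
import java.util.stream.Collectors;

interface Expression {
    Expression[] children();
}

abstract class ExpressionWithToString implements Expression {
    abstract String name();

    @Override
    public String toString() {
        // Join the children's own string forms inside name(...).
        return name() + "(" + Arrays.stream(children())
                .map(Object::toString)
                .collect(Collectors.joining(", ")) + ")";
    }
}

// A leaf expression: prints as its bare column name.
final class Column extends ExpressionWithToString {
    private final String columnName;
    Column(String columnName) { this.columnName = columnName; }
    String name() { return columnName; }
    public Expression[] children() { return new Expression[0]; }
    @Override public String toString() { return columnName; }
}

// An aggregate expression that inherits the default toString unchanged.
final class Avg extends ExpressionWithToString {
    private final Expression child;
    Avg(Expression child) { this.child = child; }
    String name() { return "AVG"; }
    public Expression[] children() { return new Expression[] { child }; }
}
```

With this shape, `new Avg(new Column("price"))` prints as `AVG(price)` without Avg defining any `toString`, which is the kind of per-class deletion the "102 deletions" in the summary reflects.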
---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org