Posted to reviews@spark.apache.org by "zhengruifeng (via GitHub)" <gi...@apache.org> on 2024/01/16 08:24:47 UTC

[PR] [SPARK-46677][CONNECT][FOLLOWUP] Convert `count(df["*"])` to `count(1)` on client side [spark]

zhengruifeng opened a new pull request, #44752:
URL: https://github.com/apache/spark/pull/44752

   ### What changes were proposed in this pull request?
   Before https://github.com/apache/spark/pull/44689, `df["*"]` and `sf.col("*")` were both converted to `UnresolvedStar`, and `Count(UnresolvedStar)` was then converted to `Count(1)` in the Analyzer:
   https://github.com/apache/spark/blob/381f3691bd481abc8f621ca3f282e06db32bea31/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala#L1893-L1897
   
   In that fix, we introduced a new node `UnresolvedDataFrameStar` for `df["*"]`, which is replaced with `ResolvedStar` later. Unfortunately, it no longer matches `Count(UnresolvedStar)`.
   So it causes:
   ```
   In [1]: from pyspark.sql import functions as sf
   
   In [2]: df1 = spark.createDataFrame([{"id": 1, "val": "v"}])
   
   In [3]: df1.select(sf.count(df1["*"]))
   Out[3]: DataFrame[count(id, val): bigint]
   ```
   
   whereas it should be
   ```
   In [3]: df1.select(sf.count(df1["*"]))
   Out[3]: DataFrame[count(1): bigint]
   ```
   
   In vanilla Spark, it is the `count` function itself that performs this conversion (`sf.count(df1["*"])` -> `sf.count(sf.lit(1))`); see
   
   https://github.com/apache/spark/blob/e8dfcd3081abe16b2115bb2944a2b1cb547eca8e/sql/core/src/main/scala/org/apache/spark/sql/functions.scala#L422-L436
   
   So it is natural to match this behavior on the client side.
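
   A minimal sketch of what such a client-side rewrite can look like in the Python Connect client (the module paths, `Column._expr`, and the `_invoke_function_over_columns` helper are assumptions for illustration, not the exact patch):
   ```
   from pyspark.sql.connect.column import Column
   from pyspark.sql.connect.expressions import UnresolvedStar
   from pyspark.sql.connect.functions import lit, _invoke_function_over_columns

   def count(col):
       # If the argument is a star reference such as df["*"], rewrite it to
       # lit(1) so the plan sent to the server is count(1) rather than a
       # count over every column.
       if isinstance(col, Column) and isinstance(col._expr, UnresolvedStar):
           col = lit(1)
       return _invoke_function_over_columns("count", col)
   ```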
   
   ### Why are the changes needed?
   To keep the behavior consistent with vanilla Spark.
   
   
   ### Does this PR introduce _any_ user-facing change?
   Yes, it fixes the behavior change introduced in https://github.com/apache/spark/pull/44689.
   
   
   ### How was this patch tested?
   Added a unit test.
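
   For illustration, a test along these lines covers the regression (the test name and `spark` session fixture are assumptions, not the actual test added):
   ```
   from pyspark.sql import functions as sf

   def test_count_star_becomes_count_one(spark):
       df = spark.createDataFrame([{"id": 1, "val": "v"}])
       # df["*"] should yield count(1), not count(id, val).
       assert df.select(sf.count(df["*"])).columns == ["count(1)"]
   ```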
   
   
   ### Was this patch authored or co-authored using generative AI tooling?
   No.



Re: [PR] [SPARK-46677][CONNECT][FOLLOWUP] Convert `count(df["*"])` to `count(1)` on client side [spark]

Posted by "zhengruifeng (via GitHub)" <gi...@apache.org>.
zhengruifeng commented on PR #44752:
URL: https://github.com/apache/spark/pull/44752#issuecomment-1893422052

   thanks, merged to master



Re: [PR] [SPARK-46677][CONNECT][FOLLOWUP] Convert `count(df["*"])` to `count(1)` on client side [spark]

Posted by "zhengruifeng (via GitHub)" <gi...@apache.org>.
zhengruifeng closed pull request #44752: [SPARK-46677][CONNECT][FOLLOWUP] Convert `count(df["*"])` to `count(1)` on client side
URL: https://github.com/apache/spark/pull/44752

