Posted to issues@spark.apache.org by "Ruifeng Zheng (Jira)" <ji...@apache.org> on 2023/01/12 15:01:00 UTC
[jira] [Commented] (SPARK-42032) transform_keys, transform_values doctest output has a different order
[ https://issues.apache.org/jira/browse/SPARK-42032?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17676060#comment-17676060 ]
Ruifeng Zheng commented on SPARK-42032:
---------------------------------------
{code:python}
**********************************************************************
File "/Users/ruifeng.zheng/Dev/spark/python/pyspark/sql/connect/functions.py", line 1423, in pyspark.sql.connect.functions.map_filter
Failed example:
    df.select(map_filter(
        "data", lambda _, v: v > 30.0).alias("data_filtered")
    ).show(truncate=False)
Expected:
    +--------------------------+
    |data_filtered             |
    +--------------------------+
    |{baz -> 32.0, foo -> 42.0}|
    +--------------------------+
Got:
    +--------------------------+
    |data_filtered             |
    +--------------------------+
    |{foo -> 42.0, baz -> 32.0}|
    +--------------------------+
    <BLANKLINE>
**********************************************************************
File "/Users/ruifeng.zheng/Dev/spark/python/pyspark/sql/connect/functions.py", line 1465, in pyspark.sql.connect.functions.map_zip_with
Failed example:
    df.select(map_zip_with(
        "base", "ratio", lambda k, v1, v2: round(v1 * v2, 2)).alias("updated_data")
    ).show(truncate=False)
Expected:
    +---------------------------+
    |updated_data               |
    +---------------------------+
    |{SALES -> 16.8, IT -> 48.0}|
    +---------------------------+
Got:
    +---------------------------+
    |updated_data               |
    +---------------------------+
    |{IT -> 48.0, SALES -> 16.8}|
    +---------------------------+
    <BLANKLINE>
**********************************************************************
1 of 2 in pyspark.sql.connect.functions.map_filter
1 of 2 in pyspark.sql.connect.functions.map_zip_with
{code}
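
If we do want these doctests to pass on both backends, one option is to make the rendered order deterministic by sorting the map entries before show(). A minimal sketch, assuming a local SparkSession and the map_filter doctest's sample data (my reconstruction, not a committed fix):

{code:python}
# Minimal sketch, assuming a local SparkSession and the same sample data
# as the map_filter doctest (reconstructed; not the committed fix).
from pyspark.sql import SparkSession
from pyspark.sql.functions import array_sort, map_entries, map_filter

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame(
    [(1, {"foo": 42.0, "bar": 1.0, "baz": 32.0})], ("id", "data")
)

# map_entries converts the map to array<struct<key,value>>, and array_sort
# orders that array by key, so both backends render the same string.
df.select(
    array_sort(map_entries(map_filter("data", lambda _, v: v > 30.0)))
    .alias("data_filtered")
).show(truncate=False)
{code}

The same trick would apply to the map_zip_with failure above and to the transform_keys / transform_values examples quoted below.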
> transform_keys, transform_values doctest output has a different order
> ---------------------------------------------------------------------
>
> Key: SPARK-42032
> URL: https://issues.apache.org/jira/browse/SPARK-42032
> Project: Spark
> Issue Type: Sub-task
> Components: Connect, PySpark
> Affects Versions: 3.4.0
> Reporter: Ruifeng Zheng
> Priority: Major
>
> Not sure whether this should be fixed:
> {code:python}
> **********************************************************************
> File "/Users/ruifeng.zheng/Dev/spark/python/pyspark/sql/connect/functions.py", line 1623, in pyspark.sql.connect.functions.transform_keys
> Failed example:
>     df.select(transform_keys(
>         "data", lambda k, _: upper(k)).alias("data_upper")
>     ).show(truncate=False)
> Expected:
>     +-------------------------+
>     |data_upper               |
>     +-------------------------+
>     |{BAR -> 2.0, FOO -> -2.0}|
>     +-------------------------+
> Got:
>     +-------------------------+
>     |data_upper               |
>     +-------------------------+
>     |{FOO -> -2.0, BAR -> 2.0}|
>     +-------------------------+
>     <BLANKLINE>
> **********************************************************************
> File "/Users/ruifeng.zheng/Dev/spark/python/pyspark/sql/connect/functions.py", line 1630, in pyspark.sql.connect.functions.transform_values
> Failed example:
>     df.select(transform_values(
>         "data", lambda k, v: when(k.isin("IT", "OPS"), v + 10.0).otherwise(v)
>     ).alias("new_data")).show(truncate=False)
> Expected:
>     +---------------------------------------+
>     |new_data                               |
>     +---------------------------------------+
>     |{OPS -> 34.0, IT -> 20.0, SALES -> 2.0}|
>     +---------------------------------------+
> Got:
>     +---------------------------------------+
>     |new_data                               |
>     +---------------------------------------+
>     |{IT -> 20.0, SALES -> 2.0, OPS -> 34.0}|
>     +---------------------------------------+
>     <BLANKLINE>
> **********************************************************************
> 1 of 2 in pyspark.sql.connect.functions.transform_keys
> 1 of 2 in pyspark.sql.connect.functions.transform_values
> {code}
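
An alternative that sidesteps rendering order entirely is to assert on the collected Python value rather than on show() output, since dict equality ignores key order. A hedged sketch against the quoted transform_keys example (the sample data is my assumption, inferred from the expected output):

{code:python}
# Hedged sketch, assuming the transform_keys doctest data (guessed from
# the expected output above); not the actual fix adopted in Spark.
from pyspark.sql import SparkSession
from pyspark.sql.functions import transform_keys, upper

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([(1, {"foo": -2.0, "bar": 2.0})], ("id", "data"))

row = df.select(
    transform_keys("data", lambda k, _: upper(k)).alias("data_upper")
).head()

# dict equality ignores key order, unlike show()'s rendering of the map.
assert row["data_upper"] == {"FOO": -2.0, "BAR": 2.0}
{code}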