Posted to commits@spark.apache.org by ya...@apache.org on 2024/01/11 08:53:48 UTC

(spark) branch master updated: [SPARK-37039][PS][FOLLOWUP] Add migration guide for behavior change

This is an automated email from the ASF dual-hosted git repository.

yao pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new fa929d4ea4ad [SPARK-37039][PS][FOLLOWUP] Add migration guide for behavior change
fa929d4ea4ad is described below

commit fa929d4ea4adb99c2f31e5478c5155870f0aa1eb
Author: Haejoon Lee <ha...@databricks.com>
AuthorDate: Thu Jan 11 16:53:34 2024 +0800

    [SPARK-37039][PS][FOLLOWUP] Add migration guide for behavior change
    
    ### What changes were proposed in this pull request?
    
    This PR is a followup for https://github.com/apache/spark/pull/44570 to add a migration guide for the behavior change.
    
    ### Why are the changes needed?
    
    We should notify users of any behavior change.
    
    ### Does this PR introduce _any_ user-facing change?
    
    No API change.
    
    ### How was this patch tested?
    
    The existing CI should pass.
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No.
    
    Closes #44684 from itholic/SPARK-37039-foolowup.
    
    Authored-by: Haejoon Lee <ha...@databricks.com>
    Signed-off-by: Kent Yao <ya...@apache.org>
---
 python/docs/source/migration_guide/pyspark_upgrade.rst | 1 +
 1 file changed, 1 insertion(+)

diff --git a/python/docs/source/migration_guide/pyspark_upgrade.rst b/python/docs/source/migration_guide/pyspark_upgrade.rst
index 17c888be959d..49872fb197af 100644
--- a/python/docs/source/migration_guide/pyspark_upgrade.rst
+++ b/python/docs/source/migration_guide/pyspark_upgrade.rst
@@ -67,6 +67,7 @@ Upgrading from PySpark 3.5 to 4.0
 * In Spark 4.0, ``DataFrame.to_pandas_on_spark`` has been removed from PySpark, use ``DataFrame.pandas_api`` instead.
 * In Spark 4.0, ``DatetimeIndex.week`` and ``DatetimeIndex.weekofyear`` have been removed from Pandas API on Spark, use ``DatetimeIndex.isocalendar().week`` instead.
 * In Spark 4.0, ``Series.dt.week`` and ``Series.dt.weekofyear`` have been removed from Pandas API on Spark, use ``Series.dt.isocalendar().week`` instead.
+* In Spark 4.0, when applying ``astype`` to a decimal type object, an existing missing value is now converted to ``True`` instead of ``False`` in Pandas API on Spark.
 
 
 

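The migration entries above point users of the removed ``week``/``weekofyear`` accessors at ``isocalendar().week``. As a hedged illustration (not part of the commit itself), the replacement follows the same ISO 8601 week-numbering rule that Python's standard library exposes, so expected values can be checked without a Spark session:

```python
from datetime import date

# The removed DatetimeIndex.week / Series.dt.week returned ISO week numbers;
# the suggested replacement, isocalendar().week, applies the same ISO 8601
# rule that the standard library's date.isocalendar() implements.
week_jan = date(2024, 1, 11).isocalendar().week   # second ISO week of 2024
week_dec = date(2024, 12, 31).isocalendar().week  # falls into ISO week 1 of 2025
```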
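The newly added migration note says that under ``astype`` an existing missing value now becomes ``True`` instead of ``False``. A minimal sketch of the underlying rule, using plain Python truthiness with ``NaN`` standing in for the missing value (an assumption for illustration only; no Spark session is involved):

```python
from decimal import Decimal

# Casting decimal values to bool: Spark 4.0 aligns the missing-value result
# with Python/pandas truthiness, where NaN (a non-zero float) is truthy.
values = [Decimal("1.5"), Decimal("0"), float("nan")]  # NaN stands in for a missing entry
as_bool = [bool(v) for v in values]                    # [True, False, True]
```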

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org