Posted to commits@spark.apache.org by gu...@apache.org on 2023/02/20 00:07:22 UTC

[spark] branch branch-3.4 updated: [SPARK-41818][CONNECT][PYTHON][FOLLOWUP][TEST] Enable a doctest for DataFrame.write

This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch branch-3.4
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.4 by this push:
     new 9249488beb6 [SPARK-41818][CONNECT][PYTHON][FOLLOWUP][TEST] Enable a doctest for DataFrame.write
9249488beb6 is described below

commit 9249488beb688bcf92393b2e247b73a570311a58
Author: Takuya UESHIN <ue...@databricks.com>
AuthorDate: Mon Feb 20 09:06:58 2023 +0900

    [SPARK-41818][CONNECT][PYTHON][FOLLOWUP][TEST] Enable a doctest for DataFrame.write
    
    ### What changes were proposed in this pull request?
    
    Enables a doctest for `DataFrame.write`.
    
    ### Why are the changes needed?
    
    Now that `DataFrame.write.saveAsTable` has been fixed, we can enable the doctest.
    
    ### Does this PR introduce _any_ user-facing change?
    
    No.
    
    ### How was this patch tested?
    
    Enabled the doctest.
    
    Closes #40071 from ueshin/issues/SPARK-41818/doctest.
    
    Authored-by: Takuya UESHIN <ue...@databricks.com>
    Signed-off-by: Hyukjin Kwon <gu...@apache.org>
    (cherry picked from commit b056e59e1cb246682b4f4cf178dca8b5e555f018)
    Signed-off-by: Hyukjin Kwon <gu...@apache.org>
---
 python/pyspark/sql/connect/dataframe.py | 3 ---
 1 file changed, 3 deletions(-)

diff --git a/python/pyspark/sql/connect/dataframe.py b/python/pyspark/sql/connect/dataframe.py
index 3564f5def17..393f7f42ec8 100644
--- a/python/pyspark/sql/connect/dataframe.py
+++ b/python/pyspark/sql/connect/dataframe.py
@@ -1704,9 +1704,6 @@ def _test() -> None:
     # TODO(SPARK-41625): Support Structured Streaming
     del pyspark.sql.connect.dataframe.DataFrame.isStreaming.__doc__
 
-    # TODO(SPARK-41818): Support saveAsTable
-    del pyspark.sql.connect.dataframe.DataFrame.write.__doc__
-
     globs["spark"] = (
         PySparkSession.builder.appName("sql.connect.dataframe tests")
         .remote("local[4]")

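For context, the deleted lines rely on how Python's doctest machinery works: doctest only collects examples from an object's docstring, so `del obj.__doc__` is a way to skip its doctest entirely, and removing that `del` re-enables it. A minimal sketch of the pattern (the `add` function here is hypothetical, not from the Spark code):

```python
import doctest

def add(a, b):
    """Add two integers.

    >>> add(1, 2)
    3
    """
    return a + b

finder = doctest.DocTestFinder()

# With the docstring present, the finder collects one example.
n_before = sum(len(t.examples) for t in finder.find(add))

# Deleting __doc__ disables the doctest: no docstring, nothing to collect.
# This mirrors the `del ...__doc__` lines this commit removes.
del add.__doc__
n_after = sum(len(t.examples) for t in finder.find(add))

print(n_before, n_after)
```

Running this prints `1 0`: one collected example before the deletion, none after, which is why restoring the docstring (by removing the `del`) turns the `DataFrame.write` doctest back on.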

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org