Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2021/07/27 03:32:00 UTC

[jira] [Resolved] (SPARK-36288) Update API usage on pyspark pandas documents

     [ https://issues.apache.org/jira/browse/SPARK-36288?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-36288.
----------------------------------
    Fix Version/s: 3.2.0
       Resolution: Fixed

Issue resolved by pull request 33519
[https://github.com/apache/spark/pull/33519]

> Update API usage on pyspark pandas documents
> --------------------------------------------
>
>                 Key: SPARK-36288
>                 URL: https://issues.apache.org/jira/browse/SPARK-36288
>             Project: Spark
>          Issue Type: Improvement
>          Components: Documentation, PySpark
>    Affects Versions: 3.2.0
>            Reporter: Leona Yoda
>            Assignee: Leona Yoda
>            Priority: Minor
>             Fix For: 3.2.0
>
>
> I found several warning messages while testing the examples in the ported pandas API on Spark documentation (https://issues.apache.org/jira/browse/SPARK-34885).
> 1. `spark.sql.execution.arrow.enabled` on the Best Practices document
> {code:java}
> 21/07/26 05:42:02 WARN SQLConf: The SQL config 'spark.sql.execution.arrow.enabled' has been deprecated in Spark v3.0 and may be removed in the future. Use 'spark.sql.execution.arrow.pyspark.enabled' instead of it.
> {code}
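>  
> For reference, switching to the non-deprecated config key avoids this warning. A minimal sketch, assuming an existing SparkSession bound to the name `spark`:
> {code:python}
> # Deprecated since Spark 3.0; triggers the SQLConf warning above:
> # spark.conf.set("spark.sql.execution.arrow.enabled", "true")
>
> # Non-deprecated replacement key:
> spark.conf.set("spark.sql.execution.arrow.pyspark.enabled", "true")
> {code}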
>  
> 2. `DataFrame.to_spark_io` on the From/to other DBMSes document
> {code:java}
> /opt/spark/python/lib/pyspark.zip/pyspark/pandas/frame.py:4811: FutureWarning: Deprecated in 3.2, Use spark.to_spark_io instead.
>   warnings.warn("Deprecated in 3.2, Use spark.to_spark_io instead.", FutureWarning)
> {code}
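>  
> The replacement accessor is already available, so the documented example could be written via `DataFrame.spark.to_spark_io` instead. A minimal sketch; the DataFrame contents and output path are made up for illustration:
> {code:python}
> import pyspark.pandas as ps
>
> psdf = ps.DataFrame({"id": [1, 2, 3]})
>
> # Deprecated since Spark 3.2; triggers the FutureWarning above:
> # psdf.to_spark_io(path="/tmp/out", format="parquet", mode="overwrite")
>
> # Non-deprecated form via the spark accessor:
> psdf.spark.to_spark_io(path="/tmp/out", format="parquet", mode="overwrite")
> {code}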
>  
> For now the examples still work, but I think it's better to update the API usage in those documents.


