Posted to issues@spark.apache.org by "Leona Yoda (Jira)" <ji...@apache.org> on 2021/07/26 11:04:00 UTC

[jira] [Created] (SPARK-36288) Update API usage on pyspark pandas documents

Leona Yoda created SPARK-36288:
----------------------------------

             Summary: Update API usage on pyspark pandas documents
                 Key: SPARK-36288
                 URL: https://issues.apache.org/jira/browse/SPARK-36288
             Project: Spark
          Issue Type: Improvement
          Components: Documentation, PySpark
    Affects Versions: 3.2.0
            Reporter: Leona Yoda


I found several warning messages while testing the ported pandas API on Spark documentation (https://issues.apache.org/jira/browse/SPARK-34885).

1. `spark.sql.execution.arrow.enabled` on the Best Practices document
{code:java}
21/07/26 05:42:02 WARN SQLConf: The SQL config 'spark.sql.execution.arrow.enabled' has been deprecated in Spark v3.0 and may be removed in the future. Use 'spark.sql.execution.arrow.pyspark.enabled' instead of it.
{code}
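
For example, the Best Practices page could switch to the replacement config key named in the warning. A minimal sketch (the option value and session setup are just placeholders):
{code:python}
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Replacement suggested by the warning; the old key
# 'spark.sql.execution.arrow.enabled' is deprecated since Spark 3.0.
spark.conf.set("spark.sql.execution.arrow.pyspark.enabled", "true")
{code}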
 

2. `DataFrame.to_spark_io` on the From/to other DBMSes document
{code:java}
/opt/spark/python/lib/pyspark.zip/pyspark/pandas/frame.py:4811: FutureWarning: Deprecated in 3.2, Use spark.to_spark_io instead.
  warnings.warn("Deprecated in 3.2, Use spark.to_spark_io instead.", FutureWarning)
{code}
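
Similarly, that page could use the `spark` accessor form the warning suggests. A rough sketch, with a placeholder DataFrame and output path:
{code:python}
import pyspark.pandas as ps

psdf = ps.DataFrame({"id": [1, 2, 3]})

# Deprecated in 3.2: psdf.to_spark_io(...)
# The warning suggests the spark accessor instead:
psdf.spark.to_spark_io(path="/tmp/example_output", format="parquet", mode="overwrite")
{code}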
 

The examples still work for now, but I think it would be better to update the API usage in those documents.



