Posted to issues@spark.apache.org by "Daniel Himmelstein (Jira)" <ji...@apache.org> on 2021/02/25 16:50:00 UTC

[jira] [Commented] (SPARK-34544) pyspark toPandas() should return pd.DataFrame

    [ https://issues.apache.org/jira/browse/SPARK-34544?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17291040#comment-17291040 ] 

Daniel Himmelstein commented on SPARK-34544:
--------------------------------------------

SPARK-34540 is an example. {{[DataFrameLike|https://github.com/apache/spark/blob/4a3200b08ac3e7733b5a3dc7271d35e6872c5967/python/pyspark/sql/pandas/_typing/protocols/frame.pyi#L37-L428]}} is missing the {{pd.DataFrame.convert_dtypes}} method. It's also missing {{pd.DataFrame.head}} and column attribute access ({{pd.DataFrame.my_column_name}}).
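
A minimal sketch of how this bites in practice (the column name is hypothetical and the mypy messages are paraphrased):

{code:python}
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([(1, "a")], ["id", "my_column_name"])

pdf = df.toPandas()  # annotated as DataFrameLike, but a pandas.DataFrame at runtime

pdf.convert_dtypes()  # mypy: "DataFrameLike" has no attribute "convert_dtypes"
pdf.head()            # mypy: "DataFrameLike" has no attribute "head"
pdf.my_column_name    # mypy: "DataFrameLike" has no attribute "my_column_name"
{code}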

Keeping up with every upstream pandas.DataFrame API change seems like an impossible task, and a single static protocol can't accommodate the different pandas versions in use by end users.
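
For concreteness, the "type ignore comments or asserts" mentioned in the issue below look roughly like this (a sketch, not from the ticket; {{spark.range}} is used only to keep the example self-contained):

{code:python}
import pandas as pd
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
pdf = spark.range(3).toPandas()

# Option 1: suppress the error at each call site
pdf.convert_dtypes()  # type: ignore[attr-defined]

# Option 2: narrow the static type to the actual runtime type once
assert isinstance(pdf, pd.DataFrame)
pdf.convert_dtypes()  # fine for mypy after the isinstance() narrowing
{code}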

> pyspark toPandas() should return pd.DataFrame
> ---------------------------------------------
>
>                 Key: SPARK-34544
>                 URL: https://issues.apache.org/jira/browse/SPARK-34544
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 3.0.1
>            Reporter: Rafal Wojdyla
>            Priority: Critical
>
> Right now {{toPandas()}} returns {{DataFrameLike}}, which is an incomplete "view" of the pandas {{DataFrame}}. This leads to cases where mypy reports that certain pandas methods are not present on {{DataFrameLike}}, even though those methods are valid on pandas {{DataFrame}}, which is the actual type of the object. Working around this requires type-ignore comments or asserts.



