Posted to issues@spark.apache.org by "Yikun Jiang (Jira)" <ji...@apache.org> on 2021/10/19 08:19:00 UTC

[jira] [Commented] (SPARK-36348) unexpected Index loaded: pd.Index([10, 20, None], name="x")

    [ https://issues.apache.org/jira/browse/SPARK-36348?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17430392#comment-17430392 ] 

Yikun Jiang commented on SPARK-36348:
-------------------------------------

Revisiting this: it has already been fixed on the current master branch. I will open a simple PR to optimize the test case.

{code:python}
pidx = pd.Index([10, 20, 15, 30, 45, None], name="x")
psidx = ps.Index(pidx)
self.assert_eq(psidx.astype(bool), pidx.astype(bool))
self.assert_eq(psidx.astype(str), pidx.astype(str))
{code}
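
The snippet above relies on the pandas-on-Spark test utilities ({{self.assert_eq}}). A rough standalone sketch of the same check, assuming a local Spark session is available and simply printing both sides instead of asserting, could look like this:

{code:python}
import pandas as pd
import pyspark.pandas as ps

pidx = pd.Index([10, 20, 15, 30, 45, None], name="x")
psidx = ps.Index(pidx)

# On a fixed build both lines should print the same string values;
# on the affected 3.2.0 build the pandas-on-Spark side printed
# '10.0', ..., 'nan' instead of '10', ..., 'None' (see the issue below).
print(pidx.astype(str))
print(psidx.astype(str).to_pandas())
{code}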




> unexpected Index loaded: pd.Index([10, 20, None], name="x")
> -----------------------------------------------------------
>
>                 Key: SPARK-36348
>                 URL: https://issues.apache.org/jira/browse/SPARK-36348
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 3.2.0
>            Reporter: Yikun Jiang
>            Priority: Major
>
> {code:python}
> pidx = pd.Index([10, 20, 15, 30, 45, None], name="x")
> psidx = ps.Index(pidx)
> self.assert_eq(psidx.astype(str), pidx.astype(str))
> {code}
> [left pandas on spark]:  Index(['10.0', '20.0', '15.0', '30.0', '45.0', 'nan'], dtype='object', name='x')
> [right pandas]: Index(['10', '20', '15', '30', '45', 'None'], dtype='object', name='x')
> The index is loaded as float64, so follow-up steps such as astype produce different results than pandas.
> [1] https://github.com/apache/spark/blob/bcc595c112a23d8e3024ace50f0dbc7eab7144b2/python/pyspark/pandas/tests/indexes/test_base.py#L2249
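
A minimal way to observe the dtype mismatch described in the report above, assuming the affected 3.2.0 behaviour, is to compare the dtypes of the two indexes directly; the float64 dtype on the pandas-on-Spark side is taken from the issue report, not re-verified here:

{code:python}
import pandas as pd
import pyspark.pandas as ps

pidx = pd.Index([10, 20, None], name="x")
# Per this issue, the index round-tripped through pandas-on-Spark comes back
# as float64, which is what produces the '10.0' / 'nan' strings after astype(str).
print(pidx.dtype)
print(ps.Index(pidx).dtype)
{code}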



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org