Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2021/07/08 20:56:00 UTC

[jira] [Commented] (SPARK-36035) Adjust `test_astype`, `test_neg` for old pandas versions

    [ https://issues.apache.org/jira/browse/SPARK-36035?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17377590#comment-17377590 ] 

Apache Spark commented on SPARK-36035:
--------------------------------------

User 'xinrong-databricks' has created a pull request for this issue:
https://github.com/apache/spark/pull/33272

> Adjust `test_astype`, `test_neg` for old pandas versions
> --------------------------------------------------------
>
>                 Key: SPARK-36035
>                 URL: https://issues.apache.org/jira/browse/SPARK-36035
>             Project: Spark
>          Issue Type: Test
>          Components: PySpark
>    Affects Versions: 3.2.0
>            Reporter: Xinrong Meng
>            Priority: Major
>
> * test_astype
> For pandas < 1.1.0, declaring or converting to StringDtype was in general only possible if the data was already only str or nan-like (GH31204).
> pandas 1.1.0 lifted this restriction: all dtypes can now be converted to StringDtype ([https://pandas.pydata.org/pandas-docs/stable/whatsnew/v1.1.0.html#all-dtypes-can-now-be-converted-to-stringdtype]).
> `test_astype` should account for this version difference; otherwise, the current tests will fail with pandas < 1.1.0 (see the version-gated sketch after this list).
>  * test_neg
> {code:python}
> import pandas as pd
>
> # Build one nullable-integer Series per extension dtype.
> dtypes = [
>     "Int8",
>     "Int16",
>     "Int32",
>     "Int64",
> ]
> psers = [pd.Series([1, 2, 3, None], dtype=dtype) for dtype in dtypes]
>
> # The dtype of the negated Series depends on the pandas version.
> for pser in psers:
>     print((-pser).dtype){code}
>  Observed dtype of the negated Series, by pandas version (a version-gated sketch follows this list):
>  - pandas <= 1.0.5: object dtype
>  - pandas 1.1.0 through 1.1.2: TypeError: bad operand type for unary -: 'IntegerArray'
>  - pandas >= 1.1.3: the correct respective dtype
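>
> A minimal sketch of how `test_astype` could gate on the pandas version (illustrative only, not the actual test code; the assertion shown is an assumption about the intended check):
> {code:python}
> from distutils.version import LooseVersion
>
> import pandas as pd
>
> pser = pd.Series([1, 2, 3])
>
> if LooseVersion(pd.__version__) >= LooseVersion("1.1.0"):
>     # Since pandas 1.1.0, any dtype can be cast to StringDtype.
>     assert pser.astype("string").dtype == "string"
> else:
>     # Before 1.1.0, casting non-str data to StringDtype raised (GH31204),
>     # so the StringDtype case must be skipped on old pandas.
>     pass
> {code}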
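>
> And a version-gated sketch for the `test_neg` behavior above (again illustrative, assuming the three regimes listed):
> {code:python}
> from distutils.version import LooseVersion
>
> import pandas as pd
>
> pser = pd.Series([1, 2, 3, None], dtype="Int64")
>
> if LooseVersion(pd.__version__) >= LooseVersion("1.1.3"):
>     # pandas >= 1.1.3: unary minus keeps the nullable integer dtype.
>     assert (-pser).dtype == "Int64"
> elif LooseVersion(pd.__version__) >= LooseVersion("1.1.0"):
>     # pandas 1.1.0-1.1.2: unary minus raises on IntegerArray.
>     try:
>         -pser
>         raise AssertionError("expected TypeError")
>     except TypeError:
>         pass
> else:
>     # pandas <= 1.0.5: unary minus falls back to object dtype.
>     assert (-pser).dtype == object
> {code}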



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org