Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2022/08/02 02:17:00 UTC

[jira] [Commented] (SPARK-39941) window and min_periods parameters in rolling func need to be validated as an Integer

    [ https://issues.apache.org/jira/browse/SPARK-39941?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17573998#comment-17573998 ] 

Apache Spark commented on SPARK-39941:
--------------------------------------

User 'bzhaoopenstack' has created a pull request for this issue:
https://github.com/apache/spark/pull/37367

> window and min_periods parameters in rolling func need to be validated as an Integer
> ------------------------------------------------------------------------------------
>
>                 Key: SPARK-39941
>                 URL: https://issues.apache.org/jira/browse/SPARK-39941
>             Project: Spark
>          Issue Type: Bug
>          Components: Pandas API on Spark
>    Affects Versions: 3.2.2
>         Environment: PySpark: Master
>            Reporter: bo zhao
>            Priority: Minor
>
> The window and min_periods parameters are not validated in the rolling function.
>  
> {code:python}
> >>> s = ps.Series([4, 3, 5, 2, 6])
> >>> s.rolling('STRING').sum()
> Traceback (most recent call last):
>   File "<stdin>", line 1, in <module>
>   File "/home/spark/spark/python/pyspark/pandas/generic.py", line 2707, in rolling
>     return Rolling(self, window=window, min_periods=min_periods)
>   File "/home/spark/spark/python/pyspark/pandas/window.py", line 179, in __init__
>     super().__init__(window, min_periods)
>   File "/home/spark/spark/python/pyspark/pandas/window.py", line 147, in __init__
>     if window < 0:
> TypeError: '<' not supported between instances of 'str' and 'int'
> >>> 
>  {code}
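A minimal sketch of the kind of validation the linked pull request would add: check both parameters are integers before the `window < 0` comparison is attempted. The function name and error messages below are illustrative, not the actual pyspark.pandas implementation:

```python
def validate_rolling_params(window, min_periods=None):
    # Reject non-integer window up front, so a string never reaches
    # the `window < 0` comparison that raised the TypeError above.
    if not isinstance(window, int):
        raise TypeError("window must be an integer")
    if window < 0:
        raise ValueError("window must be >= 0")
    # min_periods is optional; validate only when it is given.
    if min_periods is not None:
        if not isinstance(min_periods, int):
            raise TypeError("min_periods must be an integer")
        if min_periods < 0:
            raise ValueError("min_periods must be >= 0")
    return window, min_periods
```

With this check in place, `s.rolling('STRING')` would fail with a clear "window must be an integer" message instead of the confusing comparison error in the traceback.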



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org