Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2017/02/06 13:55:41 UTC

[jira] [Resolved] (SPARK-19440) Window in pyspark doesn't have attributes unboundedPreceding, unboundedFollowing and currentRow

     [ https://issues.apache.org/jira/browse/SPARK-19440?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-19440.
----------------------------------
    Resolution: Invalid

It seems these attributes do exist, as shown below:

{code}
>>> from pyspark.sql import Window
>>> window = Window.orderBy("date").rowsBetween(Window.unboundedPreceding, Window.currentRow)
>>> dir(Window)
['_FOLLOWING_THRESHOLD', '_JAVA_MAX_LONG', '_JAVA_MIN_LONG', '_PRECEDING_THRESHOLD', '__class__', '__delattr__', '__dict__', '__doc__', '__format__', '__getattribute__', '__hash__', '__init__', '__module__', '__new__', '__reduce__', '__reduce_ex__', '__repr__', '__setattr__', '__sizeof__', '__str__', '__subclasshook__', '__weakref__', 'currentRow', 'orderBy', 'partitionBy', 'rangeBetween', 'rowsBetween', 'unboundedFollowing', 'unboundedPreceding']
>>>
{code}
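
For reference, here is a minimal sketch of these attributes in actual use for a running total. It assumes an active SparkSession bound to {{spark}}; the DataFrame and column names are illustrative only:

{code}
>>> from pyspark.sql import Window
>>> from pyspark.sql import functions as F
>>> df = spark.createDataFrame(
...     [("2017-01-01", 1), ("2017-01-02", 2), ("2017-01-03", 3)],
...     ["date", "amount"])
>>> # frame from the first row of the partition up to and including the current row
>>> w = Window.orderBy("date").rowsBetween(Window.unboundedPreceding, Window.currentRow)
>>> df.withColumn("running_total", F.sum("amount").over(w)).show()
+----------+------+-------------+
|      date|amount|running_total|
+----------+------+-------------+
|2017-01-01|     1|            1|
|2017-01-02|     2|            3|
|2017-01-03|     3|            6|
+----------+------+-------------+
{code}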

Please reopen this if I am mistaken.

> Window in pyspark doesn't have attributes unboundedPreceding, unboundedFollowing and currentRow
> -----------------------------------------------------------------------------------------------
>
>                 Key: SPARK-19440
>                 URL: https://issues.apache.org/jira/browse/SPARK-19440
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 2.0.0
>            Reporter: Franklyn Dsouza
>            Priority: Trivial
>
> The Window class in pyspark doesn't have the attributes unboundedPreceding, unboundedFollowing and currentRow, despite the documentation suggesting that these attributes be used.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org