Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2019/12/09 11:51:00 UTC

[jira] [Commented] (SPARK-26433) Tail method for spark DataFrame

    [ https://issues.apache.org/jira/browse/SPARK-26433?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16991520#comment-16991520 ] 

Hyukjin Kwon commented on SPARK-26433:
--------------------------------------

Looking back at this JIRA, I realised that I had underestimated it, given the multiple requests and the fact that other systems provide this feature. I re-created a JIRA and made a PR.
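
For reference, a minimal sketch of how the API looks in later Spark releases, assuming Spark 3.0+ semantics where DataFrame.tail(n) returns the last n rows to the driver as a list of Row objects, mirroring head(n). The session setup and sample data here are illustrative only:

```
from pyspark.sql import SparkSession, Row

# Assumes Spark 3.0+, where DataFrame.tail(n) is available. Like head(n),
# it collects rows to the driver, so n should be kept small.
spark = SparkSession.builder.master("local[*]").appName("tail-demo").getOrCreate()

df = spark.createDataFrame([Row(v1=i) for i in range(1, 11)])
print(df.head(3))   # e.g. [Row(v1=1), Row(v1=2), Row(v1=3)]
print(df.tail(3))   # e.g. [Row(v1=8), Row(v1=9), Row(v1=10)]
```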

> Tail method for spark DataFrame
> -------------------------------
>
>                 Key: SPARK-26433
>                 URL: https://issues.apache.org/jira/browse/SPARK-26433
>             Project: Spark
>          Issue Type: New Feature
>          Components: PySpark
>    Affects Versions: 2.4.0
>            Reporter: Jan Gorecki
>            Priority: Major
>
> There is a head method for Spark DataFrames, which works fine, but there doesn't seem to be a tail method.
> ```
> >>> ans                                                                         
> DataFrame[v1: bigint]                                                           
> >>> ans.head(3)                                                                
> [Row(v1=299443), Row(v1=299493), Row(v1=300751)]
> >>> ans.tail(3)
> Traceback (most recent call last):
>   File "<stdin>", line 1, in <module>
>   File "/home/jan/git/db-benchmark/spark/py-spark/lib/python3.6/site-packages/py
> spark/sql/dataframe.py", line 1300, in __getattr__
>     "'%s' object has no attribute '%s'" % (self.__class__.__name__, name))
> AttributeError: 'DataFrame' object has no attribute 'tail'
> ```
> I would like to request a tail method for Spark DataFrames as a new feature.
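
On Spark versions without a built-in tail, a rough workaround is to index the rows and keep only the last n. This is only a sketch: the tail helper and the zipWithIndex approach are illustrative, not part of Spark's DataFrame API, and the result is collected to the driver, so it is only suitable for small n:

```
from pyspark.sql import SparkSession, Row

spark = SparkSession.builder.master("local[*]").appName("tail-workaround").getOrCreate()
df = spark.createDataFrame([Row(v1=i) for i in range(1, 11)])

def tail(df, n):
    # Hypothetical helper: attach a running index to each row via the RDD's
    # zipWithIndex, then keep the rows whose index falls in the last n.
    count = df.count()
    if n >= count:
        return df.collect()
    skip = count - n
    return (df.rdd.zipWithIndex()
              .filter(lambda pair: pair[1] >= skip)
              .map(lambda pair: pair[0])
              .collect())

print(tail(df, 3))  # e.g. [Row(v1=8), Row(v1=9), Row(v1=10)]
```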



