Posted to issues@spark.apache.org by "Emanuele Cesena (JIRA)" <ji...@apache.org> on 2015/07/05 20:08:04 UTC

[jira] [Updated] (SPARK-8827) pyspark.DStream top method

     [ https://issues.apache.org/jira/browse/SPARK-8827?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Emanuele Cesena updated SPARK-8827:
-----------------------------------
    Description: 
Is there a reason for not having DStream.top?
Are there any issues with the following definition?

{code}
    def topPartition(partition):
        # Keep only the 10 largest pairs of each partition, sorted by value.
        return sorted(partition, key=lambda p: p[1], reverse=True)[:10]

    def top(counts):
        # For each batch RDD: take the top 10 of every partition, then sort
        # the surviving pairs globally by descending value.
        return counts.transform(lambda rdd: rdd.mapPartitions(topPartition)
            .sortBy(lambda p: p[1], ascending=False))
{code}
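
For reference, a minimal usage sketch of the proposed method follows. The socket source, the {{ssc}} context, and the word-count pipeline are illustrative assumptions, not part of this report:

{code}
from pyspark import SparkContext
from pyspark.streaming import StreamingContext

sc = SparkContext(appName="TopWordCounts")
ssc = StreamingContext(sc, batchDuration=10)  # 10-second batches

# Hypothetical input: words arriving on a socket, reduced to (word, count).
lines = ssc.socketTextStream("localhost", 9999)
counts = (lines.flatMap(lambda line: line.split())
               .map(lambda word: (word, 1))
               .reduceByKey(lambda a, b: a + b))

# Apply the proposed top() and print the largest counts of each batch.
top(counts).pprint()

ssc.start()
ssc.awaitTermination()
{code}

Note that RDD.top(num, key=None) already exists in PySpark; since the function passed to DStream.transform must return an RDD rather than a plain Python list, the definition above builds its result with mapPartitions plus sortBy instead of calling rdd.top directly.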

> pyspark.DStream top method
> --------------------------
>
>                 Key: SPARK-8827
>                 URL: https://issues.apache.org/jira/browse/SPARK-8827
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark, Streaming
>            Reporter: Emanuele Cesena
>            Priority: Minor
>



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org