Posted to user@spark.apache.org by Deep Pradhan <pr...@gmail.com> on 2014/09/17 12:24:09 UTC

Change RDDs using map()

Hi,
I want to make the following change to an RDD (create a new RDD from the
existing one to reflect some transformation):
In an RDD of key-value pairs, I want to get the keys for which the values
are 1.
How can I do this using map()?
Thank You

Re: Change RDDs using map()

Posted by Mark Hamstra <ma...@clearstorydata.com>.
You don't.  That's what filter or the partial function version of collect
are for:

val transformedRDD = yourRDD.collect { case (k, v) if v == 1 => k }
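To illustrate the idiom, here is a minimal sketch using a plain Scala
collection instead of a running Spark job (the names `pairs` and `keys`
are made up for the example; RDD.collect(PartialFunction) behaves the same
way, filtering and mapping in one step):

```scala
// A stand-in for an RDD of key/value pairs.
val pairs = Seq(("a", 1), ("b", 2), ("c", 1))

// The partial function both filters (guard: v == 1) and maps (yields k).
val keys = pairs.collect { case (k, v) if v == 1 => k }

println(keys) // List(a, c)
```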

On Wed, Sep 17, 2014 at 3:24 AM, Deep Pradhan <pr...@gmail.com>
wrote:

> Hi,
> I want to make the following change to an RDD (create a new RDD from the
> existing one to reflect some transformation):
> In an RDD of key-value pairs, I want to get the keys for which the values
> are 1.
> How can I do this using map()?
> Thank You
>

RE: Change RDDs using map()

Posted by qihong <qc...@pivotal.io>.
If you want the result as an RDD of (key, 1) pairs:

  new_rdd = rdd.filter(x => x._2 == 1)

If you want the result as an RDD of keys (since you know the values are 1), then:

  new_rdd = rdd.filter(x => x._2 == 1).map(x => x._1)

x._1 and x._2 are Scala's way of accessing the key and the value of a
key/value pair (a tuple).
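A minimal sketch of both variants, again on a plain Scala collection so it
runs without a SparkContext (the names `rdd`, `pairsOfOne`, and `keysOfOne`
are made up for the example; the same filter/map chain applies unchanged to
a Spark RDD of pairs):

```scala
// A stand-in for an RDD of key/value pairs.
val rdd = Seq(("a", 1), ("b", 2), ("c", 1))

// Variant 1: keep the (key, 1) pairs themselves.
val pairsOfOne = rdd.filter(x => x._2 == 1)

// Variant 2: keep only the keys, since the surviving values are all 1.
val keysOfOne = rdd.filter(x => x._2 == 1).map(x => x._1)

println(pairsOfOne) // List((a,1), (c,1))
println(keysOfOne)  // List(a, c)
```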




---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org