Posted to user@spark.apache.org by jamborta <ja...@gmail.com> on 2014/09/15 22:00:57 UTC

Efficient way to sum multiple columns

Hi all,

I have an RDD with around 50 numeric columns. I need to sum each column,
which I am currently doing by looping over the column indices and running
the sum function on each one:

for (i <- 0 until 50) yield {
   // one map + sum, i.e. one full pass over the data, per column
   data.map(x => x(i)).sum
}

Is there a better way to do this?

thanks,






Re: Efficient way to sum multiple columns

Posted by Xiangrui Meng <me...@gmail.com>.
Please check the colStats method defined under mllib.stat.Statistics. -Xiangrui
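
For reference, a minimal sketch of what that could look like, assuming each
row of the data RDD can be converted to an Array[Double] (the conversion
step is an assumption about the row type, not tested code):

import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.stat.Statistics

// Convert each row to an MLlib Vector (assumes rows hold numeric values).
val vectors = data.map(row => Vectors.dense(row.toArray))

// A single pass over the RDD computes statistics for all columns at once.
val summary = Statistics.colStats(vectors)

// colStats exposes per-column means and the row count rather than sums,
// so each column's sum can be recovered as mean * count.
val sums = summary.mean.toArray.map(_ * summary.count)

Computing all 50 sums in one pass avoids launching a separate Spark job per
column, which is the main cost of the loop in the original question.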
