Posted to dev@systemml.apache.org by arijit chakraborty <ak...@hotmail.com> on 2017/07/12 18:44:59 UTC

Spark Core

Hi,


Suppose I've this following code:


# 10x1 column vector with the values 1..10
a = matrix(seq(1, 10), rows=10, cols=1)

for(i in 1:100) {
  b = a + 10
  # one CSV per iteration; without the loop index in the name,
  # every iteration would overwrite the same file
  write(b, "path" + i + ".csv", format="csv")
}


So, for each of 100 iterations, I'm adding a constant to a matrix and then writing the result out. This operation runs on Spark using multiple cores of the system.


My question is: after each operation, does the value (here, b) stay in that core's memory, so that it piles up across iterations? Will this affect the performance of the process? If so, how can I free the memory after each iteration of the loop?
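
For comparison, in plain R I would drop each iteration's result explicitly; rm() and gc() are base R, and I don't know whether SystemML has (or needs) an equivalent:

a <- matrix(seq(1, 10), nrow = 10, ncol = 1)
for (i in 1:100) {
  b <- a + 10
  write.csv(b, paste0("path", i, ".csv"), row.names = FALSE)
  rm(b)  # remove the binding so this iteration's result becomes garbage
  gc()   # ask R to reclaim the memory now rather than later
}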


The reason I ask is that when I test the code in R, the performance is much better than in SystemML. Since the mapping from R to SystemML is almost one-to-one, I'm not sure where I'm making a mistake. Unfortunately, at this stage of the work I can't share the exact code.
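
To illustrate the one-to-one mapping, my R version of the snippet above is essentially identical (the paste0() file naming is just how I build the per-iteration path):

a <- matrix(seq(1, 10), nrow = 10, ncol = 1)
for (i in 1:100) {
  b <- a + 10
  write.csv(b, paste0("path", i, ".csv"), row.names = FALSE)
}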


Thank you!

Arijit