Posted to user@spark.apache.org by Peter Thai <th...@gmail.com> on 2014/12/02 19:37:20 UTC

Re: Negative Accumulators

Similarly, I'm having an issue with the solution above when I use
math.min() to compute the value added to an accumulator: I'm seeing
negative (overflowed) numbers again.

This code works fine without the math.min(), even when I add an arbitrary
constant like 100:

// doesn't work
someRDD.foreach(x => {
  myAccumulator += math.min(x._1, 100)
})

// works
someRDD.foreach(x => {
  myAccumulator += x._1 + 100
})

Any ideas? 
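
For what it's worth, negative values like these are the classic symptom of
fixed-width integer overflow: Int arithmetic in Scala wraps around silently
rather than raising an error. A minimal illustration in the REPL (assuming
the accumulator is backed by Int, which the fix below suggests):

scala> Int.MaxValue
res0: Int = 2147483647

scala> Int.MaxValue + 1
res1: Int = -2147483648

Once the running total passes Int.MaxValue, it wraps to Int.MinValue and
the accumulator turns negative.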




Re: Negative Accumulators

Posted by Peter Thai <th...@gmail.com>.
To answer my own question: I was declaring the accumulator incorrectly.
The code should look like this:

scala> import org.apache.spark.AccumulatorParam
import org.apache.spark.AccumulatorParam

scala> :paste
// Entering paste mode (ctrl-D to finish)

implicit object BigIntAccumulatorParam extends AccumulatorParam[BigInt] {
  // Merge two partial results; BigInt addition cannot overflow.
  def addInPlace(t1: BigInt, t2: BigInt) = t1 + t2
  // The identity element for addition (the initial value is not needed here).
  def zero(initialValue: BigInt) = BigInt(0)
}

// Exiting paste mode, now interpreting.

defined module BigIntAccumulatorParam       

scala> val accu = sc.accumulator(BigInt(0))(BigIntAccumulatorParam)
accu: org.apache.spark.Accumulator[scala.math.BigInt] = 0

scala> accu += 100

scala> accu.value
res1: scala.math.BigInt = 100
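
With that in place, the sum from the earlier snippet can be accumulated
without wrapping. A minimal sketch, assuming someRDD is an RDD of pairs
whose first element is an Int, as in the original post:

someRDD.foreach(x => {
  // Convert to BigInt before adding, so the running total can grow
  // past Int.MaxValue without overflowing.
  accu += BigInt(math.min(x._1, 100))
})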


