Posted to issues@spark.apache.org by "Tien-Dung LE (JIRA)" <ji...@apache.org> on 2015/01/30 14:05:34 UTC
[jira] [Created] (SPARK-5499) iterative computing with 1000 iterations causes stage failure
Tien-Dung LE created SPARK-5499:
-----------------------------------
Summary: iterative computing with 1000 iterations causes stage failure
Key: SPARK-5499
URL: https://issues.apache.org/jira/browse/SPARK-5499
Project: Spark
Issue Type: Bug
Components: Spark Core
Affects Versions: 1.2.0
Reporter: Tien-Dung LE
I got the error "org.apache.spark.SparkException: Job aborted due to stage failure: Task serialization failed: java.lang.StackOverflowError" when executing an action on an RDD built from a chain of 1000 transformations. Each iteration adds one step to the RDD lineage, so serializing the task for the final action recurses through the full 1000-step chain and overflows the stack.
Here is a code snippet to reproduce the error:
import org.apache.spark.rdd.RDD

// Build a lineage chain of 1000 map transformations; each iteration
// wraps the previous RDD in one more map step.
var pair: RDD[(Long, Long)] = sc.parallelize(Array((1L, 2L)))
var newPair: RDD[(Long, Long)] = null
for (i <- 1 to 1000) {
  newPair = pair.map(_.swap)
  pair = newPair
}

// The count() action triggers task serialization over the full
// 1000-step lineage, which throws java.lang.StackOverflowError.
println("Count = " + pair.count())