Posted to user@spark.apache.org by Michel Hubert <mi...@phact.nl> on 2016/05/27 06:55:46 UTC

submitMissingTasks - serialize throws StackOverflow exception

Hi,

My Spark application throws a StackOverflowError after running for a while.
The DAGScheduler method submitMissingTasks serializes a tuple (MapPartitionsRDD, the EsSpark.saveToEs function) when it builds the task binary. Java serialization walks the RDD's dependency chain recursively, and once the lineage is deep enough that recursion overflows the thread stack.
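For context, the job has roughly the shape below (a simplified sketch, not the real code; the loop count, the per-iteration transformation, and the ES index name are placeholders). Every map() wraps the previous RDD in a new MapPartitionsRDD, so the lineage chain grows with each iteration:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.elasticsearch.spark.rdd.EsSpark

    object LineageRepro {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("lineage-repro"))

        // Each map() stacks one more MapPartitionsRDD onto the lineage.
        var rdd = sc.parallelize(1 to 1000).map(i => Map("value" -> i))
        for (_ <- 1 to 5000) {
          rdd = rdd.map(identity) // placeholder for the real per-iteration step
        }

        // When the job is submitted, submitMissingTasks serializes
        // (rdd, saveToEs closure); Java serialization recurses through
        // every parent RDD in the chain and overflows the stack.
        EsSpark.saveToEs(rdd, "myindex/mytype")
      }
    }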

Should I just increase the JVM stack size (e.g. -Xss via spark.driver.extraJavaOptions), given that a bigger heap won't help with a StackOverflowError? Or will the error simply come back later, once the lineage has grown even deeper?

How can I fix this?
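Would periodically truncating the lineage with checkpoint() be the right approach? Something like the untested sketch below, continuing the example above; the checkpoint interval and directory are guesses on my part:

    sc.setCheckpointDir("hdfs:///tmp/spark-checkpoints")

    var rdd = sc.parallelize(1 to 1000).map(i => Map("value" -> i))
    for (i <- 1 to 5000) {
      rdd = rdd.map(identity)
      if (i % 100 == 0) {  // guessed interval
        rdd.checkpoint()   // marks the RDD; the lineage is cut...
        rdd.count()        // ...once an action materializes the checkpoint
      }
    }
    EsSpark.saveToEs(rdd, "myindex/mytype")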

With kind regards,
Michel


