Posted to user@spark.apache.org by Corey Nolet <cj...@gmail.com> on 2014/12/16 21:56:10 UTC

Spark eating exceptions in multi-threaded local mode

I've been running a job in local mode using --master local[*], and I've
noticed that, for some reason, exceptions appear to get eaten, as in I
never see them. If I debug in my IDE and step through the code, I can see
that an exception was thrown, but if I just run the application it looks
as though everything completed, even though I know a bunch of my jobs
never actually ran.

The exception is happening in a map stage.
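
For reference, here's a stripped-down sketch of the shape of the job (not
my actual code; names and the forced failure are simplified for
illustration):

    import org.apache.spark.{SparkConf, SparkContext}

    object ExceptionRepro {
      def main(args: Array[String]): Unit = {
        // Same setup I'm using: multi-threaded local mode.
        val conf = new SparkConf()
          .setAppName("exception-repro")
          .setMaster("local[*]")
        val sc = new SparkContext(conf)

        val result = sc.parallelize(1 to 10)
          .map { i =>
            // In my real job the failure is data-dependent;
            // here it's just forced on one element.
            if (i == 5) throw new IllegalStateException(s"boom on $i")
            i * 2
          }

        // I expected the task failure to propagate to the driver here,
        // but when I run the application normally I don't see it.
        result.foreach(println)

        sc.stop()
      }
    }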

Is there a special way this is supposed to be handled? Am I missing a
property somewhere that allows these to be bubbled up?