Posted to user@spark.apache.org by pnpritchard <ni...@falkonry.com> on 2015/10/12 22:06:04 UTC

Re: why would a spark Job fail without throwing run-time exceptions?

I'm not sure why Spark isn't showing the runtime exception in the logs.
However, I think I can point out why the stage is failing.

1. "lineMapToStockPriceInfoObjectRDD.map(new stockDataFilter(_).requirementsMet.get)"
The ".get" will throw a runtime exception (a NoSuchElementException) whenever
"requirementsMet" is None. I would suggest rewriting it as either of these
(both give an equivalent result):
- "lineMapToStockPriceInfoObjectRDD.flatMap(new stockDataFilter(_).requirementsMet)"
- "lineMapToStockPriceInfoObjectRDD.filter(new stockDataFilter(_).isWithinTradingSession)"

2. "case true => Some(s).get"
This should give a compile error. You should remove the ".get": "case true =>
Some(s)"
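For reference, here is a minimal sketch of the corrected shape (StockPriceInfo
and the method signature are assumptions, since the original class isn't shown).
With ".get" removed, both branches of the match return an Option:

```scala
// Hypothetical simplified version, assuming requirementsMet is declared
// to return Option[StockPriceInfo].
case class StockPriceInfo(symbol: String, price: Double)

def requirementsMet(s: StockPriceInfo, isWithinTradingSession: Boolean): Option[StockPriceInfo] =
  isWithinTradingSession match {
    case true  => Some(s)  // not Some(s).get, which would unwrap back to a bare StockPriceInfo
    case false => None
  }
```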

-Nick



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/why-would-a-spark-Job-fail-without-throwing-run-time-exceptions-tp25002p25034.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org