Posted to user@spark.apache.org by Sumona Routh <su...@gmail.com> on 2016/02/10 23:51:29 UTC

SparkListener - why is org.apache.spark.scheduler.JobFailed in scala private?

Hi there,
I am trying to create a listener for my Spark job to do some additional
notifications for failures using this Scala API:
https://spark.apache.org/docs/1.2.1/api/scala/#org.apache.spark.scheduler.JobResult

My idea was to write something like this:

override def onJobEnd(jobEnd: SparkListenerJobEnd): Unit = {
  jobEnd.jobResult match {
    case JobFailed(exception) => // do stuff here
  }
}

However, the JobFailed class is package private, so I cannot do this. Its
sibling, JobSucceeded, is public, but I specifically want to handle failure
scenarios and be able to inspect the exception.
I did notice that the corresponding class in the Java API is public.

Is there another pattern I should follow to handle failures?
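The one workaround I can see is that, since JobSucceeded is public, any other
result can be treated as a failure (losing direct access to the exception).
Here is a minimal self-contained sketch of that idea; the SparkShim object and
classify function are my own stand-ins that mimic the shape of Spark's sealed
JobResult hierarchy, not Spark's actual API:

```scala
// Stand-in for Spark's scheduler types, modeling the visibility problem:
// JobSucceeded is public, JobFailed is restricted to its enclosing scope.
object SparkShim {
  sealed trait JobResult
  case object JobSucceeded extends JobResult
  // Restricted like Spark's package-private JobFailed; outside code
  // cannot name it or destructure it in a pattern match.
  private[SparkShim] case class JobFailed(exception: Exception) extends JobResult
  // Helper so the sketch can construct a failure value for testing.
  def failedResult(e: Exception): JobResult = JobFailed(e)
}

// The workaround: match only on the public JobSucceeded and treat
// everything else as a failure. We can still log other.toString, but
// we cannot pull the exception out directly.
def classify(result: SparkShim.JobResult): String =
  result match {
    case SparkShim.JobSucceeded => "succeeded"
    case other                  => s"failed: $other"
  }
```

This compiles because no code outside the restricted scope ever names
JobFailed, but it is clearly weaker than matching JobFailed(exception)
directly, which is why I am asking about the intended pattern.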

Thanks!