Posted to dev@spark.apache.org by Allianzcortex <ia...@gmail.com> on 2017/05/22 07:15:19 UTC

# Confused: should we use a more specific exception? #

Hello. What I'm asking may be a very simple question, but it still troubles
me.

Before asking, I read http://spark.apache.org/contributing.html and
searched the mailing list, but I'm still not sure. ^_^

The question is this:

In the org.apache.spark.streaming.dstream package (Spark, Scala 2.11 build),
there is a function like the one below:

```
override def slideDuration: Duration = {
  if (ssc == null) throw new Exception("ssc is null")
  if (ssc.graph.batchDuration == null) throw new Exception("batchDuration is null")
  ssc.graph.batchDuration
}
```

But wouldn't it be better to use a more specific exception type, like
this?

```
override def slideDuration: Duration = {
  import org.apache.spark.SparkException
  if (ssc == null) throw new SparkException("ssc is null")
  if (ssc.graph.batchDuration == null) throw new SparkException("batchDuration is null")
  ssc.graph.batchDuration
}
```

But then again, the process will exit once the exception is thrown, so it
seems the error message alone may be clear enough...
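
On the other hand, maybe one argument for the specific type is that a
caller can only react to an exception's type, not to its message. A minimal
sketch of what I mean (the caller here is hypothetical, not real Spark
code):

```
import org.apache.spark.SparkException

// Hypothetical caller: with a specific exception type, Spark-level
// failures can be handled selectively, while unrelated exceptions
// still propagate to the top.
def printSlideDuration(slideDuration: () => org.apache.spark.streaming.Duration): Unit = {
  try {
    println(s"slide duration: ${slideDuration()}")
  } catch {
    case e: SparkException =>
      println(s"Spark-level problem: ${e.getMessage}")
    // A bare `new Exception(...)` could only be caught as
    // `case e: Exception`, which would also swallow every other failure.
  }
}
```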

But a code change like this is somewhat like a typo fix, so how much sense
does it make to consume a committer's time to review it?

I'm confused.



