Posted to issues@spark.apache.org by "Sandeep Singh (JIRA)" <ji...@apache.org> on 2014/05/04 11:43:14 UTC

[jira] [Commented] (SPARK-1710) spark-submit should print better errors than "InvocationTargetException"

    [ https://issues.apache.org/jira/browse/SPARK-1710?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13988952#comment-13988952 ] 

Sandeep Singh commented on SPARK-1710:
--------------------------------------

https://github.com/apache/spark/pull/630

> spark-submit should print better errors than "InvocationTargetException"
> ------------------------------------------------------------------------
>
>                 Key: SPARK-1710
>                 URL: https://issues.apache.org/jira/browse/SPARK-1710
>             Project: Spark
>          Issue Type: Improvement
>    Affects Versions: 1.0.0
>            Reporter: Matei Zaharia
>            Assignee: Sandeep Singh
>            Priority: Minor
>              Labels: Starter
>
> It's not horrible, but it is a bit confusing that exceptions in your driver program get hidden inside InvocationTargetException:
> {code}
> matei@mbp-3:~/workspace/apache-spark$ bin/spark-submit --class SparkTest ../spark-test/target/scala-2.10/simple-project_2.10-1.0.jar 
> Exception in thread "main" java.lang.reflect.InvocationTargetException
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:606)
> 	at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:256)
> 	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:54)
> 	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> Caused by: org.apache.spark.SparkException: An application name must be set in your configuration
> 	at org.apache.spark.SparkContext.<init>(SparkContext.scala:163)
> 	at SparkTest$.main(Test.scala:7)
> 	at SparkTest.main(Test.scala)
> 	... 7 more
> {code}
> It would be better to print just the stack trace of the nested exception (a sketch of this follows below).
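
For illustration only, a minimal sketch of the kind of unwrapping the issue asks for, assuming the driver's main method is invoked via reflection as the trace above shows. The object name LaunchSketch and method runMain are hypothetical; this is not the actual SparkSubmit code nor the contents of the linked pull request.

{code}
import java.lang.reflect.InvocationTargetException

// Hypothetical helper, for illustration only (not SparkSubmit itself).
object LaunchSketch {
  def runMain(mainClass: String, args: Array[String]): Unit = {
    val clazz = Class.forName(mainClass)                              // the user's driver class
    val mainMethod = clazz.getMethod("main", classOf[Array[String]])  // its static main(String[])
    try {
      // Reflection wraps anything main() throws in InvocationTargetException.
      mainMethod.invoke(null, args)
    } catch {
      case e: InvocationTargetException if e.getCause != null =>
        // Print only the nested exception (e.g. the SparkException above),
        // so the trace points at the user's code rather than the wrapper.
        e.getCause.printStackTrace()
        System.exit(1)
    }
  }
}
{code}

Rethrowing e.getCause instead of printing it would surface the real error at the top of the trace just as well; the key point is that the InvocationTargetException wrapper is not what the user sees first.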



--
This message was sent by Atlassian JIRA
(v6.2#6252)