Posted to dev@spark.apache.org by sara mustafa <en...@gmail.com> on 2015/04/03 18:22:13 UTC

IntelliJ Runtime error

Hi,

I have built Spark 1.3.0 successfully in IntelliJ IDEA 14, but when I try to run
the SparkPi example under the examples module I get this error:
Exception in thread "main" java.lang.NoClassDefFoundError:
org/apache/spark/SparkConf
	at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:27)
	at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.SparkConf
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	... 7 more

Could anyone help me please?





Re: IntelliJ Runtime error

Posted by Stephen Boesch <ja...@gmail.com>.
Thanks Cheng. Yes, the problem is that the setup needed to run Spark inside
IntelliJ changes very frequently. It is unfortunately not a one-time
investment to get IntelliJ debugging working properly: the required steps are
a moving target that shifts roughly every month or two.

Doing remote debugging is probably a good way to reduce that dev-environment
volatility and maintenance.



2015-04-04 5:46 GMT-07:00 Cheng Lian <li...@gmail.com>:

> I have found that, in general, it's a pain to build/run Spark inside IntelliJ
> IDEA. I guess most people resort to this approach so that they can leverage the
> integrated debugger to debug and/or learn Spark internals. A more convenient
> way I have been using recently is the remote debugging feature. With it, by
> adding driver/executor Java options, you can build and start the Spark
> applications/tests/daemons in the normal way and attach the debugger to them. I
> used this to debug HiveThriftServer2, and it worked perfectly.
>
> Steps to enable remote debugging:
>
> 1. Menu "Run / Edit configurations..."
> 2. Click the "+" button, choose "Remote"
> 3. Choose "Attach" or "Listen" in "Debugger mode" according to your actual
> needs
> 4. Copy, edit, and add Java options suggested in the dialog to
> `--driver-java-options` or `--executor-java-options`
> 5. If you're using attaching mode, first start your Spark program, then
> start remote debugging in IDEA
> 6. If you're using listening mode, first start remote debugging in IDEA,
> and then start your Spark program.
>
> Hope this can be helpful.
>
> Cheng
>
>
> On 4/4/15 12:54 AM, sara mustafa wrote:
>
>> Thank you, it worked for me when I changed the dependencies from provided
>> to compile.

Re: IntelliJ Runtime error

Posted by Cheng Lian <li...@gmail.com>.
I have found that, in general, it's a pain to build/run Spark inside
IntelliJ IDEA. I guess most people resort to this approach so that they
can leverage the integrated debugger to debug and/or learn Spark
internals. A more convenient way I have been using recently is the
remote debugging feature. With it, by adding driver/executor Java
options, you can build and start the Spark applications/tests/daemons
in the normal way and attach the debugger to them. I used this to debug
HiveThriftServer2, and it worked perfectly.

Steps to enable remote debugging:

1. Menu "Run / Edit configurations..."
2. Click the "+" button, choose "Remote"
3. Choose "Attach" or "Listen" in "Debugger mode" according to your 
actual needs
4. Copy, edit, and add the Java options suggested in the dialog to
`--driver-java-options` or `--executor-java-options` (see the sketch after
this list)
5. If you're using attaching mode, first start your Spark program, then 
start remote debugging in IDEA
6. If you're using listening mode, first start remote debugging in IDEA, 
and then start your Spark program.
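
For concreteness, here is a rough sketch of step 4 in "Attach" mode, debugging
the driver of a spark-submit run. The port (5005), the class, and the jar path
are placeholders rather than values from this thread; the agent string is the
standard JDWP form that the IDEA dialog suggests. Note that spark-submit exposes
`--driver-java-options` for the driver, while executor-side options normally go
through the `spark.executor.extraJavaOptions` configuration.

    # JDWP agent options for "Attach" mode: the driver JVM listens on port 5005
    # and waits (suspend=y) until the debugger connects.
    JDWP_OPTS="-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5005"

    # Start the application with the agent enabled on the driver, then attach
    # the IDEA "Remote" run configuration to localhost:5005.
    ./bin/spark-submit \
      --class org.apache.spark.examples.SparkPi \
      --driver-java-options "$JDWP_OPTS" \
      /path/to/spark-examples.jar 100

    # To debug executors instead, each executor JVM needs the agent as well, e.g.:
    #   --conf "spark.executor.extraJavaOptions=$JDWP_OPTS"
    # (every executor on a given host would need its own free port)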

Hope this can be helpful.

Cheng

On 4/4/15 12:54 AM, sara mustafa wrote:
> Thank you, it worked for me when I changed the dependencies from provided to
> compile.


RE: IntelliJ Runtime error

Posted by sara mustafa <en...@gmail.com>.
Thank you, it worked for me when I changed the dependencies from provided to
compile.





RE: IntelliJ Runtime error

Posted by java8964 <ja...@hotmail.com>.
You have to change most of the dependencies in the spark-examples module from "provided" to "compile" so that you can run the example in IntelliJ.
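
For illustration, the change in the examples module's pom.xml would look roughly
like the sketch below. The artifact id and Scala version suffix are assumptions
based on a Spark 1.3.x checkout, and spark-core is only one of several
dependencies that need the same treatment. Since the "provided" scope is
intentional for packaging, this edit is usually kept local rather than committed.

    <!-- examples/pom.xml (sketch): Spark dependencies are marked "provided"
         so they stay out of the examples assembly; switching the scope to
         "compile" puts them on IntelliJ's runtime classpath. -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>${project.version}</version>
      <scope>compile</scope>  <!-- was: provided -->
    </dependency>
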
Yong

> Date: Fri, 3 Apr 2015 09:22:13 -0700
> From: eng.sara.mustafa@gmail.com
> To: dev@spark.apache.org
> Subject: IntelliJ Runtime error
> 
> Hi,
> 
> I have built Spark 1.3.0 successfully in IntelliJ IDEA 14, but when I try to run
> the SparkPi example under the examples module I get this error:
> Exception in thread "main" java.lang.NoClassDefFoundError:
> org/apache/spark/SparkConf
> 	at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:27)
> 	at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> 	at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:606)
> 	at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
> Caused by: java.lang.ClassNotFoundException: org.apache.spark.SparkConf
> 	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
> 	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> 	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> 	... 7 more
> 
> Could anyone help me please?