Posted to user@spark.apache.org by Stephen Boesch <ja...@gmail.com> on 2017/10/11 21:48:25 UTC

Running Spark examples in IntelliJ

When attempting to run any example program with IntelliJ I run into
Guava versioning issues:

Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/cache/CacheLoader
    at org.apache.spark.SparkConf.loadFromSystemProperties(SparkConf.scala:73)
    at org.apache.spark.SparkConf.<init>(SparkConf.scala:68)
    at org.apache.spark.SparkConf.<init>(SparkConf.scala:55)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:919)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:918)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:918)
    at org.apache.spark.examples.ml.KMeansExample$.main(KMeansExample.scala:40)
    at org.apache.spark.examples.ml.KMeansExample.main(KMeansExample.scala)
Caused by: java.lang.ClassNotFoundException: com.google.common.cache.CacheLoader
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 9 more

The *scope*s for the Spark dependencies were already changed from
*provided* to *compile*. Both `sbt assembly` and `mvn package` had already
been run (successfully) from the command line, and the (mvn) project was
completely rebuilt inside IntelliJ.
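
For concreteness, the sbt equivalent of that scope change looks roughly
like the following (a minimal sketch; the module and version are
illustrative, not copied from the actual build files):

    // build.sbt (sketch): with no configuration qualifier the dependency
    // defaults to the compile scope, so the jar lands on the runtime
    // classpath when launching from the IDE.
    libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.0-SNAPSHOT"

    // The original form, which keeps the jar off the runtime classpath:
    // libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.0-SNAPSHOT" % "provided"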

The Spark test cases run fine; this is a problem only in the examples
module. Is anyone running these successfully in IJ? I have tried
2.1.0-SNAPSHOT and 2.3.0-SNAPSHOT, with the same outcome.
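
A quick way to see where (if anywhere) the missing class comes from at
run time is a one-off check along these lines (a sketch; the object name
is mine):

    // Sketch: print which jar or class directory CacheLoader is loaded
    // from. Class.forName throws ClassNotFoundException when the class is
    // absent from the classpath, matching the failure above.
    object WhereIsCacheLoader {
      def main(args: Array[String]): Unit = {
        val cls = Class.forName("com.google.common.cache.CacheLoader")
        val src = Option(cls.getProtectionDomain.getCodeSource)
        println(src.map(_.getLocation).getOrElse("bootstrap classpath"))
      }
    }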

Re: Running Spark examples in IntelliJ

Posted by Stephen Boesch <ja...@gmail.com>.
Thinking more carefully about your comment:

   - There may be some ambiguity as to whether the repo-provided libraries
   are actually being used here, as you indicate, instead of the
   in-project classes. That would depend on how the classpath inside IJ is
   constructed. When I click through any of the Spark classes in the IDE
   they do go directly to the in-project versions and not the repo jars,
   but that may not be definitive.
   - In any case I had already performed the Maven install, and just
   verified that the jars do have the correct timestamps in the local
   Maven repo (see the sketch after this list).
   - The local Maven repo is included by default, so nothing special
   should be needed there.
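
The timestamp check was done with something along these lines (a sketch;
the object name and the walk over ~/.m2 are mine):

    // Sketch: list the Spark jars under the local Maven repo together
    // with their last-modified times, to confirm that `mvn install`
    // actually refreshed them.
    import java.nio.file.{Files, Paths}
    import scala.collection.JavaConverters._

    object CheckRepoJars {
      def main(args: Array[String]): Unit = {
        val sparkRepo = Paths.get(
          sys.props("user.home"), ".m2", "repository", "org", "apache", "spark")
        Files.walk(sparkRepo).iterator.asScala
          .filter(_.toString.endsWith(".jar"))
          .foreach(p => println(s"${Files.getLastModifiedTime(p)}  $p"))
      }
    }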

The same errors from the original post continue to occur.



Re: Running Spark examples in IntelliJ

Posted by Stephen Boesch <ja...@gmail.com>.
A clarification here: the example is being run *from the Spark codebase*.
Therefore the `mvn install` step should not be required, as the classes
are available directly within the project.

The reason `mvn package` needs to be invoked is to pick up the change of
the Spark dependency scopes from *provided* to *compile*.

As mentioned, the Spark unit tests are working (within IntelliJ and
without `mvn install`); only the examples are not.
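
To pin down whether IJ puts the module output directories or the repo
jars on the run classpath, a one-off like this can help (a sketch; the
object name is mine):

    // Sketch: dump the runtime classpath one entry per line; grep the
    // output for "guava" to see whether any Guava jar actually made it.
    object PrintClasspath {
      def main(args: Array[String]): Unit =
        sys.props("java.class.path")
          .split(java.io.File.pathSeparator)
          .foreach(println)
    }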


Re: Running Spark examples in IntelliJ

Posted by Paul <co...@gmail.com>.
You say you did the Maven package, but did you do a Maven install and define your local Maven repo in sbt?
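
In sbt that definition is a one-liner, something like (a sketch):

    // build.sbt (sketch): resolve artifacts installed by `mvn install`
    // from the local repo at ~/.m2/repository.
    resolvers += Resolver.mavenLocal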

-Paul

Sent from my iPhone
