Posted to dev@flink.apache.org by Gyula Fora <gy...@apache.org> on 2014/10/01 09:40:53 UTC

Re: Eclipse issue after latest dependency rework pr

That worked, thanks.

On 30 Sep 2014, at 23:12, Stephan Ewen <se...@apache.org> wrote:

> Yes, I wanted to write a few pointers about such issues.
> 
> We recently shaded the Guava dependency, meaning that we have a custom
> version of Guava whose classes reside in
> "org.apache.flink.shaded.com.google", and Maven rewrites all our Guava
> references to that namespace.
> 
> The reason is that Guava is a frequently used library and its versions are
> not all compatible. For example, Hadoop 2.5 uses Guava 11 and fails if we
> bring Guava 17 into the classpath. The shading keeps our Guava version from
> conflicting, because our shaded version's classes reside in a different
> namespace.
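> 
> As background, the relocation itself is done by the maven-shade-plugin.
> A minimal sketch of that kind of configuration (illustrative wiring, not
> our exact pom):
> 
> <plugin>
>   <groupId>org.apache.maven.plugins</groupId>
>   <artifactId>maven-shade-plugin</artifactId>
>   <executions>
>     <execution>
>       <!-- shading runs in the package phase, which is why Eclipse,
>            compiling directly, never sees the relocated classes -->
>       <phase>package</phase>
>       <goals>
>         <goal>shade</goal>
>       </goals>
>       <configuration>
>         <relocations>
>           <relocation>
>             <!-- rewrite Guava references into Flink's own namespace -->
>             <pattern>com.google</pattern>
>             <shadedPattern>org.apache.flink.shaded.com.google</shadedPattern>
>           </relocation>
>         </relocations>
>       </configuration>
>     </execution>
>   </executions>
> </plugin>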
> 
> Since Eclipse does not interpret the shading (it happens only in the Maven
> package phase), you need a regular dependency on Guava that is not exported
> ("provided" scope).
> 
> Try adding this to the project's pom; it tells Maven to compile and run
> with the Guava lib, but not to package it:
> 
> <dependency>
>   <groupId>com.google.guava</groupId>
>   <artifactId>guava</artifactId>
>   <version>${guava.version}</version>
>   <scope>provided</scope>
> </dependency>
> 
> 
> 
> On Tue, Sep 30, 2014 at 11:02 PM, Gyula Fóra <gy...@apache.org> wrote:
> 
>> Hey,
>> 
>> We have pulled the dependency rework from the Apache master and now we
>> cannot get our tests to run in Eclipse. With Maven from the command line
>> and also with Travis it works perfectly, but when I try to run tests that
>> access the Configuration class we get the following exception:
>> 
>> java.lang.NoClassDefFoundError: com/google/common/io/BaseEncoding
>> at org.apache.flink.configuration.Configuration.setBytes(Configuration.java:358)
>> at
>> .....
>> some stuff here
>> .....
>> 
>> Caused by: java.lang.ClassNotFoundException:
>> com.google.common.io.BaseEncoding
>> at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>> at java.security.AccessController.doPrivileged(Native Method)
>> at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>> at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>> at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>> ... 32 more
>> 
>> I have literally tried everything to make it work, deleted all the Maven
>> files, reinstalled Eclipse, etc., but still no luck. Do you experience the
>> same issue when running the streaming-core tests in Eclipse?
>> 
>> Regards,
>> Gyula
>> 


Re: Eclipse issue after latest dependency rework pr

Posted by Fabian Hueske <fh...@apache.org>.
Yes, I had the same error when I tried to run the batch WordCount in
Eclipse.
For me, adding the dependency to flink-parent did not work, but adding it
to flink-examples-java did.
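
For reference, this is the shape of that addition in the examples module's
pom (surrounding pom content omitted):

<!-- flink-examples-java pom.xml -->
<dependency>
  <groupId>com.google.guava</groupId>
  <artifactId>guava</artifactId>
  <version>${guava.version}</version>
  <scope>provided</scope>
</dependency>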

I guess the same problem will occur when running a test case that starts
Flink from Eclipse.

Fabian



Re: Eclipse issue after latest dependency rework pr

Posted by Fabian Hueske <fh...@apache.org>.
Perfect! Ran a few Java examples, Scala examples, and integration tests in
Eclipse.
Thanks for the quick fix!


Re: Eclipse issue after latest dependency rework pr

Posted by Stephan Ewen <se...@apache.org>.
It works.

The current master should work for all programs started out of
Eclipse/IntelliJ (tested by Aljoscha and me) and exposes correct
dependencies through Maven.

Please post if you have further trouble.




Re: Eclipse issue after latest dependency rework pr

Posted by Stephan Ewen <se...@apache.org>.
Just had an idea:

We can let the IDEs export Guava from the "flink-shaded" project and have
the Maven builds use the autogenerated dependency-reduced pom to hide the
original Guava.

Let me try this...
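
Roughly, the sketch I have in mind (illustrative wiring, not the final pom):

<!-- flink-shaded pom: a plain Guava dependency, so IDEs that resolve this
     module export the real Guava classes to downstream projects -->
<dependency>
  <groupId>com.google.guava</groupId>
  <artifactId>guava</artifactId>
  <version>${guava.version}</version>
</dependency>

<!-- the shade plugin relocates Guava at package time and writes a
     dependency-reduced-pom.xml without the Guava entry, so Maven consumers
     of the published artifact never see the unshaded classes -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <createDependencyReducedPom>true</createDependencyReducedPom>
    <relocations>
      <relocation>
        <pattern>com.google</pattern>
        <shadedPattern>org.apache.flink.shaded.com.google</shadedPattern>
      </relocation>
    </relocations>
  </configuration>
</plugin>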


Re: Eclipse issue after latest dependency rework pr

Posted by Stephan Ewen <se...@apache.org>.
Sorry for the hassle. I know this whole dependency thing is tedious, but I
think it is important that we get our Guava out of the way. Otherwise,
anyone who uses Flink will end up in the same trouble that we are in with
Hadoop (which did not shade its Guava).

I had the same issue. In a prior version, we had Guava as provided in the
parent pom.

The Hadoop dependency (which needs Guava) assumed Guava would be provided and
did not add it, ultimately leading to failures. We need to make sure that a
dependency that depends on Guava adds its own Guava version, while our own
code must not add it (our code must only assume its presence; it is rewritten
to the shaded version anyway).
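
Concretely, the failure mode with the old setup was roughly this (a sketch;
module and artifact names are illustrative):

<!-- flink-parent pom: a direct declaration wins over Hadoop's transitive
     one under Maven's nearest-definition rule, so Guava was treated as
     provided everywhere and never packaged... -->
<dependency>
  <groupId>com.google.guava</groupId>
  <artifactId>guava</artifactId>
  <version>${guava.version}</version>
  <scope>provided</scope>
</dependency>

<!-- ...but Hadoop really needs its Guava on the classpath at runtime, and
     with its transitive copy suppressed, things failed -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-common</artifactId>
  <version>2.5.0</version>
</dependency>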

I am not sure how to simplify that; I tried various things for many, many
hours and did not find a better approach.

The only alternative I could come up with is shading Guava in each of our
projects separately, which means that the Guava classes would be added to
each project's jar file, in a different namespace. Since we have 10+
projects depending on Guava, we would get Guava 10+ times into our jars in
total. Not a clean solution either (although it works, as the class loaders
ignore duplicate classes).

What would be cool is if we had a Maven option to "unprovide" Guava. Then we
could provide it in the parent pom and unprovide it for the Hadoop
dependency.






Re: Eclipse issue after latest dependency rework pr

Posted by Márton Balassi <ba...@gmail.com>.
I've just updated the streaming fork with Stephan's recent commit
addressing this issue, namely:
https://github.com/mbalassi/incubator-flink/commit/949699dbfe17b62352413769635aed3aaff56100

It solves the problem for the streaming-core project, but running the batch
WordCount example in Eclipse still produces the following:

Executing WordCount example with built-in default data.
  Provide parameters to read input data from a file.
  Usage: WordCount <text path> <result path>
Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/base/Preconditions
at org.apache.flink.api.common.operators.util.UserCodeObjectWrapper.<init>(UserCodeObjectWrapper.java:40)
at org.apache.flink.api.common.operators.base.GenericDataSourceBase.<init>(GenericDataSourceBase.java:58)
at org.apache.flink.api.java.operators.DataSource.translateToDataFlow(DataSource.java:75)
at org.apache.flink.api.java.operators.OperatorTranslation.translate(OperatorTranslation.java:82)
at org.apache.flink.api.java.operators.OperatorTranslation.translateSingleInputOperator(OperatorTranslation.java:117)
at org.apache.flink.api.java.operators.OperatorTranslation.translate(OperatorTranslation.java:85)
at org.apache.flink.api.java.operators.OperatorTranslation.translateSingleInputOperator(OperatorTranslation.java:117)
at org.apache.flink.api.java.operators.OperatorTranslation.translate(OperatorTranslation.java:85)
at org.apache.flink.api.java.operators.OperatorTranslation.translate(OperatorTranslation.java:60)
at org.apache.flink.api.java.operators.OperatorTranslation.translateToPlan(OperatorTranslation.java:48)
at org.apache.flink.api.java.ExecutionEnvironment.createProgramPlan(ExecutionEnvironment.java:650)
at org.apache.flink.api.java.LocalEnvironment.execute(LocalEnvironment.java:48)
at org.apache.flink.examples.java.wordcount.WordCount.main(WordCount.java:82)
Caused by: java.lang.ClassNotFoundException: com.google.common.base.Preconditions
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 13 more

Of course one approach is to move

<dependency>
  <groupId>com.google.guava</groupId>
  <artifactId>guava</artifactId>
  <version>${guava.version}</version>
  <scope>provided</scope>
</dependency>

up to the flink-parent pom, but this ultimately works against the shading,
and the Hadoop 2.5 build is going to fail, as it does here:

https://travis-ci.org/mbalassi/incubator-flink/jobs/36780802

Any suggestions? :)
