Posted to user@spark.apache.org by oggie <go...@gmail.com> on 2015/10/06 14:57:44 UTC

compatibility issue with Jersey2

I have some Jersey compatibility issues after trying to upgrade from Spark 1.3.1 to 1.4.1.

We have a Java app written against Spark 1.3.1. That app also uses the Jersey 2.9
client to make external calls. Spark 1.4.1, however, pulls in Jersey 1.9.

In 1.3.1 we were able to add some exclusions to our pom and everything
worked fine. But now it seems there is extra logic in Spark that needs the
classes those exclusions remove. If I remove the exclusions, our code that
uses Jersey 2 fails, and I don't really want to downgrade our code to Jersey 1.

The error is:

java.lang.NoClassDefFoundError:
com/sun/jersey/spi/container/servlet/ServletContainer
...
at
org.apache.spark.status.api.v1.ApiRootResource$.getServletHandler(ApiRootResource.scala:174)


Is there anything that can be done in the pom to fix this? Here's what we
have right now in our pom:
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.10</artifactId>
            <version>1.4.1</version>
            <exclusions>
                <exclusion>
                    <groupId>com.sun.jersey</groupId>
                    <artifactId>jersey-core</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>com.sun.jersey</groupId>
                    <artifactId>jersey-client</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>com.sun.jersey</groupId>
                    <artifactId>jersey-server</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>com.sun.jersey</groupId>
                    <artifactId>jersey-json</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>com.sun.jersey.contribs</groupId>
                    <artifactId>jersey-guice</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>com.sun.jersey</groupId>
                    <artifactId>jersey-grizzly2</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>com.sun.jersey.jersey-test-framework</groupId>
                    <artifactId>jersey-test-framework-core</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>com.sun.jersey.jersey-test-framework</groupId>
                    <artifactId>jersey-test-framework-grizzly2</artifactId>
                </exclusion>
            </exclusions>
            <scope>provided</scope>
        </dependency>


        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-streaming_2.10</artifactId>
            <version>1.4.1</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-sql_2.10</artifactId>
            <version>1.4.1</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.glassfish.jersey.core</groupId>
            <artifactId>jersey-client</artifactId>
            <version>2.9</version>
        </dependency>




--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/compatibility-issue-with-Jersey2-tp24951.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


Re: compatibility issue with Jersey2

Posted by SimonL <si...@gmail.com>.
Hi, I'm a new subscriber. Has there been any solution to the issue below?

Many thanks,
Simon






Re: compatibility issue with Jersey2

Posted by Ted Yu <yu...@gmail.com>.
Maybe build Spark with -Djersey.version=2.9?
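If rebuilding is an option, a sketch of such a build might look like this (it assumes the Spark 1.4 pom actually honors a jersey.version property, and that Spark's own Jersey 1.x code compiles against 2.9 — neither is verified here):

```shell
# Hypothetical: rebuild Spark 1.4.1 with an overridden Jersey version.
git checkout v1.4.1
build/mvn -DskipTests -Djersey.version=2.9 clean package
```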

Cheers

On Tue, Oct 6, 2015 at 5:57 AM, oggie <go...@gmail.com> wrote:

> I have some jersey compatibility issues when I tried to upgrade from 1.3.1
> to
> 1.4.1..
> ...

Re: compatibility issue with Jersey2

Posted by Mingyu Kim <mk...@palantir.com>.
Hi all,

I filed https://issues.apache.org/jira/browse/SPARK-11081. Since Jersey's surface area is relatively small and it seems to be used only for the Spark UI and the JSON API, shading the dependency might make sense, similar to what was done for the Jetty dependencies in https://issues.apache.org/jira/browse/SPARK-3996. Would this be reasonable?

Mingyu
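
For illustration, the shading proposed above would look roughly like this in Spark's build (the relocation pattern and shaded prefix are hypothetical, not something that was actually committed):

```xml
<!-- Sketch of a maven-shade-plugin relocation for Spark's Jersey 1.x classes.
     The shaded prefix org.spark-project.jersey is illustrative only. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <relocations>
      <relocation>
        <pattern>com.sun.jersey</pattern>
        <shadedPattern>org.spark-project.jersey</shadedPattern>
      </relocation>
    </relocations>
  </configuration>
</plugin>
```

With a relocation like this, Spark's internal references to com.sun.jersey would be rewritten into the shaded namespace, leaving applications free to bring their own Jersey 2.x on the plain classpath.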







On 10/7/15, 11:26 AM, "Marcelo Vanzin" <va...@cloudera.com> wrote:

>Seems like you might be running into
>https://issues.apache.org/jira/browse/SPARK-10910. I've been busy with
>other things but plan to take a look at that one when I find time...
>right now I don't really have a solution, other than making sure your
>application's jars do not include those classes the exception is
>complaining about.
>
>On Wed, Oct 7, 2015 at 10:23 AM, Gary Ogden <go...@gmail.com> wrote:
>> ...
>
>-- 
>Marcelo

Re: compatibility issue with Jersey2

Posted by Marcelo Vanzin <va...@cloudera.com>.
Seems like you might be running into
https://issues.apache.org/jira/browse/SPARK-10910. I've been busy with
other things but plan to take a look at that one when I find time...
right now I don't really have a solution, other than making sure your
application's jars do not include those classes the exception is
complaining about.

On Wed, Oct 7, 2015 at 10:23 AM, Gary Ogden <go...@gmail.com> wrote:
> ...



-- 
Marcelo



Re: compatibility issue with Jersey2

Posted by Gary Ogden <go...@gmail.com>.
What you suggested seems to have worked for unit tests. But now it throws
this at run time on Mesos with spark-submit:

Exception in thread "main" java.lang.LinkageError: loader constraint
violation: when resolving method
"org.slf4j.impl.StaticLoggerBinder.getLoggerFactory()Lorg/slf4j/ILoggerFactory;"
the class loader (instance of
org/apache/spark/util/ChildFirstURLClassLoader) of the current class,
org/slf4j/LoggerFactory, and the class loader (instance of
sun/misc/Launcher$AppClassLoader) for resolved class,
org/slf4j/impl/StaticLoggerBinder, have different Class objects for
the type LoggerFactory; used in the signature
	at org.slf4j.LoggerFactory.getILoggerFactory(LoggerFactory.java:336)
	at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:284)
	at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:305)
	at com.company.spark.utils.SparkJob.<clinit>(SparkJob.java:41)
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Unknown Source)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:634)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)


On 6 October 2015 at 16:20, Marcelo Vanzin <va...@cloudera.com> wrote:

> ...

Re: compatibility issue with Jersey2

Posted by Marcelo Vanzin <va...@cloudera.com>.
On Tue, Oct 6, 2015 at 12:04 PM, Gary Ogden <go...@gmail.com> wrote:
> But we run unit tests differently in our build environment, which is
> throwing the error. It's setup like this:
>
> I suspect this is what you were referring to when you said I have a problem?

Yes, that is what I was referring to. But, in your test environment,
you might be able to work around the problem by setting
"spark.ui.enabled=false"; that should disable all the code that uses
Jersey, so you can use your newer version in your unit tests.
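
Applied to a JavaSparkContext-based test setup like the one described in this thread, the workaround amounts to one extra conf entry. A sketch (assumes spark-core on the test classpath; class and method names are illustrative):

```java
// Sketch of a unit-test context factory with the Spark UI disabled.
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class SparkTestSetup {
    static JavaSparkContext createTestContext() {
        SparkConf conf = new SparkConf()
                .setMaster("local[*]")
                .setAppName("TEST")
                // Disable the web UI so Spark never loads its
                // Jersey 1.x servlet classes during tests.
                .set("spark.ui.enabled", "false");
        return new JavaSparkContext(conf);
    }
}
```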


-- 
Marcelo



Re: compatibility issue with Jersey2

Posted by Gary Ogden <go...@gmail.com>.
In our separate environments we run it with spark-submit, so I can give
that a try.

But we run unit tests differently in our build environment, and that is
what's throwing the error. It's set up like this:

        helper = new CassandraHelper(settings.getCassandra().get());
        SparkConf sparkConf = getCassSparkConf(helper);
        sparkConf.setMaster("local[*]");
        sparkConf.setAppName("TEST");
        sparkConf.set("spark.driver.allowMultipleContexts", "true");

        sc = new JavaSparkContext(sparkConf);

I suspect this is what you were referring to when you said I have a problem?



On 6 October 2015 at 15:40, Marcelo Vanzin <va...@cloudera.com> wrote:

> ...

Re: compatibility issue with Jersey2

Posted by Marcelo Vanzin <va...@cloudera.com>.
On Tue, Oct 6, 2015 at 5:57 AM, oggie <go...@gmail.com> wrote:
> We have a Java app written with spark 1.3.1. That app also uses Jersey 2.9
> client to make external calls.  We see spark 1.4.1 uses Jersey 1.9.

How is this app deployed? If it's run via spark-submit, you could use
"spark.{driver,executor}.userClassPathFirst" to make your app use
jersey 2.9 while letting Spark use the older jersey.

If you're somehow embedding Spark and running everything in the same
classloader, then you have a problem.
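
As a sketch, the submit command would carry the two properties like this (the jar and class names are placeholders):

```shell
# Hypothetical spark-submit invocation; jar/class names are placeholders.
# userClassPathFirst makes the app's Jersey 2.9 win over Spark's Jersey 1.9
# in both the driver and executor classloaders.
spark-submit \
  --class com.example.MyApp \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  my-app-assembly.jar
```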

-- 
Marcelo
