Posted to issues@spark.apache.org by "Dongjoon Hyun (JIRA)" <ji...@apache.org> on 2019/08/16 22:09:00 UTC

[jira] [Resolved] (SPARK-28737) Update jersey to 2.27+ (2.29)

     [ https://issues.apache.org/jira/browse/SPARK-28737?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun resolved SPARK-28737.
-----------------------------------
       Resolution: Fixed
    Fix Version/s: 3.0.0

Issue resolved by pull request 25455
[https://github.com/apache/spark/pull/25455]

> Update jersey to 2.27+ (2.29)
> -----------------------------
>
>                 Key: SPARK-28737
>                 URL: https://issues.apache.org/jira/browse/SPARK-28737
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Spark Core
>    Affects Versions: 3.0.0
>            Reporter: Sean Owen
>            Assignee: Sean Owen
>            Priority: Major
>             Fix For: 3.0.0
>
>
> Looks like we might need to update Jersey after all, from recent JDK 11 testing: 
> {code}
> Caused by: java.lang.IllegalArgumentException
> 	at jersey.repackaged.org.objectweb.asm.ClassReader.<init>(ClassReader.java:170)
> 	at jersey.repackaged.org.objectweb.asm.ClassReader.<init>(ClassReader.java:153)
> 	at jersey.repackaged.org.objectweb.asm.ClassReader.<init>(ClassReader.java:424)
> 	at org.glassfish.jersey.server.internal.scanning.AnnotationAcceptingListener.process(AnnotationAcceptingListener.java:170)
> {code}
> It looks like 2.27+ may solve the issue, so worth trying 2.29. 
> I'm not 100% sure this is a real issue, as the JDK 11 testing process is still changing, but I'll work on it to see how viable the update is; it may be worthwhile for 3.0 in any event.
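The stack trace above comes from Jersey's repackaged copy of ASM: ClassReader's constructor checks the class file's major version and throws a bare IllegalArgumentException when the version is newer than the ASM release supports (JDK 11 emits major version 55, which pre-2.27 Jersey's bundled ASM does not accept). As an illustrative sketch (not part of the issue or the fix), the version check ASM performs boils down to reading the first bytes of the class file; the class name and method below are hypothetical:

```java
import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;

public class ClassVersionCheck {

    // Read the class-file header the same way ASM's ClassReader does before
    // deciding whether it can parse the file: 4-byte magic, then the
    // minor and major version shorts.
    static int majorVersion(Class<?> cls) throws IOException {
        String resource = "/" + cls.getName().replace('.', '/') + ".class";
        try (InputStream in = cls.getResourceAsStream(resource);
             DataInputStream data = new DataInputStream(in)) {
            data.readInt();                  // magic: 0xCAFEBABE
            data.readUnsignedShort();        // minor version
            return data.readUnsignedShort(); // major version: 55 on JDK 11
        }
    }

    public static void main(String[] args) throws IOException {
        int major = majorVersion(Object.class);
        System.out.println("class file major version = " + major);
        // An ASM build that predates a given JDK caps the major version it
        // accepts; anything above the cap fails in ClassReader's constructor
        // with IllegalArgumentException, matching the stack trace above.
    }
}
```

Upgrading Jersey to 2.27+ pulls in an ASM that accepts version-55 class files, which is why the newer release is expected to resolve the scanning failure.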



--
This message was sent by Atlassian JIRA
(v7.6.14#76016)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org