Posted to dev@spark.apache.org by Gil Vernik <GI...@il.ibm.com> on 2015/01/18 07:26:38 UTC

run time exceptions in Spark 1.2.0 manual build together with OpenStack hadoop driver

Hi,

I took the source code of Spark 1.2.0 and tried to build it together with 
hadoop-openstack.jar (to allow Spark access to OpenStack Swift).
I used Hadoop 2.6.0.

The build completed without problems; however, at run time, while trying to 
access the "swift://" namespace, I got an exception:
java.lang.NoClassDefFoundError: org/codehaus/jackson/annotate/JsonClass
        at org.codehaus.jackson.map.introspect.JacksonAnnotationIntrospector.findDeserializationType(JacksonAnnotationIntrospector.java:524)
        at org.codehaus.jackson.map.deser.BasicDeserializerFactory.modifyTypeByAnnotation(BasicDeserializerFactory.java:732)
        ...and the long stack trace goes here

Digging into the problem, I saw the following:
Jackson 1.9.x is not backward compatible; in particular, the JsonClass 
annotation was removed.
Hadoop 2.6.0 uses jackson-asl version 1.9.13, while Spark references an 
older version of Jackson.
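
As a hedged illustration of the incompatibility (not from the thread itself), the missing class can be probed for at run time to see which Jackson 1.x line is on the classpath. The `JacksonProbe` class name is mine, for illustration only:

```java
// Illustrative probe, not part of Spark or Hadoop: reports whether the
// JsonClass annotation (present in Jackson 1.8.x, removed in 1.9.x) can
// be loaded from the current classpath.
public class JacksonProbe {
    static String probe() {
        try {
            Class.forName("org.codehaus.jackson.annotate.JsonClass");
            return "JsonClass present (Jackson 1.8.x or earlier on classpath)";
        } catch (ClassNotFoundException e) {
            return "JsonClass not found (Jackson 1.9.x, or no Jackson at all)";
        }
    }

    public static void main(String[] args) {
        System.out.println(probe());
    }
}
```

In the mixed setup described above, the 1.8.8 mapper code still references the annotation while a 1.9.13 jackson-core-asl no longer provides it, which is exactly what the NoClassDefFoundError reflects.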

This is the main pom.xml of Spark 1.2.0:

      <dependency>
        <!-- Matches the version of jackson-core-asl pulled in by avro -->
        <groupId>org.codehaus.jackson</groupId>
        <artifactId>jackson-mapper-asl</artifactId>
        <version>1.8.8</version>
      </dependency>

It references version 1.8.8, which is not compatible with Hadoop 2.6.0.
If we change the version to 1.9.13, then everything works fine and there are 
no run-time exceptions while accessing Swift. The following change solves 
the problem:

      <dependency>
        <!-- Matches the version of jackson-core-asl pulled in by avro -->
        <groupId>org.codehaus.jackson</groupId>
        <artifactId>jackson-mapper-asl</artifactId>
        <version>1.9.13</version>
      </dependency>
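
For downstream applications that depend on Spark but cannot rebuild it, the same pin could be applied from the application's own pom.xml via dependencyManagement. This is a sketch under the assumption of a typical Maven setup, not something discussed in the thread; pinning both artifacts avoids a mixed mapper/core classpath:

```xml
<!-- Sketch: force a single Jackson 1.x version across all transitive
     dependencies from a downstream application's pom.xml. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.codehaus.jackson</groupId>
      <artifactId>jackson-mapper-asl</artifactId>
      <version>1.9.13</version>
    </dependency>
    <dependency>
      <groupId>org.codehaus.jackson</groupId>
      <artifactId>jackson-core-asl</artifactId>
      <version>1.9.13</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```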

I am trying to resolve this somehow so that people will not run into this 
issue.
Is there any particular need in Spark for Jackson 1.8.8 rather than 1.9.13?
Can we remove 1.8.8 and use 1.9.13 for Avro?
It looks to me like everything works fine when Spark is built with Jackson 
1.9.13, but I am not an expert and am not sure what should be tested.

Thanks,
Gil Vernik.

Re: run time exceptions in Spark 1.2.0 manual build together with OpenStack hadoop driver

Posted by Sean Owen <so...@cloudera.com>.
Old releases can't be changed, but new ones can. This was merged into
the 1.3 branch for the upcoming 1.3.0 release.

If you really had to, you could do some surgery on existing
distributions to swap in/out Jackson.
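
The "surgery" mentioned above could, in the simplest case, look like the sketch below. All paths and file names are hypothetical, and real Spark binary distributions bundle Jackson inside the assembly jar, so an actual swap may require rebuilding the assembly; the mock directory here only demonstrates the mechanics:

```shell
# Sketch only: simulate a distribution's lib directory with placeholder
# files. With a real distribution you would operate on its actual lib/
# directory and real Jackson jars.
SPARK_LIB=$(mktemp -d)
touch "$SPARK_LIB/jackson-mapper-asl-1.8.8.jar"    # stand-in for the bundled jar

# Swap out the old Jackson jar and drop in the newer one.
rm "$SPARK_LIB/jackson-mapper-asl-1.8.8.jar"
touch "$SPARK_LIB/jackson-mapper-asl-1.9.13.jar"   # stand-in for a real 1.9.13 jar

ls "$SPARK_LIB"    # now lists only the 1.9.13 stand-in
```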

On Mon, Feb 9, 2015 at 11:22 AM, Gil Vernik <GI...@il.ibm.com> wrote:

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
For additional commands, e-mail: dev-help@spark.apache.org


Re: run time exceptions in Spark 1.2.0 manual build together with OpenStack hadoop driver

Posted by Gil Vernik <GI...@il.ibm.com>.
Hi All,

I understand that https://github.com/apache/spark/pull/3938 was closed and 
merged into Spark, and that it is supposed to fix this Jackson issue.
If so, is there any way to update the binary distributions of Spark so that 
they contain this fix? The current binary versions of Spark available for 
download were built with Jackson 1.8.8, which makes them impossible to use 
with Hadoop 2.6.0 jars.

Thanks
Gil Vernik.







Re: run time exceptions in Spark 1.2.0 manual build together with OpenStack hadoop driver

Posted by Sean Owen <so...@cloudera.com>.
Agree, I think this can / should be fixed with a slightly more
conservative version of https://github.com/apache/spark/pull/3938
related to SPARK-5108.



Re: run time exceptions in Spark 1.2.0 manual build together with OpenStack hadoop driver

Posted by Ted Yu <yu...@gmail.com>.
Please take a look at SPARK-4048 and SPARK-5108

Cheers

On Sat, Jan 17, 2015 at 10:26 PM, Gil Vernik <GI...@il.ibm.com> wrote:
