Posted to user@avro.apache.org by Deepak Nettem <de...@gmail.com> on 2012/03/19 23:48:46 UTC

Java MapReduce Avro Jackson Error

When I include some Avro code in my Mapper, I get this error:

Error:
org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;

Particularly, just these two lines of code:

            InputStream in = getClass().getResourceAsStream("schema.avsc");
            Schema schema = Schema.parse(in);

This code works perfectly when run as a standalone application outside of
Hadoop. Why do I get this error, and what's the best way to get rid of it?

I am using Hadoop 0.20.2, and writing code in the new API.

Re: Java MapReduce Avro Jackson Error

Posted by Deepak Nettem <de...@gmail.com>.
Hi Scott,

I was still getting the error, and the issue is unresolved with Apache
Hadoop.

But I have now moved to CDH3 which doesn't have this problem. Hope that
helps somebody stuck with the same issue.

Best,
Deepak

On Mon, Mar 26, 2012 at 11:25 PM, Scott Carey <sc...@apache.org> wrote:

> What happens if you remove avro-tools entirely?
>
> What in there are you using?  It is all command-line tools; if there is
> anything in there you need, you can use the equivalent Java API instead.
>  If there is something in there of use that is not a command line tool, we
> should make it available in the other modules.
>
> What is the stack trace in full, not just the top error?  What code (Avro
> or Hadoop) is triggering it?
>
> "Error:
> org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;"
>
>
> is not enough info if the basic things are not working.
>
> On 3/26/12 7:58 PM, "Deepak Nettem" <de...@gmail.com> wrote:
>
> Hi Folks,
>
> This issue is still not resolved :( Any other ideas?
>
> Best,
> Deepak
>
> On Mon, Mar 19, 2012 at 11:25 PM, Scott Carey <sc...@apache.org> wrote:
>
>> If you are using avro-tools, beware it is a shaded jar with all
>> dependencies inside of it for use as a command line tool (java -jar
>> avro-tools-VERSION.jar).
>>
>> If you are using avro-tools in your project for some reason (there is
>> really only command line utilities in it) use the nodeps classifier:
>>
>> <classifier>nodeps</classifier>
>>
>> http://repo1.maven.org/maven2/org/apache/avro/avro-tools/1.6.3/
>>
>> Note the nodeps jar is 47K, while the default jar is 10MB.
>>
>>
>> For what it is worth, I removed the Jackson jar from our hadoop install
>> long ago.  It is used to dump configuration files to JSON there, a
>> peripheral feature we don't use.
>>
>> Another thing that you may want to do is change your Hadoop dependency
>> scope to
>> <scope>provided</scope> since hadoop will be put on your classpath by the
>> hadoop environment.   Short of this, excluding the chained Hadoop
>> dependencies you aren't using (most likely: jetty,  kfs, and the
>> tomcat:jasper and eclipse:jdt stuff) may help.
>>
>> On 3/19/12 6:23 PM, "Deepak Nettem" <de...@gmail.com> wrote:
>>
>> Hi Tatu / Scott,
>>
>> Thanks for your replies. I replaced the earlier dependencies with these:
>>
>>    <dependency>
>>     <groupId>org.apache.avro</groupId>
>>     <artifactId>avro-tools</artifactId>
>>     <version>1.6.3</version>
>>     </dependency>
>>
>>     <dependency>
>>     <groupId>org.apache.avro</groupId>
>>     <artifactId>avro</artifactId>
>>     <version>1.6.3</version>
>>     </dependency>
>>
>>     <dependency>
>>     <groupId>org.codehaus.jackson</groupId>
>>       <artifactId>jackson-mapper-asl</artifactId>
>>       <version>1.8.8</version>
>>       <scope>compile</scope>
>>     </dependency>
>>
>>     <dependency>
>>     <groupId>org.codehaus.jackson</groupId>
>>       <artifactId>jackson-core-asl</artifactId>
>>       <version>1.8.8</version>
>>       <scope>compile</scope>
>>     </dependency>
>>
>> And this is my app dependency tree:
>>
>> [INFO] --- maven-dependency-plugin:2.1:tree (default-cli) @ AvroTest ---
>> [INFO] org.avrotest:AvroTest:jar:1.0-SNAPSHOT
>> [INFO] +- junit:junit:jar:3.8.1:test (scope not updated to compile)
>> [INFO] +- org.codehaus.jackson:jackson-mapper-asl:jar:1.8.8:compile
>> [INFO] +- org.codehaus.jackson:jackson-core-asl:jar:1.8.8:compile
>> [INFO] +- net.sf.json-lib:json-lib:jar:jdk15:2.3:compile
>> [INFO] |  +- commons-beanutils:commons-beanutils:jar:1.8.0:compile
>> [INFO] |  +- commons-collections:commons-collections:jar:3.2.1:compile
>> [INFO] |  +- commons-lang:commons-lang:jar:2.4:compile
>> [INFO] |  +- commons-logging:commons-logging:jar:1.1.1:compile
>> [INFO] |  \- net.sf.ezmorph:ezmorph:jar:1.0.6:compile
>> [INFO] +- org.apache.avro:avro-tools:jar:1.6.3:compile
>> [INFO] |  \- org.slf4j:slf4j-api:jar:1.6.4:compile
>> [INFO] +- org.apache.avro:avro:jar:1.6.3:compile
>> [INFO] |  +- com.thoughtworks.paranamer:paranamer:jar:2.3:compile
>> [INFO] |  \- org.xerial.snappy:snappy-java:jar:1.0.4.1:compile
>> [INFO] \- org.apache.hadoop:hadoop-core:jar:0.20.2:compile
>> [INFO]    +- commons-cli:commons-cli:jar:1.2:compile
>> [INFO]    +- xmlenc:xmlenc:jar:0.52:compile
>> [INFO]    +- commons-httpclient:commons-httpclient:jar:3.0.1:compile
>> [INFO]    +- commons-codec:commons-codec:jar:1.3:compile
>> [INFO]    +- commons-net:commons-net:jar:1.4.1:compile
>> [INFO]    +- org.mortbay.jetty:jetty:jar:6.1.14:compile
>> [INFO]    +- org.mortbay.jetty:jetty-util:jar:6.1.14:compile
>> [INFO]    +- tomcat:jasper-runtime:jar:5.5.12:compile
>> [INFO]    +- tomcat:jasper-compiler:jar:5.5.12:compile
>> [INFO]    +- org.mortbay.jetty:jsp-api-2.1:jar:6.1.14:compile
>> [INFO]    +- org.mortbay.jetty:jsp-2.1:jar:6.1.14:compile
>> [INFO]    |  \- ant:ant:jar:1.6.5:compile
>> [INFO]    +- commons-el:commons-el:jar:1.0:compile
>> [INFO]    +- net.java.dev.jets3t:jets3t:jar:0.7.1:compile
>> [INFO]    +- org.mortbay.jetty:servlet-api-2.5:jar:6.1.14:compile
>> [INFO]    +- net.sf.kosmosfs:kfs:jar:0.3:compile
>> [INFO]    +- hsqldb:hsqldb:jar:1.8.0.10:compile
>> [INFO]    +- oro:oro:jar:2.0.8:compile
>> [INFO]    \- org.eclipse.jdt:core:jar:3.1.1:compile
>>
>> I still get the same error. Is there anything specific I need to do other
>> than changing dependencies in pom.xml to make this error go away?
>>
>> On Mon, Mar 19, 2012 at 9:12 PM, Tatu Saloranta <ts...@gmail.com> wrote:
>>
>>> On Mon, Mar 19, 2012 at 6:06 PM, Scott Carey <sc...@apache.org>
>>> wrote:
>>> > What version of Avro are you using?
>>> >
>>> > You may want to try Avro 1.6.3 + Jackson 1.8.8.
>>> >
>>> > This is related, but is not your exact problem.
>>> > https://issues.apache.org/jira/browse/AVRO-1037
>>> >
>>> > You are likely pulling in some other version of jackson somehow.  You
>>> may
>>> > want to use 'mvn dependency:tree' on your project to see where all the
>>> > dependencies are coming from.  That may help identify the culprit.
>>>
>>> This sounds like a good idea, and I agree that this is probably
>>> still due to an old version lurking somewhere.
>>>
>>> -+ Tatu +-
>>>
>>
>>
>>
>
>

Re: BigInt / longlong

Posted by Tatu Saloranta <ts...@gmail.com>.
On Thu, Mar 29, 2012 at 12:20 AM, Meyer, Dennis <de...@adtech.com> wrote:
> Hi,
>
> That's not the best idea, as string encoding eats up a lot of space
> (e.g. 1 byte per ASCII character, 2-3 bytes for UTF-8). Especially since
> Avro uses the MSB for compressing smaller ints, this does not seem very
> efficient for mass data.
>
> I'll see if 64-bit unsigned -> 64-bit signed conversion or using the
> mantissa of a double works better for us.

Wouldn't going via double be every bit as bad an idea as using Strings
(i.e. neither makes much sense to me)? Double operations are rather
costly, and you still lose many more bits of magnitude. So why would you
consider conversions to/from doubles?

But as Scott pointed out, most platforms use simple two's complement, so
that simple cast should just work (i.e. it's all just a matter of
interpretation), as does basic arithmetic (add, subtract,
non-sign-extending shifts). So as long as you treat the values as
unsigned ints, the code should work.

-+ Tatu +-
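Tatu's point above (signed two's-complement arithmetic produces the same bit patterns as unsigned arithmetic) can be sketched in Java. This is a minimal illustrative demo, not code from the thread; `BigInteger` is used only to display the unsigned interpretation of the bits.

```java
import java.math.BigInteger;

public class UnsignedLongDemo {
    // Reinterpret a signed long's raw bits as an unsigned 64-bit value.
    static BigInteger asUnsigned(long bits) {
        BigInteger v = BigInteger.valueOf(bits);
        return bits < 0 ? v.add(BigInteger.ONE.shiftLeft(64)) : v;
    }

    public static void main(String[] args) {
        long a = -2L;          // raw bits of the unsigned value 2^64 - 2
        long b = 5L;
        long sum = a + b;      // wraps in two's complement: bits of 3
        long prod = a * b;     // bits of (2^64 - 2) * 5 mod 2^64

        System.out.println(asUnsigned(a));    // 18446744073709551614
        System.out.println(asUnsigned(sum));  // 3
        System.out.println(asUnsigned(prod)); // 18446744073709551606
    }
}
```

Note that division and ordered comparison do differ between the signed and unsigned interpretations, so those operations still need special handling.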

Re: BigInt / longlong

Posted by "Meyer, Dennis" <de...@adtech.com>.
Hi,

That's not the best idea, as string encoding eats up a lot of space
(e.g. 1 byte per ASCII character, 2-3 bytes for UTF-8). Especially since
Avro uses the MSB for compressing smaller ints, this does not seem very
efficient for mass data.

I'll see if 64-bit unsigned -> 64-bit signed conversion or using the
mantissa of a double works better for us.

Thanks,
Dennis




On 3/29/12 1:38 AM, "Miki Tebeka" <mi...@gmail.com> wrote:

>I would encode to string. Should be simple enough, just means you need
>a pass on the data after reading it.
>
>On Wed, Mar 28, 2012 at 11:43 AM, Scott Carey <sc...@apache.org>
>wrote:
>> On 3/28/12 11:01 AM, "Meyer, Dennis" <de...@adtech.com> wrote:
>>
>> Hi,
>>
>> What type corresponds to a Java BigInt or C long long? Or is there any
>> other type in Avro that maps to a 64-bit unsigned int?
>>
>> I unfortunately could only find smaller types in the docs:
>>
>> Primitive Types
>>
>> The set of primitive type names is:
>>
>> string: unicode character sequence
>> bytes: sequence of 8-bit bytes
>> int: 32-bit signed integer
>> long: 64-bit signed integer
>> float: single precision (32-bit) IEEE 754 floating-point number
>> double: double precision (64-bit) IEEE 754 floating-point number
>> boolean: a binary value
>> null: no value
>>
>>
>> Anyway, in the encoding section there's some 64-bit unsigned. Can I use
>> it somehow via a type?
>>
>>
>> An unsigned value fits in a signed one.  They are both 64 bits.  Each
>> language that supports a long unsigned type has its own way to convert
>> from one to the other without loss of data.
>>
>> A workaround might be to use the 52 significant bits of a double, but
>> that seems like a hack and of course loses some more number space
>> compared to uint64. I'd like to avoid any other self-encoding hacks, as
>> I'd also like to use Hadoop/Pig/Hive on top of Avro, so I would like to
>> keep numeric functionality if possible.
>>
>>
>> Java does not have an unsigned 64-bit type.  Hadoop/Pig/Hive all have
>> only signed 64-bit integer quantities.
>>
>> Luckily, multiplication and addition on two's complement signed values
>> are identical to the operations on unsigned ints, so for many operations
>> there is no loss in fidelity as long as you pass the raw bits on to
>> something that interprets the number as an unsigned quantity.
>>
>> That is, if you take the raw bits of a set of unsigned 64-bit numbers,
>> treat those bits as if they were signed 64-bit quantities, then do
>> addition, subtraction, and multiplication on them, and finally treat
>> the raw bit result as an unsigned 64-bit value, it is as if you did the
>> whole thing unsigned.
>>
>> http://en.wikipedia.org/wiki/Two%27s_complement
>>
>> Avro has only signed 32- and 64-bit integer quantities because they can
>> be mapped to unsigned ones in most cases without a problem, and many
>> (actually, most) languages do not support unsigned integers.
>>
>> If you want other precision quantities, you can use an Avro Fixed type
>> to map to any type you choose.  For example, you can use a 16-byte
>> fixed to map to 128-bit unsigned ints.
>>
>>
>> Thanks,
>> Dennis


Re: BigInt / longlong

Posted by Miki Tebeka <mi...@gmail.com>.
I would encode to string. Should be simple enough, just means you need
a pass on the data after reading it.

On Wed, Mar 28, 2012 at 11:43 AM, Scott Carey <sc...@apache.org> wrote:
> On 3/28/12 11:01 AM, "Meyer, Dennis" <de...@adtech.com> wrote:
>
> Hi,
>
> What type corresponds to a Java BigInt or C long long? Or is there any
> other type in Avro that maps to a 64-bit unsigned int?
>
> I unfortunately could only find smaller types in the docs:
>
> Primitive Types
>
> The set of primitive type names is:
>
> string: unicode character sequence
> bytes: sequence of 8-bit bytes
> int: 32-bit signed integer
> long: 64-bit signed integer
> float: single precision (32-bit) IEEE 754 floating-point number
> double: double precision (64-bit) IEEE 754 floating-point number
> boolean: a binary value
> null: no value
>
>
> Anyway, in the encoding section there's some 64-bit unsigned. Can I use
> it somehow via a type?
>
>
> An unsigned value fits in a signed one.  They are both 64 bits.  Each
> language that supports a long unsigned type has its own way to convert from
> one to the other without loss of data.
>
> A workaround might be to use the 52 significant bits of a double, but that
> seems like a hack and of course loses some more number space compared to
> uint64. I'd like to avoid any other self-encoding hacks, as I'd also like to
> use Hadoop/Pig/Hive on top of Avro, so I would like to keep numeric
> functionality if possible.
>
>
> Java does not have an unsigned 64 bit type.  Hadoop/Pig/Hive all only have
> signed 64 bit integer quantities.
>
> Luckily, multiplication and addition on two's complement signed values are
> identical to the operations on unsigned ints, so for many operations there
> is no loss in fidelity as long as you pass the raw bits on to something that
> interprets the number as an unsigned quantity.
>
> That is, if you take the raw bits of a set of unsigned 64 bit numbers, and
> treat those bits as if they were signed 64-bit quantities, then do
> addition, subtraction, and multiplication on them, then treat the raw bit
> result as an unsigned 64 bit value, it is as if you did the whole thing
> unsigned.
>
> http://en.wikipedia.org/wiki/Two%27s_complement
>
> Avro only has signed 32 and 64 bit integer quantities because they can be
> mapped to unsigned ones in most cases without a problem and many (actually,
> most) languages do not support unsigned integers.
>
> If you want various precision quantities you can use an Avro Fixed type to
> map to any type you choose.  For example you can use a 16 byte fixed to map
> to 128 bit unsigned ints.
>
>
> Thanks,
> Dennis

Re: BigInt / longlong

Posted by Scott Carey <sc...@apache.org>.
On 3/28/12 11:01 AM, "Meyer, Dennis" <de...@adtech.com> wrote:

> Hi,
> 
> What type corresponds to a Java BigInt or C long long? Or is there any other
> type in Avro that maps to a 64-bit unsigned int?
> 
> I unfortunately could only find smaller types in the docs:
> Primitive Types
> The set of primitive type names is:
> * string: unicode character sequence
> * bytes: sequence of 8-bit bytes
> * int: 32-bit signed integer
> * long: 64-bit signed integer
> * float: single precision (32-bit) IEEE 754 floating-point number
> * double: double precision (64-bit) IEEE 754 floating-point number
> * boolean: a binary value
> * null: no value
> 
> Anyway, in the encoding section there's some 64-bit unsigned. Can I use
> it somehow via a type?

An unsigned value fits in a signed one.  They are both 64 bits.  Each
language that supports a long unsigned type has its own way to convert from
one to the other without loss of data.

> A workaround might be to use the 52 significant bits of a double, but that
> seems like a hack and of course loses some more number space compared to
> uint64. I'd like to avoid any other self-encoding hacks, as I'd also like to
> use Hadoop/PIG/HIVE on top of Avro, so I would like to keep functionality on
> numbers if possible.

Java does not have an unsigned 64 bit type.  Hadoop/Pig/Hive all only have
signed 64 bit integer quantities.

Luckily, multiplication and addition on two's complement signed values are
identical to the operations on unsigned ints, so for many operations there
is no loss in fidelity as long as you pass the raw bits on to something that
interprets the number as an unsigned quantity.

That is, if you take the raw bits of a set of unsigned 64 bit numbers, and
treat those bits as if they were signed 64-bit quantities, then do
addition, subtraction, and multiplication on them, then treat the raw bit
result as an unsigned 64 bit value, it is as if you did the whole thing
unsigned.

http://en.wikipedia.org/wiki/Two%27s_complement

Avro only has signed 32 and 64 bit integer quantities because they can be
mapped to unsigned ones in most cases without a problem and many (actually,
most) languages do not support unsigned integers.

If you want various precision quantities you can use an Avro Fixed type to
map to any type you choose.  For example you can use a 16 byte fixed to map
to 128 bit unsigned ints.
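Scott's Fixed-type suggestion could look like the following Avro schema fragment; the name "uint128" is illustrative, not from the thread:

```json
{"type": "fixed", "name": "uint128", "size": 16}
```

A field of this type carries the raw 16 bytes; interpreting them as an unsigned 128-bit integer is left to the application on both sides.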

> 
> Thanks,
> Dennis



BigInt / longlong

Posted by "Meyer, Dennis" <de...@adtech.com>.
Hi,

What type corresponds to a Java BigInt or C long long? Or is there any other type in Avro that maps to a 64-bit unsigned int?

I unfortunately could only find smaller types in the docs:
Primitive Types

The set of primitive type names is:

  *   string: unicode character sequence
  *   bytes: sequence of 8-bit bytes
  *   int: 32-bit signed integer
  *   long: 64-bit signed integer
  *   float: single precision (32-bit) IEEE 754 floating-point number
  *   double: double precision (64-bit) IEEE 754 floating-point number
  *   boolean: a binary value
  *   null: no value

Anyway, in the encoding section there's some 64-bit unsigned. Can I use it somehow via a type?
A workaround might be to use the 52 significant bits of a double, but that seems like a hack and of course loses some more number space compared to uint64. I'd like to avoid any other self-encoding hacks, as I'd also like to use Hadoop/PIG/HIVE on top of Avro, so I would like to keep functionality on numbers if possible.

Thanks,
Dennis

Re: Java MapReduce Avro Jackson Error

Posted by Scott Carey <sc...@apache.org>.
What happens if you remove avro-tools entirely?

What in there are you using?  It is all command-line tools; if there is
anything in there you need, you can use the equivalent Java API instead.  If
there is something in there of use that is not a command line tool, we
should make it available in the other modules.

What is the stack trace in full, not just the top error?  What code (Avro or
Hadoop) is triggering it?

"Error:
org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;"

is not enough info if the basic things are not working.

On 3/26/12 7:58 PM, "Deepak Nettem" <de...@gmail.com> wrote:

> Hi Folks,
> 
> This issue is still not resolved :( Any other ideas?
> 
> Best,
> Deepak
> 
> On Mon, Mar 19, 2012 at 11:25 PM, Scott Carey <sc...@apache.org> wrote:
>> If you are using avro-tools, beware it is a shaded jar with all dependencies
>> inside of it for use as a command line tool (java -jar
>> avro-tools-VERSION.jar).
>> 
>> If you are using avro-tools in your project for some reason (there is really
>> only command line utilities in it) use the nodeps classifier:
>> 
>> <classifier>nodeps</classifier>
>> 
>> http://repo1.maven.org/maven2/org/apache/avro/avro-tools/1.6.3/
>> 
>> Note the nodeps jar is 47K, while the default jar is 10MB.
>> 
>> 
>> For what it is worth, I removed the Jackson jar from our hadoop install long
>> ago.  It is used to dump configuration files to JSON there, a peripheral
>> feature we don't use.
>> 
>> Another thing that you may want to do is change your Hadoop dependency scope
>> to
>> <scope>provided</scope> since hadoop will be put on your classpath by the
>> hadoop environment.   Short of this, excluding the chained Hadoop
>> dependencies you aren't using (most likely: jetty,  kfs, and the
>> tomcat:jasper and eclipse:jdt stuff) may help.
>> 
>> On 3/19/12 6:23 PM, "Deepak Nettem" <de...@gmail.com> wrote:
>> 
>>> Hi Tatu / Scott,
>>> 
>>> Thanks for your replies. I replaced the earlier dependencies with these:
>>> 
>>>    <dependency>
>>>     <groupId>org.apache.avro</groupId>
>>>     <artifactId>avro-tools</artifactId>
>>>     <version>1.6.3</version>
>>>     </dependency>
>>>     
>>>     <dependency>
>>>     <groupId>org.apache.avro</groupId>
>>>     <artifactId>avro</artifactId>
>>>     <version>1.6.3</version>
>>>     </dependency>
>>> 
>>>     <dependency>
>>>     <groupId>org.codehaus.jackson</groupId>
>>>       <artifactId>jackson-mapper-asl</artifactId>
>>>       <version>1.8.8</version>
>>>       <scope>compile</scope>
>>>     </dependency>
>>> 
>>>     <dependency>
>>>     <groupId>org.codehaus.jackson</groupId>
>>>       <artifactId>jackson-core-asl</artifactId>
>>>       <version>1.8.8</version>
>>>       <scope>compile</scope>
>>>     </dependency>
>>> 
>>> And this is my app dependency tree:
>>> 
>>> [INFO] --- maven-dependency-plugin:2.1:tree (default-cli) @ AvroTest ---
>>> [INFO] org.avrotest:AvroTest:jar:1.0-SNAPSHOT
>>> [INFO] +- junit:junit:jar:3.8.1:test (scope not updated to compile)
>>> [INFO] +- org.codehaus.jackson:jackson-mapper-asl:jar:1.8.8:compile
>>> [INFO] +- org.codehaus.jackson:jackson-core-asl:jar:1.8.8:compile
>>> [INFO] +- net.sf.json-lib:json-lib:jar:jdk15:2.3:compile
>>> [INFO] |  +- commons-beanutils:commons-beanutils:jar:1.8.0:compile
>>> [INFO] |  +- commons-collections:commons-collections:jar:3.2.1:compile
>>> [INFO] |  +- commons-lang:commons-lang:jar:2.4:compile
>>> [INFO] |  +- commons-logging:commons-logging:jar:1.1.1:compile
>>> [INFO] |  \- net.sf.ezmorph:ezmorph:jar:1.0.6:compile
>>> [INFO] +- org.apache.avro:avro-tools:jar:1.6.3:compile
>>> [INFO] |  \- org.slf4j:slf4j-api:jar:1.6.4:compile
>>> [INFO] +- org.apache.avro:avro:jar:1.6.3:compile
>>> [INFO] |  +- com.thoughtworks.paranamer:paranamer:jar:2.3:compile
>>> [INFO] |  \- org.xerial.snappy:snappy-java:jar:1.0.4.1:compile
>>> [INFO] \- org.apache.hadoop:hadoop-core:jar:0.20.2:compile
>>> [INFO]    +- commons-cli:commons-cli:jar:1.2:compile
>>> [INFO]    +- xmlenc:xmlenc:jar:0.52:compile
>>> [INFO]    +- commons-httpclient:commons-httpclient:jar:3.0.1:compile
>>> [INFO]    +- commons-codec:commons-codec:jar:1.3:compile
>>> [INFO]    +- commons-net:commons-net:jar:1.4.1:compile
>>> [INFO]    +- org.mortbay.jetty:jetty:jar:6.1.14:compile
>>> [INFO]    +- org.mortbay.jetty:jetty-util:jar:6.1.14:compile
>>> [INFO]    +- tomcat:jasper-runtime:jar:5.5.12:compile
>>> [INFO]    +- tomcat:jasper-compiler:jar:5.5.12:compile
>>> [INFO]    +- org.mortbay.jetty:jsp-api-2.1:jar:6.1.14:compile
>>> [INFO]    +- org.mortbay.jetty:jsp-2.1:jar:6.1.14:compile
>>> [INFO]    |  \- ant:ant:jar:1.6.5:compile
>>> [INFO]    +- commons-el:commons-el:jar:1.0:compile
>>> [INFO]    +- net.java.dev.jets3t:jets3t:jar:0.7.1:compile
>>> [INFO]    +- org.mortbay.jetty:servlet-api-2.5:jar:6.1.14:compile
>>> [INFO]    +- net.sf.kosmosfs:kfs:jar:0.3:compile
>>> [INFO]    +- hsqldb:hsqldb:jar:1.8.0.10:compile
>>> [INFO]    +- oro:oro:jar:2.0.8:compile
>>> [INFO]    \- org.eclipse.jdt:core:jar:3.1.1:compile
>>> 
>>> I still get the same error. Is there anything specific I need to do other
>>> than changing dependencies in pom.xml to make this error go away?
>>> 
>>> On Mon, Mar 19, 2012 at 9:12 PM, Tatu Saloranta <ts...@gmail.com>
>>> wrote:
>>>> On Mon, Mar 19, 2012 at 6:06 PM, Scott Carey <sc...@apache.org> wrote:
>>>>> > What version of Avro are you using?
>>>>> >
>>>>> > You may want to try Avro 1.6.3 + Jackson 1.8.8.
>>>>> >
>>>>> > This is related, but is not your exact problem.
>>>>> > https://issues.apache.org/jira/browse/AVRO-1037
>>>>> >
>>>>> > You are likely pulling in some other version of jackson somehow.  You
>>>>> may
>>>>> > want to use 'mvn dependency:tree' on your project to see where all the
>>>>> > dependencies are coming from.  That may help identify the culprit.
>>>> 
>>>> This sounds like a good idea, and I agree that this is probably
>>>> still due to an old version lurking somewhere.
>>>> 
>>>> -+ Tatu +-
>>> 
>>> 
> 
> 



Re: Java MapReduce Avro Jackson Error

Posted by Deepak Nettem <de...@gmail.com>.
Hi Folks,

This issue is still not resolved :( Any other ideas?

Best,
Deepak

On Mon, Mar 19, 2012 at 11:25 PM, Scott Carey <sc...@apache.org> wrote:

> If you are using avro-tools, beware it is a shaded jar with all
> dependencies inside of it for use as a command line tool (java -jar
> avro-tools-VERSION.jar).
>
> If you are using avro-tools in your project for some reason (there is
> really only command line utilities in it) use the nodeps classifier:
>
> <classifier>nodeps</classifier>
>
> http://repo1.maven.org/maven2/org/apache/avro/avro-tools/1.6.3/
>
> Note the nodeps jar is 47K, while the default jar is 10MB.
>
>
> For what it is worth, I removed the Jackson jar from our hadoop install
> long ago.  It is used to dump configuration files to JSON there, a
> peripheral feature we don't use.
>
> Another thing that you may want to do is change your Hadoop dependency
> scope to
> <scope>provided</scope> since hadoop will be put on your classpath by the
> hadoop environment.   Short of this, excluding the chained Hadoop
> dependencies you aren't using (most likely: jetty,  kfs, and the
> tomcat:jasper and eclipse:jdt stuff) may help.
>
> On 3/19/12 6:23 PM, "Deepak Nettem" <de...@gmail.com> wrote:
>
> Hi Tatu / Scott,
>
> Thanks for your replies. I replaced the earlier dependencies with these:
>
>    <dependency>
>     <groupId>org.apache.avro</groupId>
>     <artifactId>avro-tools</artifactId>
>     <version>1.6.3</version>
>     </dependency>
>
>     <dependency>
>     <groupId>org.apache.avro</groupId>
>     <artifactId>avro</artifactId>
>     <version>1.6.3</version>
>     </dependency>
>
>     <dependency>
>     <groupId>org.codehaus.jackson</groupId>
>       <artifactId>jackson-mapper-asl</artifactId>
>       <version>1.8.8</version>
>       <scope>compile</scope>
>     </dependency>
>
>     <dependency>
>     <groupId>org.codehaus.jackson</groupId>
>       <artifactId>jackson-core-asl</artifactId>
>       <version>1.8.8</version>
>       <scope>compile</scope>
>     </dependency>
>
> And this is my app dependency tree:
>
> [INFO] --- maven-dependency-plugin:2.1:tree (default-cli) @ AvroTest ---
> [INFO] org.avrotest:AvroTest:jar:1.0-SNAPSHOT
> [INFO] +- junit:junit:jar:3.8.1:test (scope not updated to compile)
> [INFO] +- org.codehaus.jackson:jackson-mapper-asl:jar:1.8.8:compile
> [INFO] +- org.codehaus.jackson:jackson-core-asl:jar:1.8.8:compile
> [INFO] +- net.sf.json-lib:json-lib:jar:jdk15:2.3:compile
> [INFO] |  +- commons-beanutils:commons-beanutils:jar:1.8.0:compile
> [INFO] |  +- commons-collections:commons-collections:jar:3.2.1:compile
> [INFO] |  +- commons-lang:commons-lang:jar:2.4:compile
> [INFO] |  +- commons-logging:commons-logging:jar:1.1.1:compile
> [INFO] |  \- net.sf.ezmorph:ezmorph:jar:1.0.6:compile
> [INFO] +- org.apache.avro:avro-tools:jar:1.6.3:compile
> [INFO] |  \- org.slf4j:slf4j-api:jar:1.6.4:compile
> [INFO] +- org.apache.avro:avro:jar:1.6.3:compile
> [INFO] |  +- com.thoughtworks.paranamer:paranamer:jar:2.3:compile
> [INFO] |  \- org.xerial.snappy:snappy-java:jar:1.0.4.1:compile
> [INFO] \- org.apache.hadoop:hadoop-core:jar:0.20.2:compile
> [INFO]    +- commons-cli:commons-cli:jar:1.2:compile
> [INFO]    +- xmlenc:xmlenc:jar:0.52:compile
> [INFO]    +- commons-httpclient:commons-httpclient:jar:3.0.1:compile
> [INFO]    +- commons-codec:commons-codec:jar:1.3:compile
> [INFO]    +- commons-net:commons-net:jar:1.4.1:compile
> [INFO]    +- org.mortbay.jetty:jetty:jar:6.1.14:compile
> [INFO]    +- org.mortbay.jetty:jetty-util:jar:6.1.14:compile
> [INFO]    +- tomcat:jasper-runtime:jar:5.5.12:compile
> [INFO]    +- tomcat:jasper-compiler:jar:5.5.12:compile
> [INFO]    +- org.mortbay.jetty:jsp-api-2.1:jar:6.1.14:compile
> [INFO]    +- org.mortbay.jetty:jsp-2.1:jar:6.1.14:compile
> [INFO]    |  \- ant:ant:jar:1.6.5:compile
> [INFO]    +- commons-el:commons-el:jar:1.0:compile
> [INFO]    +- net.java.dev.jets3t:jets3t:jar:0.7.1:compile
> [INFO]    +- org.mortbay.jetty:servlet-api-2.5:jar:6.1.14:compile
> [INFO]    +- net.sf.kosmosfs:kfs:jar:0.3:compile
> [INFO]    +- hsqldb:hsqldb:jar:1.8.0.10:compile
> [INFO]    +- oro:oro:jar:2.0.8:compile
> [INFO]    \- org.eclipse.jdt:core:jar:3.1.1:compile
>
> I still get the same error. Is there anything specific I need to do other
> than changing dependencies in pom.xml to make this error go away?
>
> On Mon, Mar 19, 2012 at 9:12 PM, Tatu Saloranta <ts...@gmail.com> wrote:
>
>> On Mon, Mar 19, 2012 at 6:06 PM, Scott Carey <sc...@apache.org>
>> wrote:
>> > What version of Avro are you using?
>> >
>> > You may want to try Avro 1.6.3 + Jackson 1.8.8.
>> >
>> > This is related, but is not your exact problem.
>> > https://issues.apache.org/jira/browse/AVRO-1037
>> >
>> > You are likely pulling in some other version of jackson somehow.  You
>> may
>> > want to use 'mvn dependency:tree' on your project to see where all the
>> > dependencies are coming from.  That may help identify the culprit.
>>
>> This sounds like a good idea, and I agree that this is probably
>> still due to an old version lurking somewhere.
>>
>> -+ Tatu +-
>>
>
>
>

Re: Java MapReduce Avro Jackson Error

Posted by Something Something <ma...@gmail.com>.
I recall running into a similar issue a while back.  In our case, the
problem was that the classes from the jar file in /hadoop/lib were
overriding those in the shaded Jar from Maven - or something like that.  I
will look into my notes & see if I can find more info.

On Mon, Mar 19, 2012 at 6:34 PM, Tatu Saloranta <ts...@gmail.com> wrote:

> On Mon, Mar 19, 2012 at 6:23 PM, Deepak Nettem <de...@gmail.com>
> wrote:
> > Hi Tatu / Scott,
> >
> > Thanks for your replies. I replaced the earlier dependencies with these:
> ...
> > I still get the same error. Is there anything specific I need to do other
> > than changing dependencies in pom.xml to make this error go away?
>
> Sounds like some other library has been compiled against an old Jackson
> version, then.
> Perhaps one of the hadoop core jars?
>
> Whatever library it is needs to be compiled against Jackson 1.1 or newer.
>
> -+ Tatu +-
>

Re: Java MapReduce Avro Jackson Error

Posted by Tatu Saloranta <ts...@gmail.com>.
On Mon, Mar 19, 2012 at 6:23 PM, Deepak Nettem <de...@gmail.com> wrote:
> Hi Tatu / Scott,
>
> Thanks for your replies. I replaced the earlier dependencies with these:
...
> I still get the same error. Is there anything specific I need to do other
> than changing dependencies in pom.xml to make this error go away?

Sounds like some other library has been compiled against an old Jackson
version, then.
Perhaps one of the hadoop core jars?

Whatever library it is needs to be compiled against Jackson 1.1 or newer.

-+ Tatu +-
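One way to pin down which jar the stale Jackson class is actually loaded from is to ask the JVM directly. This is a generic diagnostic sketch, not code from the thread; the class name is the one from the error message.

```java
import java.security.CodeSource;

public class WhichJar {
    // Returns the jar/directory a class was loaded from, or a marker
    // for classes on the JVM's bootstrap classpath (null CodeSource).
    static String locationOf(Class<?> cls) {
        CodeSource src = cls.getProtectionDomain().getCodeSource();
        return src == null ? "(bootstrap classpath)" : src.getLocation().toString();
    }

    public static void main(String[] args) throws Exception {
        // Look the class up by name so this compiles without Jackson present.
        Class<?> cls = Class.forName("org.codehaus.jackson.JsonFactory");
        System.out.println(locationOf(cls));
    }
}
```

Running this from inside the mapper (or a small task-side main) prints the jar that wins on the task classpath, which shows whether Hadoop's lib directory or your own dependency is being picked up.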

Re: Java MapReduce Avro Jackson Error

Posted by Scott Carey <sc...@apache.org>.
If you are using avro-tools, beware it is a shaded jar with all dependencies
inside of it for use as a command line tool (java -jar
avro-tools-VERSION.jar).

If you are using avro-tools in your project for some reason (there is really
only command line utilities in it) use the nodeps classifier:

<classifier>nodeps</classifier>

http://repo1.maven.org/maven2/org/apache/avro/avro-tools/1.6.3/

Note the nodeps jar is 47K, while the default jar is 10MB.


For what it is worth, I removed the Jackson jar from our hadoop install long
ago.  It is used to dump configuration files to JSON there, a peripheral
feature we don't use.

Another thing that you may want to do is change your Hadoop dependency scope
to
<scope>provided</scope> since hadoop will be put on your classpath by the
hadoop environment.   Short of this, excluding the chained Hadoop
dependencies you aren't using (most likely: jetty,  kfs, and the
tomcat:jasper and eclipse:jdt stuff) may help.
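Scott's two suggestions above can be sketched as a POM fragment. This is illustrative only: the version matches the dependency tree posted earlier in the thread, and the exclusions shown are just the ones named above, not a definitive list.

```xml
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-core</artifactId>
      <version>0.20.2</version>
      <!-- provided: the cluster supplies Hadoop (and its classpath) at runtime -->
      <scope>provided</scope>
      <exclusions>
        <exclusion>
          <groupId>org.mortbay.jetty</groupId>
          <artifactId>jetty</artifactId>
        </exclusion>
        <exclusion>
          <groupId>net.sf.kosmosfs</groupId>
          <artifactId>kfs</artifactId>
        </exclusion>
        <exclusion>
          <groupId>tomcat</groupId>
          <artifactId>jasper-runtime</artifactId>
        </exclusion>
        <exclusion>
          <groupId>org.eclipse.jdt</groupId>
          <artifactId>core</artifactId>
        </exclusion>
      </exclusions>
    </dependency>
```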

On 3/19/12 6:23 PM, "Deepak Nettem" <de...@gmail.com> wrote:

> Hi Tatu / Scott,
> 
> Thanks for your replies. I replaced the earlier dependencies with these:
> 
>    <dependency>
>     <groupId>org.apache.avro</groupId>
>     <artifactId>avro-tools</artifactId>
>     <version>1.6.3</version>
>     </dependency>
>     
>     <dependency>
>     <groupId>org.apache.avro</groupId>
>     <artifactId>avro</artifactId>
>     <version>1.6.3</version>
>     </dependency>
> 
>     <dependency> 
>     <groupId>org.codehaus.jackson</groupId>
>       <artifactId>jackson-mapper-asl</artifactId>
>       <version>1.8.8</version>
>       <scope>compile</scope>
>     </dependency>
> 
>     <dependency> 
>     <groupId>org.codehaus.jackson</groupId>
>       <artifactId>jackson-core-asl</artifactId>
>       <version>1.8.8</version>
>       <scope>compile</scope>
>     </dependency>
> 
> And this is my app dependency tree:
> 
> [INFO] --- maven-dependency-plugin:2.1:tree (default-cli) @ AvroTest ---
> [INFO] org.avrotest:AvroTest:jar:1.0-SNAPSHOT
> [INFO] +- junit:junit:jar:3.8.1:test (scope not updated to compile)
> [INFO] +- org.codehaus.jackson:jackson-mapper-asl:jar:1.8.8:compile
> [INFO] +- org.codehaus.jackson:jackson-core-asl:jar:1.8.8:compile
> [INFO] +- net.sf.json-lib:json-lib:jar:jdk15:2.3:compile
> [INFO] |  +- commons-beanutils:commons-beanutils:jar:1.8.0:compile
> [INFO] |  +- commons-collections:commons-collections:jar:3.2.1:compile
> [INFO] |  +- commons-lang:commons-lang:jar:2.4:compile
> [INFO] |  +- commons-logging:commons-logging:jar:1.1.1:compile
> [INFO] |  \- net.sf.ezmorph:ezmorph:jar:1.0.6:compile
> [INFO] +- org.apache.avro:avro-tools:jar:1.6.3:compile
> [INFO] |  \- org.slf4j:slf4j-api:jar:1.6.4:compile
> [INFO] +- org.apache.avro:avro:jar:1.6.3:compile
> [INFO] |  +- com.thoughtworks.paranamer:paranamer:jar:2.3:compile
> [INFO] |  \- org.xerial.snappy:snappy-java:jar:1.0.4.1:compile
> [INFO] \- org.apache.hadoop:hadoop-core:jar:0.20.2:compile
> [INFO]    +- commons-cli:commons-cli:jar:1.2:compile
> [INFO]    +- xmlenc:xmlenc:jar:0.52:compile
> [INFO]    +- commons-httpclient:commons-httpclient:jar:3.0.1:compile
> [INFO]    +- commons-codec:commons-codec:jar:1.3:compile
> [INFO]    +- commons-net:commons-net:jar:1.4.1:compile
> [INFO]    +- org.mortbay.jetty:jetty:jar:6.1.14:compile
> [INFO]    +- org.mortbay.jetty:jetty-util:jar:6.1.14:compile
> [INFO]    +- tomcat:jasper-runtime:jar:5.5.12:compile
> [INFO]    +- tomcat:jasper-compiler:jar:5.5.12:compile
> [INFO]    +- org.mortbay.jetty:jsp-api-2.1:jar:6.1.14:compile
> [INFO]    +- org.mortbay.jetty:jsp-2.1:jar:6.1.14:compile
> [INFO]    |  \- ant:ant:jar:1.6.5:compile
> [INFO]    +- commons-el:commons-el:jar:1.0:compile
> [INFO]    +- net.java.dev.jets3t:jets3t:jar:0.7.1:compile
> [INFO]    +- org.mortbay.jetty:servlet-api-2.5:jar:6.1.14:compile
> [INFO]    +- net.sf.kosmosfs:kfs:jar:0.3:compile
> [INFO]    +- hsqldb:hsqldb:jar:1.8.0.10:compile
> [INFO]    +- oro:oro:jar:2.0.8:compile
> [INFO]    \- org.eclipse.jdt:core:jar:3.1.1:compile
> 
> I still get the same error. Is there anything specific I need to do other than
> changing dependencies in pom.xml to make this error go away?
> 
> On Mon, Mar 19, 2012 at 9:12 PM, Tatu Saloranta <ts...@gmail.com> wrote:
>> On Mon, Mar 19, 2012 at 6:06 PM, Scott Carey <sc...@apache.org> wrote:
>>> > What version of Avro are you using?
>>> >
>>> > You may want to try Avro 1.6.3 + Jackson 1.8.8.
>>> >
>>> > This is related, but is not your exact problem.
>>> > https://issues.apache.org/jira/browse/AVRO-1037
>>> >
>>> > You are likely pulling in some other version of jackson somehow.  You may
>>> > want to use 'mvn dependency:tree' on your project to see where all the
>>> > dependencies are coming from.  That may help identify the culprit.
>> 
>> This sounds like a good idea, and I agree that this is probably
>> still due to an old version lurking somewhere.
>> 
>> -+ Tatu +-
> 
> 



Re: Java MapReduce Avro Jackson Error

Posted by Deepak Nettem <de...@gmail.com>.
Hi Tatu / Scott,

Thanks for your replies. I replaced the earlier dependencies with these:

   <dependency>
    <groupId>org.apache.avro</groupId>
    <artifactId>avro-tools</artifactId>
    <version>1.6.3</version>
    </dependency>

    <dependency>
    <groupId>org.apache.avro</groupId>
    <artifactId>avro</artifactId>
    <version>1.6.3</version>
    </dependency>

    <dependency>
    <groupId>org.codehaus.jackson</groupId>
      <artifactId>jackson-mapper-asl</artifactId>
      <version>1.8.8</version>
      <scope>compile</scope>
    </dependency>

    <dependency>
    <groupId>org.codehaus.jackson</groupId>
      <artifactId>jackson-core-asl</artifactId>
      <version>1.8.8</version>
      <scope>compile</scope>
    </dependency>

And this is my app dependency tree:

[INFO] --- maven-dependency-plugin:2.1:tree (default-cli) @ AvroTest ---
[INFO] org.avrotest:AvroTest:jar:1.0-SNAPSHOT
[INFO] +- junit:junit:jar:3.8.1:test (scope not updated to compile)
[INFO] +- org.codehaus.jackson:jackson-mapper-asl:jar:1.8.8:compile
[INFO] +- org.codehaus.jackson:jackson-core-asl:jar:1.8.8:compile
[INFO] +- net.sf.json-lib:json-lib:jar:jdk15:2.3:compile
[INFO] |  +- commons-beanutils:commons-beanutils:jar:1.8.0:compile
[INFO] |  +- commons-collections:commons-collections:jar:3.2.1:compile
[INFO] |  +- commons-lang:commons-lang:jar:2.4:compile
[INFO] |  +- commons-logging:commons-logging:jar:1.1.1:compile
[INFO] |  \- net.sf.ezmorph:ezmorph:jar:1.0.6:compile
[INFO] +- org.apache.avro:avro-tools:jar:1.6.3:compile
[INFO] |  \- org.slf4j:slf4j-api:jar:1.6.4:compile
[INFO] +- org.apache.avro:avro:jar:1.6.3:compile
[INFO] |  +- com.thoughtworks.paranamer:paranamer:jar:2.3:compile
[INFO] |  \- org.xerial.snappy:snappy-java:jar:1.0.4.1:compile
[INFO] \- org.apache.hadoop:hadoop-core:jar:0.20.2:compile
[INFO]    +- commons-cli:commons-cli:jar:1.2:compile
[INFO]    +- xmlenc:xmlenc:jar:0.52:compile
[INFO]    +- commons-httpclient:commons-httpclient:jar:3.0.1:compile
[INFO]    +- commons-codec:commons-codec:jar:1.3:compile
[INFO]    +- commons-net:commons-net:jar:1.4.1:compile
[INFO]    +- org.mortbay.jetty:jetty:jar:6.1.14:compile
[INFO]    +- org.mortbay.jetty:jetty-util:jar:6.1.14:compile
[INFO]    +- tomcat:jasper-runtime:jar:5.5.12:compile
[INFO]    +- tomcat:jasper-compiler:jar:5.5.12:compile
[INFO]    +- org.mortbay.jetty:jsp-api-2.1:jar:6.1.14:compile
[INFO]    +- org.mortbay.jetty:jsp-2.1:jar:6.1.14:compile
[INFO]    |  \- ant:ant:jar:1.6.5:compile
[INFO]    +- commons-el:commons-el:jar:1.0:compile
[INFO]    +- net.java.dev.jets3t:jets3t:jar:0.7.1:compile
[INFO]    +- org.mortbay.jetty:servlet-api-2.5:jar:6.1.14:compile
[INFO]    +- net.sf.kosmosfs:kfs:jar:0.3:compile
[INFO]    +- hsqldb:hsqldb:jar:1.8.0.10:compile
[INFO]    +- oro:oro:jar:2.0.8:compile
[INFO]    \- org.eclipse.jdt:core:jar:3.1.1:compile

I still get the same error. Is there anything specific I need to do other
than changing dependencies in pom.xml to make this error go away?

On Mon, Mar 19, 2012 at 9:12 PM, Tatu Saloranta <ts...@gmail.com>wrote:

> On Mon, Mar 19, 2012 at 6:06 PM, Scott Carey <sc...@apache.org>
> wrote:
> > What version of Avro are you using?
> >
> > You may want to try Avro 1.6.3 + Jackson 1.8.8.
> >
> > This is related, but is not your exact problem.
> > https://issues.apache.org/jira/browse/AVRO-1037
> >
> > You are likely pulling in some other version of jackson somehow.  You may
> > want to use 'mvn dependency:tree' on your project to see where all the
> > dependencies are coming from.  That may help identify the culprit.
>
> This sounds like a good idea, and I agree that this is probably
> still due to an old version lurking somewhere.
>
> -+ Tatu +-
>

Re: Java MapReduce Avro Jackson Error

Posted by Tatu Saloranta <ts...@gmail.com>.
On Mon, Mar 19, 2012 at 6:06 PM, Scott Carey <sc...@apache.org> wrote:
> What version of Avro are you using?
>
> You may want to try Avro 1.6.3 + Jackson 1.8.8.
>
> This is related, but is not your exact problem.
> https://issues.apache.org/jira/browse/AVRO-1037
>
> You are likely pulling in some other version of jackson somehow.  You may
> want to use 'mvn dependency:tree' on your project to see where all the
> dependencies are coming from.  That may help identify the culprit.

This sounds like a good idea, and I agree that this is probably
still due to an old version lurking somewhere.

-+ Tatu +-

Re: Java MapReduce Avro Jackson Error

Posted by Scott Carey <sc...@apache.org>.
What version of Avro are you using?

You may want to try Avro 1.6.3 + Jackson 1.8.8.

This is related, but is not your exact problem.
https://issues.apache.org/jira/browse/AVRO-1037
 
You are likely pulling in some other version of jackson somehow.  You may
want to use 'mvn dependency:tree' on your project to see where all the
dependencies are coming from.  That may help identify the culprit.

-Scott

On 3/19/12 5:06 PM, "Deepak Nettem" <de...@gmail.com> wrote:


>Sorry,
>
>I meant, I added the jackson-core-asl dependency, and still get the error.
>
>    <dependency>
>    <groupId>org.codehaus.jackson</groupId>
>      <artifactId>jackson-core-asl</artifactId>
>      <version>1.5.2</version>
>      <scope>compile</scope>
>    </dependency>
>
>
>On Mon, Mar 19, 2012 at 8:05 PM, Deepak Nettem <de...@gmail.com>
>wrote:
>
>Hi Tatu,
>
>I added the dependency:
>
><dependency>    
>    <groupId>org.codehaus.jackson</groupId>
>      <artifactId>jackson-mapper-asl</artifactId>
>      <version>1.5.2</version>
>      <scope>compile</scope>
>    </dependency>
>
>
>But that still gives me this error:
>
>Error: 
>org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$F
>eature;)Lorg/codehaus/jackson/JsonFactory;
>
>Any other ideas?
>
>
>On Mon, Mar 19, 2012 at 7:27 PM, Tatu Saloranta <ts...@gmail.com>
>wrote:
>
>On Mon, Mar 19, 2012 at 4:20 PM, Deepak Nettem <de...@gmail.com>
>wrote:
>> I found that the Hadoop lib directory contains
>>jackson-core-asl-1.0.1.jar
>> and jackson-mapper-asl-1.0.1.jar.
>>
>> I removed these, but got this error:
>> hadoop Exception in thread "main" java.lang.NoClassDefFoundError:
>> org/codehaus/jackson/map/JsonMappingException
>>
>> I am using Maven as a build tool, and my pom.xml has this dependency:
>>
>>     <dependency>
>>     <groupId>org.codehaus.jackson</groupId>
>>       <artifactId>jackson-mapper-asl</artifactId>
>>       <version>1.5.2</version>
>>       <scope>compile</scope>
>>     </dependency>
>>
>> > Any help on this issue would be greatly appreciated.
>
>
>You may want to add a similar entry for jackson-core-asl -- mapper does
>require core, and although there is a transitive dependency from mapper,
>Maven does not necessarily enforce the correct version.
>So it is best to add an explicit dependency so that the version of core is
>also 1.5.x; you may otherwise just get 1.0.1 of that one.
>
>-+ Tatu +-
>
>
>
>
>
>
>
>



Re: Java MapReduce Avro Jackson Error

Posted by Deepak Nettem <de...@gmail.com>.
Sorry,

I meant, I added the jackson-core-asl dependency, and still get the error.

    <dependency>
    <groupId>org.codehaus.jackson</groupId>
      <artifactId>jackson-core-asl</artifactId>
      <version>1.5.2</version>
      <scope>compile</scope>
    </dependency>


On Mon, Mar 19, 2012 at 8:05 PM, Deepak Nettem <de...@gmail.com>wrote:

> Hi Tatu,
>
> I added the dependency:
>
>
> <dependency>
>     <groupId>org.codehaus.jackson</groupId>
>       <artifactId>jackson-mapper-asl</artifactId>
>       <version>1.5.2</version>
>       <scope>compile</scope>
>     </dependency>
>
> But that still gives me this error:
>
> Error:
> org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;
>
> Any other ideas?
>
>
>
> On Mon, Mar 19, 2012 at 7:27 PM, Tatu Saloranta <ts...@gmail.com>wrote:
>
>> On Mon, Mar 19, 2012 at 4:20 PM, Deepak Nettem <de...@gmail.com>
>> wrote:
>> > I found that the Hadoop lib directory contains
>> jackson-core-asl-1.0.1.jar
>> > and jackson-mapper-asl-1.0.1.jar.
>> >
>> > I removed these, but got this error:
>> > hadoop Exception in thread "main" java.lang.NoClassDefFoundError:
>> > org/codehaus/jackson/map/JsonMappingException
>> >
>> > I am using Maven as a build tool, and my pom.xml has this dependency:
>> >
>> >     <dependency>
>> >     <groupId>org.codehaus.jackson</groupId>
>> >       <artifactId>jackson-mapper-asl</artifactId>
>> >       <version>1.5.2</version>
>> >       <scope>compile</scope>
>> >     </dependency>
>> >
>> > Any help on this issue would be greatly appreciated.
>>
>> You may want to add a similar entry for jackson-core-asl -- mapper does
>> require core, and although there is a transitive dependency from mapper,
>> Maven does not necessarily enforce the correct version.
>> So it is best to add an explicit dependency so that the version of core is
>> also 1.5.x; you may otherwise just get 1.0.1 of that one.
>>
>> -+ Tatu +-
>>
>

Re: Java MapReduce Avro Jackson Error

Posted by Deepak Nettem <de...@gmail.com>.
Hi Tatu,

I added the dependency:

<dependency>
    <groupId>org.codehaus.jackson</groupId>
      <artifactId>jackson-mapper-asl</artifactId>
      <version>1.5.2</version>
      <scope>compile</scope>
    </dependency>

But that still gives me this error:

Error:
org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;

Any other ideas?


On Mon, Mar 19, 2012 at 7:27 PM, Tatu Saloranta <ts...@gmail.com>wrote:

> On Mon, Mar 19, 2012 at 4:20 PM, Deepak Nettem <de...@gmail.com>
> wrote:
> > I found that the Hadoop lib directory contains jackson-core-asl-1.0.1.jar
> > and jackson-mapper-asl-1.0.1.jar.
> >
> > I removed these, but got this error:
> > hadoop Exception in thread "main" java.lang.NoClassDefFoundError:
> > org/codehaus/jackson/map/JsonMappingException
> >
> > I am using Maven as a build tool, and my pom.xml has this dependency:
> >
> >     <dependency>
> >     <groupId>org.codehaus.jackson</groupId>
> >       <artifactId>jackson-mapper-asl</artifactId>
> >       <version>1.5.2</version>
> >       <scope>compile</scope>
> >     </dependency>
> >
> > Any help on this issue would be greatly appreciated.
>
> You may want to add a similar entry for jackson-core-asl -- mapper does
> require core, and although there is a transitive dependency from mapper,
> Maven does not necessarily enforce the correct version.
> So it is best to add an explicit dependency so that the version of core is
> also 1.5.x; you may otherwise just get 1.0.1 of that one.
>
> -+ Tatu +-
>

Re: Java MapReduce Avro Jackson Error

Posted by Tatu Saloranta <ts...@gmail.com>.
On Mon, Mar 19, 2012 at 4:20 PM, Deepak Nettem <de...@gmail.com> wrote:
> I found that the Hadoop lib directory contains jackson-core-asl-1.0.1.jar
> and jackson-mapper-asl-1.0.1.jar.
>
> I removed these, but got this error:
> hadoop Exception in thread "main" java.lang.NoClassDefFoundError:
> org/codehaus/jackson/map/JsonMappingException
>
> I am using Maven as a build tool, and my pom.xml has this dependency:
>
>     <dependency>
>     <groupId>org.codehaus.jackson</groupId>
>       <artifactId>jackson-mapper-asl</artifactId>
>       <version>1.5.2</version>
>       <scope>compile</scope>
>     </dependency>
>
> Any help on this issue would be greatly appreciated.

You may want to add a similar entry for jackson-core-asl -- mapper does
require core, and although there is a transitive dependency from mapper,
Maven does not necessarily enforce the correct version.
So it is best to add an explicit dependency so that the version of core is
also 1.5.x; you may otherwise just get 1.0.1 of that one.

-+ Tatu +-
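Another way to make sure every module resolves the same Jackson version, wherever it enters the dependency tree, is a Maven dependencyManagement section. This is a sketch, not from the thread; the version shown is the one suggested above:

```xml
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.codehaus.jackson</groupId>
      <artifactId>jackson-core-asl</artifactId>
      <version>1.5.2</version>
    </dependency>
    <dependency>
      <groupId>org.codehaus.jackson</groupId>
      <artifactId>jackson-mapper-asl</artifactId>
      <version>1.5.2</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```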

Re: Java MapReduce Avro Jackson Error

Posted by Deepak Nettem <de...@gmail.com>.
Hi Ken,

Yes, I restarted the daemons. When I remove the Jackson jars from /lib, I
get this error:

Exception in thread "main" java.lang.NoClassDefFoundError:
org/codehaus/jackson/map/JsonMappingException

Deepak

On Mon, Mar 19, 2012 at 10:51 PM, Ken Krugler
<kk...@transpac.com>wrote:

>
> On Mar 19, 2012, at 4:20pm, Deepak Nettem wrote:
>
> I found that the Hadoop lib directory contains jackson-core-asl-1.0.1.jar
> and jackson-mapper-asl-1.0.1.jar.
>
> I removed these, but got this error:
> hadoop Exception in thread "main" java.lang.NoClassDefFoundError:
> org/codehaus/jackson/map/JsonMappingException
>
>
> Just confirming that you restarted the Hadoop daemons after removing these
> older Jackson jars.
>
> -- Ken
>
>
> I am using Maven as a build tool, and my pom.xml has this dependency:
>
>     <dependency>
>     <groupId>org.codehaus.jackson</groupId>
>       <artifactId>jackson-mapper-asl</artifactId>
>       <version>1.5.2</version>
>       <scope>compile</scope>
>     </dependency>
>
> Any help on this issue would be greatly appreciated.
>
> Best,
> Deepak
>
> On Mon, Mar 19, 2012 at 6:48 PM, Deepak Nettem <de...@gmail.com>wrote:
>
>> When I include some Avro code in my Mapper, I get this error:
>>
>> Error:
>> org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;
>>
>> Particularly, just these two lines of code:
>>
>>             InputStream in =
>> getClass().getResourceAsStream("schema.avsc");
>>             Schema schema = Schema.parse(in);
>>
>> This code works perfectly when run as a stand alone application outside
>> of Hadoop. Why do I get this error? and what's the best way to get rid of
>> it?
>>
>> I am using Hadoop 0.20.2, and writing code in the new API.
>>
>
> --------------------------
> Ken Krugler
> http://www.scaleunlimited.com
> custom big data solutions & training
> Hadoop, Cascading, Mahout & Solr
>
>
>

Re: Java MapReduce Avro Jackson Error

Posted by Ken Krugler <kk...@transpac.com>.
On Mar 19, 2012, at 4:20pm, Deepak Nettem wrote:

> I found that the Hadoop lib directory contains jackson-core-asl-1.0.1.jar and jackson-mapper-asl-1.0.1.jar.
> 
> I removed these, but got this error: 
> hadoop Exception in thread "main" java.lang.NoClassDefFoundError: org/codehaus/jackson/map/JsonMappingException

Just confirming that you restarted the Hadoop daemons after removing these older Jackson jars.

-- Ken

> 
> I am using Maven as a build tool, and my pom.xml has this dependency:
> 
>     <dependency>    
>     <groupId>org.codehaus.jackson</groupId>
>       <artifactId>jackson-mapper-asl</artifactId>
>       <version>1.5.2</version>
>       <scope>compile</scope>
>     </dependency>
>     
> Any help on this issue would be greatly appreciated.
> 
> Best,
> Deepak
> 
> On Mon, Mar 19, 2012 at 6:48 PM, Deepak Nettem <de...@gmail.com> wrote:
> When I include some Avro code in my Mapper, I get this error:
> 
> Error: org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;
> 
> Particularly, just these two lines of code:
> 
>             InputStream in = getClass().getResourceAsStream("schema.avsc");            
>             Schema schema = Schema.parse(in);
> 
> This code works perfectly when run as a stand alone application outside of Hadoop. Why do I get this error? and what's the best way to get rid of it?
> 
> I am using Hadoop 0.20.2, and writing code in the new API.

--------------------------
Ken Krugler
http://www.scaleunlimited.com
custom big data solutions & training
Hadoop, Cascading, Mahout & Solr





Re: Java MapReduce Avro Jackson Error

Posted by Deepak Nettem <de...@gmail.com>.
I found that the Hadoop lib directory contains jackson-core-asl-1.0.1.jar
and jackson-mapper-asl-1.0.1.jar.

I removed these, but got this error:
hadoop Exception in thread "main" java.lang.NoClassDefFoundError:
org/codehaus/jackson/map/JsonMappingException

I am using Maven as a build tool, and my pom.xml has this dependency:

    <dependency>
    <groupId>org.codehaus.jackson</groupId>
      <artifactId>jackson-mapper-asl</artifactId>
      <version>1.5.2</version>
      <scope>compile</scope>
    </dependency>

Any help on this issue would be greatly appreciated.

Best,
Deepak

On Mon, Mar 19, 2012 at 6:48 PM, Deepak Nettem <de...@gmail.com>wrote:

> When I include some Avro code in my Mapper, I get this error:
>
> Error:
> org.codehaus.jackson.JsonFactory.enable(Lorg/codehaus/jackson/JsonParser$Feature;)Lorg/codehaus/jackson/JsonFactory;
>
> Particularly, just these two lines of code:
>
>             InputStream in =
> getClass().getResourceAsStream("schema.avsc");
>             Schema schema = Schema.parse(in);
>
> This code works perfectly when run as a stand alone application outside of
> Hadoop. Why do I get this error? and what's the best way to get rid of it?
>
> I am using Hadoop 0.20.2, and writing code in the new API.
>