Posted to user@pig.apache.org by Sharon Rapoport <sh...@plaid.com> on 2014/07/16 16:48:54 UTC

MongoStorage error

Hello,

I am trying to store an alias into mongo and getting an error. 

Example script:

REGISTER mongo-hadoop-pig-1.2.1-SNAPSHOT-hadoop_2.4.jar
REGISTER mongo-2.4.jar

categories_raw = LOAD 'mongodb://username:password@mydb...:27017/db.categories?authSource=admin' USING com.mongodb.hadoop.pig.MongoLoader();

Doing a DUMP showed the load went fine; I indeed have my data.
Now I'm just trying to store it back:

store categories_raw into 'mongodb://username:password@mydb...:27017/db.categories_pig?authSource=admin' USING com.mongodb.hadoop.pig.MongoInsertStorage(); 

I get this error:
FATAL org.apache.hadoop.mapred.Child (main): Error running child : java.lang.IncompatibleClassChangeError: Found class org.apache.hadoop.mapreduce.TaskAttemptContext, but interface was expected
	at com.mongodb.hadoop.MongoOutputFormat.getRecordWriter(MongoOutputFormat.java:48)
	at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.getRecordWriter(PigOutputFormat.java:84)
	at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:635)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:760)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:375)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1132)
	at org.apache.hadoop.mapred.Child.main(Child.java:249)
I am using EMR with 
Pig 0.11.1.1
AMI version 2.4.2, Hadoop 1.0.3

I am somewhat new to Pig… I switched after I'd had enough of Hive bugs, but I'm starting to regret that ;)

Any help is appreciated!

Sharon


Re: MongoStorage error

Posted by Sharon Rapoport <sh...@plaid.com>.
Thanks, I indeed solved the problem a few days ago.

In case someone bumps into this post in the future: I switched to Pig 0.12 and Hadoop 2.4.0, and used these jar files:
REGISTER mongo-java-driver-2.11.1.jar;
REGISTER mongo-hadoop-pig-1.4.0-SNAPSHOT.jar;
REGISTER mongo-hadoop-core-1.4.0-SNAPSHOT.jar;

Also note that the function isn't MongoStorage, but MongoInsertStorage.
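
For the sake of a complete example, here is a minimal sketch of the whole script as it looks after the switch; the LOAD/STORE lines are just the ones from my original post combined with the new REGISTERs, not a verbatim copy of my working script:

REGISTER mongo-java-driver-2.11.1.jar;
REGISTER mongo-hadoop-pig-1.4.0-SNAPSHOT.jar;
REGISTER mongo-hadoop-core-1.4.0-SNAPSHOT.jar;

-- load each document from the source collection
categories_raw = LOAD 'mongodb://username:password@mydb...:27017/db.categories?authSource=admin'
    USING com.mongodb.hadoop.pig.MongoLoader();

-- write the documents back out to a different collection
STORE categories_raw INTO 'mongodb://username:password@mydb...:27017/db.categories_pig?authSource=admin'
    USING com.mongodb.hadoop.pig.MongoInsertStorage();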



Re: MongoStorage error

Posted by Cheolsoo Park <pi...@gmail.com>.
>> java.lang.IncompatibleClassChangeError: Found class
org.apache.hadoop.mapreduce.TaskAttemptContext, but interface was expected

This error is caused by binary incompatibility between Hadoop 1 and Hadoop 2. I
suspect that the following jar that you registered is the culprit:

>> REGISTER mongo-hadoop-pig-1.2.1-SNAPSHOT-hadoop_2.4.jar

This conflicts with Hadoop 1.0.3 that you're using.
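
To spell out why: in Hadoop 1.x, org.apache.hadoop.mapreduce.TaskAttemptContext is a concrete class, while in Hadoop 2.x it became an interface. A connector jar compiled against the Hadoop 2.4 API therefore expects an interface at that call site and fails with exactly this IncompatibleClassChangeError when it runs on a Hadoop 1 cluster.

If you want to stay on Hadoop 1.0.3 rather than upgrade, the fix is to register a mongo-hadoop build compiled against Hadoop 1 instead of the hadoop_2.4 build. A rough sketch (the second jar name below is an assumption; the exact artifact name depends on how mongo-hadoop was built or downloaded):

REGISTER mongo-2.4.jar;
-- assumed artifact name: a Hadoop 1.x build of the Pig connector
REGISTER mongo-hadoop-pig-1.2.1-SNAPSHOT-hadoop_1.0.jar;

The rest of the script stays the same.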

