Posted to user@mahout.apache.org by Phoenix Bai <ba...@gmail.com> on 2012/09/14 04:15:12 UTC

hadoop-0.19 and mahout 0.7: throwing incompatible errors, how can I fix it?

Hi guys,

I am trying to run my application code, built against Mahout 0.7, on a Hadoop 0.19 cluster.
When I run it, it throws the errors below:

$ hadoop jar cluster-0.0.1-SNAPSHOT-jar-with-dependencies.jar mahout.sample.ClusterVideos
12/09/13 20:36:18 INFO vectorizer.SparseVectorsFromSequenceFiles: Maximum n-gram size is: 1
12/09/13 20:36:31 INFO vectorizer.SparseVectorsFromSequenceFiles: Minimum LLR value: 1.0
12/09/13 20:36:31 INFO vectorizer.SparseVectorsFromSequenceFiles: Number of reduce tasks: 1
java.lang.VerifyError: (class: org/apache/hadoop/mapreduce/Job, method: submit signature: ()V) Incompatible argument to function
        at org.apache.mahout.vectorizer.DocumentProcessor.tokenizeDocuments(DocumentProcessor.java:78)
        at org.apache.mahout.vectorizer.SparseVectorsFromSequenceFiles.run(SparseVectorsFromSequenceFiles.java:253)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
        at org.apache.mahout.vectorizer.SparseVectorsFromSequenceFiles.main(SparseVectorsFromSequenceFiles.java:55)
        at mahout.sample.ClusterVideos.runSeq2Sparse(ClusterVideos.java:133)
        at mahout.sample.ClusterVideos.main(ClusterVideos.java:54)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:165)
        at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
        at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)


This is due to an incompatibility between Hadoop 0.19 and Mahout 0.7, right?
So, how can I fix it?
I can't upgrade Hadoop 0.19, because that decision is not up to me,
and I don't want to fall back to Mahout 0.5 either, because then I might
have to rewrite my application code.

So, is there any way to solve this, for example through a patch or something similar?

Thanks
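
For anyone hitting the same VerifyError: it usually signals a binary mismatch, i.e. the bytecode Mahout 0.7 was compiled against (the org.apache.hadoop.mapreduce.Job API) does not match the Job class that this particular 0.19 installation puts on the classpath, so the JVM refuses to link the Job.submit() call. The sketch below only illustrates the two MapReduce submission styles in general; the class name, job names, and paths are made up for illustration and are not taken from Mahout's code.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapreduce.Job;

public class SubmitStyles {

  // Old-style submission through the org.apache.hadoop.mapred API,
  // which is the API a 0.19-era cluster was designed around.
  static void submitWithOldApi() throws Exception {
    JobConf conf = new JobConf(SubmitStyles.class);
    conf.setJobName("old-api-example");
    org.apache.hadoop.mapred.FileInputFormat.setInputPaths(conf, new Path("in"));
    org.apache.hadoop.mapred.FileOutputFormat.setOutputPath(conf, new Path("out"));
    JobClient.runJob(conf); // blocks until the job finishes
  }

  // New-style submission through org.apache.hadoop.mapreduce.Job, the
  // API Mahout 0.7 is compiled against. If the Job class found on the
  // cluster classpath has different method signatures than the one
  // Mahout saw at compile time, the JVM rejects the linkage with a
  // VerifyError like the one in the stack trace above.
  static void submitWithNewApi() throws Exception {
    Job job = new Job(new Configuration(), "new-api-example");
    job.setJarByClass(SubmitStyles.class);
    org.apache.hadoop.mapreduce.lib.input.FileInputFormat.addInputPath(job, new Path("in"));
    org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.setOutputPath(job, new Path("out"));
    // waitForCompletion() calls Job.submit() internally; that submit()
    // is the call flagged in the VerifyError.
    job.waitForCompletion(true);
  }
}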

Re: hadoop-0.19 and mahout 0.7: throwing incompatible errors, how can I fix it?

Posted by Phoenix Bai <ba...@gmail.com>.
My admin added a patch to our current Hadoop 0.19, and it now works
perfectly with Mahout 0.7.
By the way, thank you all for your concern!
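
If anyone wants to double-check which Hadoop jar a job actually links against after a cluster-side patch like this, one simple check is to print where the Job class was loaded from. This is a generic, hypothetical helper (the class name WhichHadoopJar is made up), not anything Mahout ships:

import java.security.CodeSource;
import org.apache.hadoop.mapreduce.Job;

public class WhichHadoopJar {
  public static void main(String[] args) {
    // The code source tells us which jar (or directory) supplied the
    // org.apache.hadoop.mapreduce.Job bytecode, i.e. whose signatures
    // the verifier was comparing Mahout's compiled calls against.
    CodeSource src = Job.class.getProtectionDomain().getCodeSource();
    System.out.println(src == null
        ? "Job loaded from the bootstrap classpath"
        : src.getLocation().toString());
  }
}

Running it on the cluster with "hadoop jar yourjar.jar WhichHadoopJar" should print the path of whichever jar actually provides the Job class there.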

On Fri, Sep 21, 2012 at 10:58 PM, Ted Dunning <te...@gmail.com> wrote:

> [earlier messages in the thread quoted in full; snipped, see the replies below]

Re: hadoop-0.19 and mahout 0.7: throwing incompatible errors, how can I fix it?

Posted by Ted Dunning <te...@gmail.com>.
On the other hand, the only way that I have been able to do a major version
upgrade of Hadoop is to start a new company.

It is really hard to change code and platform at the same time.  If you
don't have enough hardware to have two clusters temporarily, things will be
really hard moving off of 0.19.

On Fri, Sep 21, 2012 at 9:44 AM, Mat Kelcey <ma...@gmail.com> wrote:

> I imagine the best use of your time and effort is to convince your admins
> that running a 3 year old version of hadoop is a bad idea. Things are only
> going to get worse...
> Mat
> On Sep 13, 2012 7:15 PM, "Phoenix Bai" <ba...@gmail.com> wrote:
> > [original message quoted in full; snipped]

Re: hadoop-0.19 and mahout 0.7: throwing incompatible errors, how can I fix it?

Posted by Mat Kelcey <ma...@gmail.com>.
I imagine the best use of your time and effort is to convince your admins
that running a 3-year-old version of Hadoop is a bad idea. Things are only
going to get worse...
Mat
On Sep 13, 2012 7:15 PM, "Phoenix Bai" <ba...@gmail.com> wrote:

> Hi guys,
>
> I am trying to compile my application code using mahout 0.7 and hadoop
> 0.19.
> during the compile process, it is throwing errors as below:
>
> $ hadoop jar cluster-0.0.1-SNAPSHOT-jar-with-dependencies.jar
> mahout.sample.ClusterVideos
> 12/09/13 20:36:18 INFO vectorizer.SparseVectorsFromSequenceFiles: Maximum
> n-gram size is: 1
> 12/09/13 20:36:31 INFO vectorizer.SparseVectorsFromSequenceFiles: Minimum
> LLR value: 1.0
> 12/09/13 20:36:31 INFO vectorizer.SparseVectorsFromSequenceFiles: Number of
> reduce tasks: 1
> java.lang.VerifyError: (class: org/apache/hadoop/mapreduce/Job, method:
> submit signature: ()V) Incompatible argument to function
> at
>
> org.apache.mahout.vectorizer.DocumentProcessor.tokenizeDocuments(DocumentProcessor.java:78)
>  at
>
> org.apache.mahout.vectorizer.SparseVectorsFromSequenceFiles.run(SparseVectorsFromSequenceFiles.java:253)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>  at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
> at
>
> org.apache.mahout.vectorizer.SparseVectorsFromSequenceFiles.main(SparseVectorsFromSequenceFiles.java:55)
>  at mahout.sample.ClusterVideos.runSeq2Sparse(ClusterVideos.java:133)
> at mahout.sample.ClusterVideos.main(ClusterVideos.java:54)
>  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at
>
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>  at
>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> at java.lang.reflect.Method.invoke(Method.java:597)
>  at org.apache.hadoop.util.RunJar.main(RunJar.java:165)
> at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
>  at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>  at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
>
>
> This is due to incompability between hadoop0.19 and mahout 0.7 right?
> so, how can I fix it?
> I can`t upgrade hadoop 0.19 because it is not up to me,
> and I don`t want to use mahout 0.5 either because, in that case, I might
> have to rewrite my application code.
>
> so, is there any way to solve this like through a patch or something?
>
> Thanks
>