Posted to mapreduce-user@hadoop.apache.org by Ben Rycroft <br...@gmail.com> on 2012/10/03 13:21:46 UTC

Lib conflicts

Hi all,

I have a jar that uses the Hadoop API to launch various remote mapreduce
jobs (i.e., I'm not using the command line to initiate the job). The service
jar that executes the various jobs is built with Maven's
"jar-with-dependencies" assembly.

My jobs all run fine except one that uses commons-codec 1.7, for which I get:

FATAL org.apache.hadoop.mapred.Child: Error running child :
java.lang.NoSuchMethodError:
org.apache.commons.codec.binary.Base64.encodeAsString([B)Ljava/lang/String;

I think this is because my jar includes commons-codec 1.7 whereas my
Hadoop install's lib has commons-codec 1.4.

Is there any way to instruct Hadoop to use the distributed commons-codec
1.7 (I assume this is distributed as a job dependency) rather than the
commons-codec 1.4 in the Hadoop 1.0.3 core lib?

Removing commons-codec-1.4.jar from my Hadoop library folder did seem to
solve the problem for a while, but it is not working on another VM. Replacing
the 1.4 jar with the 1.7 jar does seem to fix the problem, but this doesn't
seem too sane. Hopefully there is a better alternative.
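One way to confirm which copy of the library actually won at runtime is to ask the JVM where it loaded the class from. A minimal, illustrative sketch (the class and method names here are made up; in a real task you would probe org.apache.commons.codec.binary.Base64 instead of the sketch class itself):

```java
import java.security.CodeSource;

public class WhichJar {
    // Returns the jar or directory a class was loaded from; classes on the
    // bootstrap class path report a null CodeSource.
    static String locationOf(Class<?> c) {
        CodeSource src = c.getProtectionDomain().getCodeSource();
        return src == null ? "(bootstrap class path)" : src.getLocation().toString();
    }

    public static void main(String[] args) {
        // Substitute the conflicting class in a real job, e.g.
        // org.apache.commons.codec.binary.Base64.class.
        System.out.println(locationOf(WhichJar.class));
    }
}
```

Run inside the failing task (for example from the mapper's setup method) to see whether the 1.4 or the 1.7 jar supplied the class.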

Thanks!

Re: Lib conflicts

Posted by Steve Loughran <st...@hortonworks.com>.
On 3 October 2012 12:21, Ben Rycroft <br...@gmail.com> wrote:

> Hi all,
>
> I have a jar that uses the Hadoop API to launch various remote mapreduce
> jobs (ie, im not using the command-line to initiate the job). The service
> jar that executes the various jobs is built with maven's
> "jar-with-dependencies".
>
> My jobs all run fine except one that uses commons-codec 1.7, I get:
>
> FATAL org.apache.hadoop.mapred.Child: Error running child :
> java.lang.NoSuchMethodError:
> org.apache.commons.codec.binary.Base64.encodeAsString([B)Ljava/lang/String;
>
> I think this is because my jar is including commons-codec 1.7 whereas my
> Hadoop install's lib has commons-codec 1.4 ...
>
> Is their any way to instruct Hadoop to use the distributed commons-codec
> 1.7 (I assume this is distributed as a job dependency) rather than the
> commons-codec 1.4 in the hadoop 1.0.3 core lib?
>
>
>
Maven's urge to do recursive resolution doesn't help. Play with exclusions
to force the 1.4 out.

FWIW, Ivy can be configured to fail if there is any version conflict down
the tree, rather than apply a policy such as "latest wins" or "closest to
root wins"; I don't know if m2 has a similar option.
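Maven does in fact have a comparable check: the maven-enforcer-plugin's dependencyConvergence rule fails the build when two versions of the same artifact meet in the dependency tree. A sketch of both the exclusion and the rule as POM fragments (coordinates and versions are illustrative; adjust to your build):

```xml
<!-- Keep the transitive commons-codec 1.4 out of the assembled job jar. -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-core</artifactId>
  <version>1.0.3</version>
  <exclusions>
    <exclusion>
      <groupId>commons-codec</groupId>
      <artifactId>commons-codec</artifactId>
    </exclusion>
  </exclusions>
</dependency>

<!-- Rough equivalent of Ivy's fail-on-conflict behaviour: fail the build
     whenever two versions of one artifact appear in the tree. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <executions>
    <execution>
      <id>enforce-convergence</id>
      <goals><goal>enforce</goal></goals>
      <configuration>
        <rules>
          <dependencyConvergence/>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>
```

Note that the exclusion only controls what ends up inside the job jar; the jar that Hadoop's own lib directory puts on the task classpath is a separate question.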

Re: Lib conflicts

Posted by Steve Loughran <st...@hortonworks.com>.
On 3 October 2012 12:21, Ben Rycroft <br...@gmail.com> wrote:

> Hi all,
>
> I have a jar that uses the Hadoop API to launch various remote mapreduce
> jobs (ie, im not using the command-line to initiate the job). The service
> jar that executes the various jobs is built with maven's
> "jar-with-dependencies".
>
> My jobs all run fine except one that uses commons-codec 1.7, I get:
>
> FATAL org.apache.hadoop.mapred.Child: Error running child :
> java.lang.NoSuchMethodError:
> org.apache.commons.codec.binary.Base64.encodeAsString([B)Ljava/lang/String;
>
> I think this is because my jar is including commons-codec 1.7 whereas my
> Hadoop install's lib has commons-codec 1.4 ...
>
> Is their any way to instruct Hadoop to use the distributed commons-codec
> 1.7 (I assume this is distributed as a job dependency) rather than the
> commons-codec 1.4 in the hadoop 1.0.3 core lib?
>
>
>
Maven's urge to do recursive resolution doesn't help. Play with exclusion
to force the 1.4 out.

FWIW Ivy can be configured to fail if there is any version conflict down
the tree, rather than apply a policy such as "latest wins" or "closest to
root wins"; I don't know if m2 has a similar option

Re: Lib conflicts

Posted by Michael Segel <mi...@hotmail.com>.
Yup, I hate it when that happens. 

You tend to see this more with Avro than anything else. 

The issue is that in Java, the first class loaded wins. So when Hadoop loads 1.4 first, you can't unload it and replace it with 1.7. 

The only solution that we found to be workable is to replace the jars on all of the nodes in the cluster with the latest version. 
The only drawback is if there are incompatibilities between releases. 

The other option is to roll back to the earlier version. (Yeah, I know: you need that method, which is why I made the first recommendation. ;-)
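The jar-replacement approach can be scripted. A dry-run sketch that only prints the commands it would run; the host names and paths are made up, so substitute your own and then drop the "echo"s (or pipe the output to sh):

```shell
HOSTS="node1 node2 node3"              # hypothetical node list
NEW_JAR=commons-codec-1.7.jar          # jar to push out
HADOOP_LIB=/usr/lib/hadoop/lib         # hypothetical Hadoop lib dir

# Print, for each node, the copy command and the removal of the old jar.
for host in $HOSTS; do
  echo "scp $NEW_JAR $host:$HADOOP_LIB/"
  echo "ssh $host rm -f $HADOOP_LIB/commons-codec-1.4.jar"
done
```

Restart the TaskTrackers afterwards so the new jar is picked up.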

HTH

-Mike

On Oct 3, 2012, at 6:21 AM, Ben Rycroft <br...@gmail.com> wrote:

> Hi all,
> 
> I have a jar that uses the Hadoop API to launch various remote mapreduce jobs (ie, im not using the command-line to initiate the job). The service jar that executes the various jobs is built with maven's "jar-with-dependencies".
> 
> My jobs all run fine except one that uses commons-codec 1.7, I get:
> 
> FATAL org.apache.hadoop.mapred.Child: Error running child : java.lang.NoSuchMethodError: org.apache.commons.codec.binary.Base64.encodeAsString([B)Ljava/lang/String;
> 
> I think this is because my jar is including commons-codec 1.7 whereas my Hadoop install's lib has commons-codec 1.4 ...
> 
> Is their any way to instruct Hadoop to use the distributed commons-codec 1.7 (I assume this is distributed as a job dependency) rather than the commons-codec 1.4 in the hadoop 1.0.3 core lib?
> 
> Removing commons-codec-1.4.jar from my Hadoop library folder did seem to solve the problem for a bit, but is not working on another VM. Replacing the 1.4 jar with the 1.7 does seem to fix the problem but this doesn't seem too sane. Hopefully there is a better alternative.
> 
> Thanks!
> 


Re: Lib conflicts

Posted by Jay Vyas <ja...@gmail.com>.
Yup! I hate this issue. It also happens with JSON libs if you have an old
Hadoop!

It's really easy to dump the exact classpath at runtime using the

((URLClassLoader) ClassLoader.getSystemClassLoader()).getURLs()

trick... very useful :)

I always do this just to make sure before going and mucking with the
classpath args.

I've found that this trick can save a lot of headaches, especially if you're
dealing with jars that you can't programmatically get version numbers out of.
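That one-liner can be wrapped into a small utility. A minimal sketch (the class name is made up; note that on Java 9 and later the system class loader is no longer a URLClassLoader, so the sketch falls back to the java.class.path property):

```java
import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;

public class ClasspathDump {
    public static void main(String[] args) {
        ClassLoader cl = ClassLoader.getSystemClassLoader();
        if (cl instanceof URLClassLoader) {
            // Java 8 and earlier: the system loader exposes its URLs directly.
            for (URL url : ((URLClassLoader) cl).getURLs()) {
                System.out.println(url);
            }
        } else {
            // Java 9+: read the classpath from the system property instead.
            for (String entry : System.getProperty("java.class.path")
                                      .split(File.pathSeparator)) {
                System.out.println(entry);
            }
        }
    }
}
```

Dropping a call like this into the task's setup code shows exactly which jars the child JVM can see, and in what order.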

On Wed, Oct 3, 2012 at 7:45 AM, Ben Rycroft <br...@gmail.com> wrote:
>
> Hi guys,
>
> Thanks the speedy replies.
>
> I will try Harsh's suggestion and see if this solves the issue, if not I
will just do what Michael suggested and replace the jars on each of the
nodes.
>
> Thanks again!
>
>
> On Wed, Oct 3, 2012 at 12:39 PM, Harsh J <ha...@cloudera.com> wrote:
>>
>> Hi Ben,
>>
>> As long as the switch of libraries doesn't impact the execution of the
>> Child task code itself, for Apache Hadoop 1.x, using the config
>> "mapreduce.user.classpath.first" set to true may solve your trouble.
>>
>> On Wed, Oct 3, 2012 at 4:51 PM, Ben Rycroft <br...@gmail.com> wrote:
>> > Hi all,
>> >
>> > I have a jar that uses the Hadoop API to launch various remote
mapreduce
>> > jobs (ie, im not using the command-line to initiate the job). The
service
>> > jar that executes the various jobs is built with maven's
>> > "jar-with-dependencies".
>> >
>> > My jobs all run fine except one that uses commons-codec 1.7, I get:
>> >
>> > FATAL org.apache.hadoop.mapred.Child: Error running child :
>> > java.lang.NoSuchMethodError:
>> >
org.apache.commons.codec.binary.Base64.encodeAsString([B)Ljava/lang/String;
>> >
>> > I think this is because my jar is including commons-codec 1.7 whereas
my
>> > Hadoop install's lib has commons-codec 1.4 ...
>> >
>> > Is their any way to instruct Hadoop to use the distributed
commons-codec 1.7
>> > (I assume this is distributed as a job dependency) rather than the
>> > commons-codec 1.4 in the hadoop 1.0.3 core lib?
>> >
>> > Removing commons-codec-1.4.jar from my Hadoop library folder did seem
to
>> > solve the problem for a bit, but is not working on another VM.
Replacing the
>> > 1.4 jar with the 1.7 does seem to fix the problem but this doesn't
seem too
>> > sane. Hopefully there is a better alternative.
>> >
>> > Thanks!
>>
>>
>>
>> --
>> Harsh J
>
>



--
Jay Vyas
MMSB/UCHC

Re: Lib conflicts

Posted by Ben Rycroft <br...@gmail.com>.
Hi guys,

Thanks for the speedy replies.

I will try Harsh's suggestion and see if this solves the issue; if not, I
will just do what Michael suggested and replace the jars on each of the
nodes.

Thanks again!

On Wed, Oct 3, 2012 at 12:39 PM, Harsh J <ha...@cloudera.com> wrote:

> Hi Ben,
>
> As long as the switch of libraries doesn't impact the execution of the
> Child task code itself, for Apache Hadoop 1.x, using the config
> "mapreduce.user.classpath.first" set to true may solve your trouble.
>
> On Wed, Oct 3, 2012 at 4:51 PM, Ben Rycroft <br...@gmail.com> wrote:
> > Hi all,
> >
> > I have a jar that uses the Hadoop API to launch various remote mapreduce
> > jobs (ie, im not using the command-line to initiate the job). The service
> > jar that executes the various jobs is built with maven's
> > "jar-with-dependencies".
> >
> > My jobs all run fine except one that uses commons-codec 1.7, I get:
> >
> > FATAL org.apache.hadoop.mapred.Child: Error running child :
> > java.lang.NoSuchMethodError:
> >
> org.apache.commons.codec.binary.Base64.encodeAsString([B)Ljava/lang/String;
> >
> > I think this is because my jar is including commons-codec 1.7 whereas my
> > Hadoop install's lib has commons-codec 1.4 ...
> >
> > Is their any way to instruct Hadoop to use the distributed commons-codec
> 1.7
> > (I assume this is distributed as a job dependency) rather than the
> > commons-codec 1.4 in the hadoop 1.0.3 core lib?
> >
> > Removing commons-codec-1.4.jar from my Hadoop library folder did seem to
> > solve the problem for a bit, but is not working on another VM. Replacing
> the
> > 1.4 jar with the 1.7 does seem to fix the problem but this doesn't seem
> too
> > sane. Hopefully there is a better alternative.
> >
> > Thanks!
>
>
>
> --
> Harsh J
>

Re: Lib conflicts

Posted by Harsh J <ha...@cloudera.com>.
Hi Ben,

As long as the switch of libraries doesn't impact the execution of the
Child task code itself, then for Apache Hadoop 1.x, setting the config
"mapreduce.user.classpath.first" to true may solve your trouble.

On Wed, Oct 3, 2012 at 4:51 PM, Ben Rycroft <br...@gmail.com> wrote:
> Hi all,
>
> I have a jar that uses the Hadoop API to launch various remote mapreduce
> jobs (ie, im not using the command-line to initiate the job). The service
> jar that executes the various jobs is built with maven's
> "jar-with-dependencies".
>
> My jobs all run fine except one that uses commons-codec 1.7, I get:
>
> FATAL org.apache.hadoop.mapred.Child: Error running child :
> java.lang.NoSuchMethodError:
> org.apache.commons.codec.binary.Base64.encodeAsString([B)Ljava/lang/String;
>
> I think this is because my jar is including commons-codec 1.7 whereas my
> Hadoop install's lib has commons-codec 1.4 ...
>
> Is their any way to instruct Hadoop to use the distributed commons-codec 1.7
> (I assume this is distributed as a job dependency) rather than the
> commons-codec 1.4 in the hadoop 1.0.3 core lib?
>
> Removing commons-codec-1.4.jar from my Hadoop library folder did seem to
> solve the problem for a bit, but is not working on another VM. Replacing the
> 1.4 jar with the 1.7 does seem to fix the problem but this doesn't seem too
> sane. Hopefully there is a better alternative.
>
> Thanks!



-- 
Harsh J

Re: Lib conflicts

Posted by Harsh J <ha...@cloudera.com>.
Hi Ben,

As long as the switch of libraries doesn't impact the execution of the
Child task code itself, for Apache Hadoop 1.x, using the config
"mapreduce.user.classpath.first" set to true may solve your trouble.

On Wed, Oct 3, 2012 at 4:51 PM, Ben Rycroft <br...@gmail.com> wrote:
> Hi all,
>
> I have a jar that uses the Hadoop API to launch various remote mapreduce
> jobs (ie, im not using the command-line to initiate the job). The service
> jar that executes the various jobs is built with maven's
> "jar-with-dependencies".
>
> My jobs all run fine except one that uses commons-codec 1.7, I get:
>
> FATAL org.apache.hadoop.mapred.Child: Error running child :
> java.lang.NoSuchMethodError:
> org.apache.commons.codec.binary.Base64.encodeAsString([B)Ljava/lang/String;
>
> I think this is because my jar is including commons-codec 1.7 whereas my
> Hadoop install's lib has commons-codec 1.4 ...
>
> Is their any way to instruct Hadoop to use the distributed commons-codec 1.7
> (I assume this is distributed as a job dependency) rather than the
> commons-codec 1.4 in the hadoop 1.0.3 core lib?
>
> Removing commons-codec-1.4.jar from my Hadoop library folder did seem to
> solve the problem for a bit, but is not working on another VM. Replacing the
> 1.4 jar with the 1.7 does seem to fix the problem but this doesn't seem too
> sane. Hopefully there is a better alternative.
>
> Thanks!



-- 
Harsh J

Re: Lib conflicts

Posted by Michael Segel <mi...@hotmail.com>.
Yup, I hate that when it happens. 

You tend to see this more with Avro than anything else. 

The issue is that in Java, the first class loaded wins. So when Hadoop loads 1.4 first, you can't unload it and replace it with 1.7. 

The only solution that we found to be workable is to replace the jars on all of the nodes in the cluster with the latest. 
The only drawback is if there are incompatibilities between releases. 

The other option is to roll back to the early version.  (Yeah I know. You need that method, which is why I made the first recommendation. ;-) 

HTH

-Mike

On Oct 3, 2012, at 6:21 AM, Ben Rycroft <br...@gmail.com> wrote:

> Hi all,
> 
> I have a jar that uses the Hadoop API to launch various remote mapreduce jobs (ie, im not using the command-line to initiate the job). The service jar that executes the various jobs is built with maven's "jar-with-dependencies".
> 
> My jobs all run fine except one that uses commons-codec 1.7, I get:
> 
> FATAL org.apache.hadoop.mapred.Child: Error running child : java.lang.NoSuchMethodError: org.apache.commons.codec.binary.Base64.encodeAsString([B)Ljava/lang/String;
> 
> I think this is because my jar is including commons-codec 1.7 whereas my Hadoop install's lib has commons-codec 1.4 ...
> 
> Is their any way to instruct Hadoop to use the distributed commons-codec 1.7 (I assume this is distributed as a job dependency) rather than the commons-codec 1.4 in the hadoop 1.0.3 core lib?
> 
> Removing commons-codec-1.4.jar from my Hadoop library folder did seem to solve the problem for a bit, but is not working on another VM. Replacing the 1.4 jar with the 1.7 does seem to fix the problem but this doesn't seem too sane. Hopefully there is a better alternative.
> 
> Thanks!
> 


Re: Lib conflicts

Posted by Harsh J <ha...@cloudera.com>.
Hi Ben,

As long as the switch of libraries doesn't impact the execution of the
Child task code itself, for Apache Hadoop 1.x, using the config
"mapreduce.user.classpath.first" set to true may solve your trouble.

On Wed, Oct 3, 2012 at 4:51 PM, Ben Rycroft <br...@gmail.com> wrote:
> Hi all,
>
> I have a jar that uses the Hadoop API to launch various remote mapreduce
> jobs (ie, im not using the command-line to initiate the job). The service
> jar that executes the various jobs is built with maven's
> "jar-with-dependencies".
>
> My jobs all run fine except one that uses commons-codec 1.7, I get:
>
> FATAL org.apache.hadoop.mapred.Child: Error running child :
> java.lang.NoSuchMethodError:
> org.apache.commons.codec.binary.Base64.encodeAsString([B)Ljava/lang/String;
>
> I think this is because my jar is including commons-codec 1.7 whereas my
> Hadoop install's lib has commons-codec 1.4 ...
>
> Is their any way to instruct Hadoop to use the distributed commons-codec 1.7
> (I assume this is distributed as a job dependency) rather than the
> commons-codec 1.4 in the hadoop 1.0.3 core lib?
>
> Removing commons-codec-1.4.jar from my Hadoop library folder did seem to
> solve the problem for a bit, but is not working on another VM. Replacing the
> 1.4 jar with the 1.7 does seem to fix the problem but this doesn't seem too
> sane. Hopefully there is a better alternative.
>
> Thanks!



-- 
Harsh J

Re: Lib conflicts

Posted by Michael Segel <mi...@hotmail.com>.
Yup, I hate that when it happens. 

You tend to see this more with Avro than anything else. 

The issue is that in Java, the first class loaded wins. So when Hadoop loads 1.4 first, you can't unload it and replace it with 1.7. 

The only solution that we found to be workable is to replace the jars on all of the nodes in the cluster with the latest. 
The only drawback is if there are incompatibilities between releases. 

The other option is to roll back to the early version.  (Yeah I know. You need that method, which is why I made the first recommendation. ;-) 

HTH

-Mike

On Oct 3, 2012, at 6:21 AM, Ben Rycroft <br...@gmail.com> wrote:

> Hi all,
> 
> I have a jar that uses the Hadoop API to launch various remote mapreduce jobs (ie, im not using the command-line to initiate the job). The service jar that executes the various jobs is built with maven's "jar-with-dependencies".
> 
> My jobs all run fine except one that uses commons-codec 1.7, I get:
> 
> FATAL org.apache.hadoop.mapred.Child: Error running child : java.lang.NoSuchMethodError: org.apache.commons.codec.binary.Base64.encodeAsString([B)Ljava/lang/String;
> 
> I think this is because my jar is including commons-codec 1.7 whereas my Hadoop install's lib has commons-codec 1.4 ...
> 
> Is their any way to instruct Hadoop to use the distributed commons-codec 1.7 (I assume this is distributed as a job dependency) rather than the commons-codec 1.4 in the hadoop 1.0.3 core lib?
> 
> Removing commons-codec-1.4.jar from my Hadoop library folder did seem to solve the problem for a bit, but is not working on another VM. Replacing the 1.4 jar with the 1.7 does seem to fix the problem but this doesn't seem too sane. Hopefully there is a better alternative.
> 
> Thanks!
> 


Re: Lib conflicts

Posted by Steve Loughran <st...@hortonworks.com>.
On 3 October 2012 12:21, Ben Rycroft <br...@gmail.com> wrote:

> Hi all,
>
> I have a jar that uses the Hadoop API to launch various remote mapreduce
> jobs (ie, im not using the command-line to initiate the job). The service
> jar that executes the various jobs is built with maven's
> "jar-with-dependencies".
>
> My jobs all run fine except one that uses commons-codec 1.7, I get:
>
> FATAL org.apache.hadoop.mapred.Child: Error running child :
> java.lang.NoSuchMethodError:
> org.apache.commons.codec.binary.Base64.encodeAsString([B)Ljava/lang/String;
>
> I think this is because my jar is including commons-codec 1.7 whereas my
> Hadoop install's lib has commons-codec 1.4 ...
>
> Is their any way to instruct Hadoop to use the distributed commons-codec
> 1.7 (I assume this is distributed as a job dependency) rather than the
> commons-codec 1.4 in the hadoop 1.0.3 core lib?
>
>
>
Maven's urge to do recursive resolution doesn't help. Play with exclusion
to force the 1.4 out.

FWIW, Ivy can be configured to fail if there is any version conflict down
the tree, rather than apply a policy such as "latest wins" or "closest to
root wins"; I don't know if m2 has a similar option.
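To illustrate the exclusion approach, here is a sketch of the relevant pom.xml fragments. The commons-codec coordinates are real; hadoop-core is shown only as one plausible source of the transitive 1.4, so adjust to whatever `mvn dependency:tree` actually reports for your build:

```xml
<!-- In the job/service pom.xml: stop a transitive dependency from dragging in codec 1.4 -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-core</artifactId>
  <version>1.0.3</version>
  <exclusions>
    <exclusion>
      <groupId>commons-codec</groupId>
      <artifactId>commons-codec</artifactId>
    </exclusion>
  </exclusions>
</dependency>

<!-- Then declare the version you actually want, directly -->
<dependency>
  <groupId>commons-codec</groupId>
  <artifactId>commons-codec</artifactId>
  <version>1.7</version>
</dependency>
```

For the Ivy-style fail-on-conflict behaviour, Maven's Enforcer plugin has a dependencyConvergence rule that fails the build whenever two versions of the same artifact meet in the dependency tree.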

Re: Lib conflicts

Posted by Michael Segel <mi...@hotmail.com>.
Yup, I hate it when that happens. 

You tend to see this more with Avro than anything else. 

The issue is that in Java, the first version of a class that a classloader loads wins. So when Hadoop loads 1.4 first, you can't unload it and replace it with 1.7. 

The only solution that we found to be workable is to replace the jars on all of the nodes in the cluster with the latest. 
The only drawback is if there are incompatibilities between releases. 

The other option is to roll back to the earlier version. (Yeah, I know. You need that method, which is why I made the first recommendation. ;-) 
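A quick way to confirm which copy of a class actually won inside a JVM is to ask the class for its code source. This is a generic sketch using a stdlib class by default; inside a task you would pass org.apache.commons.codec.binary.Base64 instead (the WhichJar class name is just illustrative):

```java
// Prints the jar (code source) a class was loaded from, to diagnose classpath conflicts.
public class WhichJar {
    public static void main(String[] args) throws Exception {
        // Inside a map/reduce task, pass "org.apache.commons.codec.binary.Base64"
        String name = args.length > 0 ? args[0] : "java.lang.String";
        Class<?> c = Class.forName(name);
        java.security.CodeSource src = c.getProtectionDomain().getCodeSource();
        // Classes from the bootstrap classpath (like java.lang.String) report a null code source
        System.out.println(name + " loaded from: "
                + (src == null ? "<bootstrap classpath>" : src.getLocation()));
    }
}
```

Running this as part of the task (or in a small diagnostic job) shows exactly which jar supplied the conflicting class.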

HTH

-Mike

On Oct 3, 2012, at 6:21 AM, Ben Rycroft <br...@gmail.com> wrote:

> Hi all,
> 
> I have a jar that uses the Hadoop API to launch various remote mapreduce jobs (i.e., I'm not using the command-line to initiate the job). The service jar that executes the various jobs is built with Maven's "jar-with-dependencies".
> 
> My jobs all run fine except one that uses commons-codec 1.7, I get:
> 
> FATAL org.apache.hadoop.mapred.Child: Error running child : java.lang.NoSuchMethodError: org.apache.commons.codec.binary.Base64.encodeAsString([B)Ljava/lang/String;
> 
> I think this is because my jar is including commons-codec 1.7 whereas my Hadoop install's lib has commons-codec 1.4 ...
> 
> Is there any way to instruct Hadoop to use the distributed commons-codec 1.7 (I assume this is distributed as a job dependency) rather than the commons-codec 1.4 in the Hadoop 1.0.3 core lib?
> 
> Removing commons-codec-1.4.jar from my Hadoop library folder did seem to solve the problem for a bit, but it is not working on another VM. Replacing the 1.4 jar with the 1.7 does seem to fix the problem, but this doesn't seem too sane. Hopefully there is a better alternative.
> 
> Thanks!
> 


Re: Lib conflicts

Posted by Harsh J <ha...@cloudera.com>.
Hi Ben,

As long as the switch of libraries doesn't break the execution of the
Child task code itself, then for Apache Hadoop 1.x, setting the config
property "mapreduce.user.classpath.first" to true may solve your trouble.
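Since the job is submitted from Java rather than the command line, that property can be set on the job's Configuration before submission. A minimal sketch against the Hadoop 1.x API (treat it as a configuration fragment; the job name and MyDriver class are illustrative placeholders):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

Configuration conf = new Configuration();
// Ask the child task JVM to put the job's jars ahead of $HADOOP_HOME/lib
conf.setBoolean("mapreduce.user.classpath.first", true);

Job job = new Job(conf, "codec-1.7-job");  // Hadoop 1.x Job constructor
job.setJarByClass(MyDriver.class);         // MyDriver: placeholder for the real driver class
job.waitForCompletion(true);
```

The same property can also be set cluster-wide in mapred-site.xml, or per invocation with -Dmapreduce.user.classpath.first=true when the driver goes through ToolRunner.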

On Wed, Oct 3, 2012 at 4:51 PM, Ben Rycroft <br...@gmail.com> wrote:
> Hi all,
>
> I have a jar that uses the Hadoop API to launch various remote mapreduce
> jobs (i.e., I'm not using the command-line to initiate the job). The service
> jar that executes the various jobs is built with Maven's
> "jar-with-dependencies".
>
> My jobs all run fine except one that uses commons-codec 1.7, I get:
>
> FATAL org.apache.hadoop.mapred.Child: Error running child :
> java.lang.NoSuchMethodError:
> org.apache.commons.codec.binary.Base64.encodeAsString([B)Ljava/lang/String;
>
> I think this is because my jar is including commons-codec 1.7 whereas my
> Hadoop install's lib has commons-codec 1.4 ...
>
> Is there any way to instruct Hadoop to use the distributed commons-codec 1.7
> (I assume this is distributed as a job dependency) rather than the
> commons-codec 1.4 in the Hadoop 1.0.3 core lib?
>
> Removing commons-codec-1.4.jar from my Hadoop library folder did seem to
> solve the problem for a bit, but it is not working on another VM. Replacing the
> 1.4 jar with the 1.7 does seem to fix the problem, but this doesn't seem too
> sane. Hopefully there is a better alternative.
>
> Thanks!



-- 
Harsh J