Posted to user@hadoop.apache.org by Albert Shau <al...@continuuity.com> on 2013/10/11 00:27:04 UTC

Conflicting dependency versions

Hi,

I have a YARN application that launches a MapReduce job whose mapper
uses a newer version of Guava than the one Hadoop is using. Because
of this, the mapper fails with a NoSuchMethodError. Is there a way to
indicate that application dependencies should take precedence over
Hadoop's dependencies?

Thanks,
Albert

Re: Conflicting dependency versions

Posted by Hitesh Shah <hi...@apache.org>.
Hi Albert, 

If you are using the distributed cache to push the newer Guava jars, you can try setting "mapreduce.job.user.classpath.first" to true. If not, you can try overriding the value of mapreduce.application.classpath so that the directory containing the newer Guava jars is referenced first on the classpath.
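As a sketch, the two settings above could look like this in a per-job configuration or mapred-site.xml. The property names are standard Hadoop 2.x MapReduce settings, but the /path/to/newer-guava directory below is a placeholder for illustration, not a path from this thread:

```xml
<!-- Prefer jars the job ships (e.g. via the distributed cache)
     over Hadoop's own copies on the task classpath. -->
<property>
  <name>mapreduce.job.user.classpath.first</name>
  <value>true</value>
</property>

<!-- Alternative: prepend the directory holding the newer Guava jars,
     followed by the stock Hadoop 2.x default classpath entries. -->
<property>
  <name>mapreduce.application.classpath</name>
  <value>/path/to/newer-guava/*,$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/*,$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/lib/*</value>
</property>
```

If the job driver goes through ToolRunner/GenericOptionsParser, the first setting can also be passed per run on the command line as -D mapreduce.job.user.classpath.first=true.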

-- Hitesh

On Oct 10, 2013, at 3:27 PM, Albert Shau wrote:

> Hi,
> 
> I have a YARN application that launches a MapReduce job whose mapper uses a newer version of Guava than the one Hadoop is using. Because of this, the mapper fails with a NoSuchMethodError. Is there a way to indicate that application dependencies should take precedence over Hadoop's dependencies?
> 
> Thanks,
> Albert

