Posted to common-user@hadoop.apache.org by Sultan Alamro <su...@gmail.com> on 2015/12/03 01:05:53 UTC
Running multiple copies of each task
Hi there,
I have been looking at the Hadoop 2.6.0 source code, trying to understand
the low-level details of how the framework actually works.
I have a simple idea and I am trying to figure out where and how the idea
can be implemented. The idea can be described in one sentence: "Running
multiple copies of each task". However, implementing it is not as
simple as I thought.
As far as I can tell, I only need to modify a few classes. But which
classes?
I just need someone to point me in the right direction.
Best,
Sultan
Re: Running multiple copies of each task
Posted by Namikaze Minato <ll...@gmail.com>.
Talking about priority without telling us which scheduler and which
scheduling method you are using is like forgetting to say which unit you
are measuring something in :)
Regards,
LLoyd
On 17 December 2015 at 20:09, Sultan Alamro <su...@gmail.com> wrote:
> Thanks Namikaze!
>
> Another question:
>
> New tasks in Hadoop always have higher priority than speculative tasks.
> Does anyone know how and where I can change this priority?
>
>
> Thanks,
> Sultan
>
>
> On Thu, Dec 3, 2015 at 7:46 AM, Namikaze Minato <ll...@gmail.com>
> wrote:
>
>> I think you are looking for mapreduce.reduce.speculative
>> Be careful, for some reason, this fell into my spam folder.
>>
>> Regards,
>> LLoyd
>>
>> On 3 December 2015 at 01:05, Sultan Alamro <su...@gmail.com>
>> wrote:
>> > Hi there,
>> >
>> > I have been looking at the Hadoop 2.6.0 source code, trying to
>> > understand the low-level details of how the framework actually works.
>> >
>> > I have a simple idea and I am trying to figure out where and how the
>> > idea can be implemented. The idea can be described in one sentence:
>> > "Running multiple copies of each task". However, implementing it is
>> > not as simple as I thought.
>> >
>> > As far as I can tell, I only need to modify a few classes. But which
>> > classes?
>> >
>> > I just need someone to point me in the right direction.
>> >
>> >
>> > Best,
>> > Sultan
>>
>
>
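[Editor's note: LLoyd's point is that the answer depends on which scheduler the cluster runs. The active scheduler is chosen in yarn-site.xml; the fragment below is a minimal sketch, assuming the 2.x line, where CapacityScheduler is the default.]

```xml
<!-- yarn-site.xml: select the YARN scheduler LLoyd is asking about.
     CapacityScheduler is the 2.x default; FairScheduler and FifoScheduler
     are the other stock choices. -->
<property>
  <name>yarn.resourcemanager.scheduler.class</name>
  <value>org.apache.hadoop.yarn.server.resourcemanager.scheduler.capacity.CapacityScheduler</value>
</property>
```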
Re: Running multiple copies of each task
Posted by Sultan Alamro <su...@gmail.com>.
Thanks Namikaze!
Another question:
New tasks in Hadoop always have higher priority than speculative tasks.
Does anyone know how and where I can change this priority?
Thanks,
Sultan
On Thu, Dec 3, 2015 at 7:46 AM, Namikaze Minato <ll...@gmail.com>
wrote:
> I think you are looking for mapreduce.reduce.speculative
> Be careful, for some reason, this fell into my spam folder.
>
> Regards,
> LLoyd
>
> On 3 December 2015 at 01:05, Sultan Alamro <su...@gmail.com>
> wrote:
> > Hi there,
> >
> > I have been looking at the Hadoop 2.6.0 source code, trying to
> > understand the low-level details of how the framework actually works.
> >
> > I have a simple idea and I am trying to figure out where and how the
> > idea can be implemented. The idea can be described in one sentence:
> > "Running multiple copies of each task". However, implementing it is
> > not as simple as I thought.
> >
> > As far as I can tell, I only need to modify a few classes. But which
> > classes?
> >
> > I just need someone to point me in the right direction.
> >
> >
> > Best,
> > Sultan
>
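[Editor's note: some background on the priority question. YARN serves outstanding container requests in ascending priority value (a lower number is scheduled first), and in the MapReduce ApplicationMaster those values are hard-coded constants (in RMContainerAllocator in the 2.x source tree), so changing the relative priority of speculative attempts means patching that class. The constant names below mirror that class, but the specific values are illustrative assumptions, not verified against 2.6.0; the ordering rule is the point.]

```python
# Sketch of YARN's ordering rule: container requests are served in
# ascending Priority value, so a lower number wins. Constant names
# mirror Hadoop's RMContainerAllocator; the values are illustrative.
PRIORITY_FAST_FAIL_MAP = 5   # retries of maps that already failed
PRIORITY_REDUCE = 10         # reduce attempts
PRIORITY_MAP = 20            # normal (and speculative) map attempts

requests = [
    ("speculative-map", PRIORITY_MAP),
    ("reduce", PRIORITY_REDUCE),
    ("failed-map-retry", PRIORITY_FAST_FAIL_MAP),
]
# Sort by priority value to see the order the scheduler would serve them.
served = [name for name, prio in sorted(requests, key=lambda r: r[1])]
print(served)
```

Under this rule, making speculative attempts outrank fresh tasks is a matter of giving them a smaller priority value than the constants above, which is AM-side source surgery rather than a configuration knob.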
Re: Running multiple copies of each task
Posted by Namikaze Minato <ll...@gmail.com>.
I think you are looking for mapreduce.reduce.speculative
Be careful, for some reason, this fell into my spam folder.
Regards,
LLoyd
On 3 December 2015 at 01:05, Sultan Alamro <su...@gmail.com> wrote:
> Hi there,
>
> I have been looking at the Hadoop 2.6.0 source code, trying to understand
> the low-level details of how the framework actually works.
>
> I have a simple idea and I am trying to figure out where and how the idea
> can be implemented. The idea can be described in one sentence: "Running
> multiple copies of each task". However, implementing it is not as
> simple as I thought.
>
> As far as I can tell, I only need to modify a few classes. But which
> classes?
>
> I just need someone to point me in the right direction.
>
>
> Best,
> Sultan
---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@hadoop.apache.org
For additional commands, e-mail: user-help@hadoop.apache.org
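[Editor's note: the switch LLoyd points to, and its map-side counterpart, live in mapred-site.xml or can be set per job. To the best of my knowledge both default to true in the 2.x line; the fragment below sets them explicitly.]

```xml
<!-- mapred-site.xml: speculative-execution switches.
     Believed to default to true in Hadoop 2.x; shown for explicitness. -->
<property>
  <name>mapreduce.map.speculative</name>
  <value>true</value>
</property>
<property>
  <name>mapreduce.reduce.speculative</name>
  <value>true</value>
</property>
```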