Posted to common-user@hadoop.apache.org by mallik arjun <ma...@gmail.com> on 2012/09/03 18:19:44 UTC

how to execute different tasks on data nodes(simultaneously in hadoop).

Generally in Hadoop, the map function is executed by all the data nodes
on the input data set. In contrast to this, how can I do the following:
I have several filter programs, and what I want is for each data node
(slave) to execute one filter algorithm simultaneously, different from
what the other data nodes execute.

thanks in advance.

Re: how to execute different tasks on data nodes(simultaneously in hadoop).

Posted by Narasingu Ramesh <ra...@gmail.com>.
Hi Users,
              Hadoop distributes the data across HDFS, and the MapReduce
tasks work on it together. The framework keeps track of which task goes to
which data node: each task runs in its own JVM on that node, and those JVMs
handle the large volume of data being processed across all the data nodes,
which simplifies each individual task.
Thanks & Regards,
Ramesh.Narasingu

On Mon, Sep 3, 2012 at 11:11 PM, Bertrand Dechoux <de...@gmail.com> wrote:

> Hi,
>
> Assuming you have to compute these value for every RGB pixel.
> Why couldn't you compute all these values at the same time on the same
> node?
>
> Hadoop let you distribute your computation but it doesn't mean each node
> has to compute only a part of the equations.
> Each node can compute all equations but for a 'small' part of the data.
> That's Hadoop strategy. That way, sequential read and data locality will
> improve your performances.
>
> Regards
>
> Bertrand
>
>
> On Mon, Sep 3, 2012 at 6:35 PM, mallik arjun <ma...@gmail.com>wrote:
>
>> [image: Inline image 1]
>>
>> On Mon, Sep 3, 2012 at 10:01 PM, Bertrand Dechoux <de...@gmail.com>wrote:
>>
>>> You can check the value of "map.input.file" in order to apply a
>>> different logic for each type of files (in the mapper).
>>> More information about your problem/context would help the readers to
>>> provide a more extensive reply.
>>>
>>> Regards
>>>
>>> Bertrand
>>>
>>> each data node has to process one equation of above simultaneously.
>>
>>
>>> On Mon, Sep 3, 2012 at 6:25 PM, Michael Segel <michael_segel@hotmail.com
>>> > wrote:
>>>
>>>> Not sure what you are trying to do...
>>>>
>>>> You want to pass through the entire data set on all nodes where each
>>>> node runs a single filter?
>>>>
>>>> You're thinking is orthogonal to how Hadoop works.
>>>>
>>>> You would be better off letting each node work on a portion of the data
>>>> which is local to that node running the entire filter set.
>>>>
>>>>
>>>> On Sep 3, 2012, at 11:19 AM, mallik arjun <ma...@gmail.com>
>>>> wrote:
>>>>
>>>> > genrally in hadoop map function will be exeucted by all the data
>>>> nodes on the input data set ,against this how can i do the following.
>>>> > i have some filter programs , and what i want to do is each data
>>>> node(slave) has to execute one filter alogrithm  simultaneously, diffent
>>>> from other data nodes executions.
>>>> >
>>>> > thanks in advance.
>>>> >
>>>> >
>>>>
>>>>
>>>
>>>
>>> --
>>> Bertrand Dechoux
>>>
>>
>>
>
>
> --
> Bertrand Dechoux
>

Re: how to execute different tasks on data nodes(simultaneously in hadoop).

Posted by Bertrand Dechoux <de...@gmail.com>.
Hi,

Assuming you have to compute these values for every RGB pixel, why
couldn't you compute all of them at the same time on the same node?

Hadoop lets you distribute your computation, but that doesn't mean each
node has to compute only a part of the equations. Each node can compute
all the equations, but for a 'small' part of the data. That is Hadoop's
strategy: that way, sequential reads and data locality will improve your
performance.

Regards

Bertrand

On Mon, Sep 3, 2012 at 6:35 PM, mallik arjun <ma...@gmail.com> wrote:

> [image: Inline image 1]
>
> On Mon, Sep 3, 2012 at 10:01 PM, Bertrand Dechoux <de...@gmail.com>wrote:
>
>> You can check the value of "map.input.file" in order to apply a different
>> logic for each type of files (in the mapper).
>> More information about your problem/context would help the readers to
>> provide a more extensive reply.
>>
>> Regards
>>
>> Bertrand
>>
>> each data node has to process one equation of above simultaneously.
>
>
>> On Mon, Sep 3, 2012 at 6:25 PM, Michael Segel <mi...@hotmail.com>wrote:
>>
>>> Not sure what you are trying to do...
>>>
>>> You want to pass through the entire data set on all nodes where each
>>> node runs a single filter?
>>>
>>> You're thinking is orthogonal to how Hadoop works.
>>>
>>> You would be better off letting each node work on a portion of the data
>>> which is local to that node running the entire filter set.
>>>
>>>
>>> On Sep 3, 2012, at 11:19 AM, mallik arjun <ma...@gmail.com>
>>> wrote:
>>>
>>> > genrally in hadoop map function will be exeucted by all the data nodes
>>> on the input data set ,against this how can i do the following.
>>> > i have some filter programs , and what i want to do is each data
>>> node(slave) has to execute one filter alogrithm  simultaneously, diffent
>>> from other data nodes executions.
>>> >
>>> > thanks in advance.
>>> >
>>> >
>>>
>>>
>>
>>
>> --
>> Bertrand Dechoux
>>
>
>


-- 
Bertrand Dechoux
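
[Editor's note: Bertrand's point -- every mapper runs the full set of
computations, but only over its own local slice of the data -- can be
sketched as a Hadoop Streaming-style mapper in Python. The per-pixel
equations below are hypothetical stand-ins, since the original filters
were never posted to the list.]

```python
import sys

# Hypothetical stand-ins for the poster's per-pixel equations.
def luminance(r, g, b):
    return 0.299 * r + 0.587 * g + 0.114 * b

def average(r, g, b):
    return (r + g + b) / 3.0

# Every mapper applies ALL of the equations -- the parallelism comes
# from each mapper seeing only its own input split, not from assigning
# one equation per node.
EQUATIONS = [luminance, average]

def run_mapper(lines):
    """Apply every equation to each record (one 'r g b' pixel per line)."""
    results = []
    for line in lines:
        r, g, b = (float(x) for x in line.split())
        results.append([eq(r, g, b) for eq in EQUATIONS])
    return results

if __name__ == "__main__":
    # In a streaming job, Hadoop feeds this mapper only its local split.
    for values in run_mapper(sys.stdin):
        print("\t".join(str(v) for v in values))
```

Run with something like `hadoop jar hadoop-streaming.jar -mapper mapper.py ...`; each node then executes the same full filter set, but against the blocks stored locally.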

Re: how to execute different tasks on data nodes(simultaneously in hadoop).

Posted by mallik arjun <ma...@gmail.com>.
[image: Inline image 1]

On Mon, Sep 3, 2012 at 10:01 PM, Bertrand Dechoux <de...@gmail.com> wrote:

> You can check the value of "map.input.file" in order to apply a different
> logic for each type of files (in the mapper).
> More information about your problem/context would help the readers to
> provide a more extensive reply.
>
> Regards
>
> Bertrand
>
> each data node has to process one equation of above simultaneously.


> On Mon, Sep 3, 2012 at 6:25 PM, Michael Segel <mi...@hotmail.com>wrote:
>
>> Not sure what you are trying to do...
>>
>> You want to pass through the entire data set on all nodes where each node
>> runs a single filter?
>>
>> You're thinking is orthogonal to how Hadoop works.
>>
>> You would be better off letting each node work on a portion of the data
>> which is local to that node running the entire filter set.
>>
>>
>> On Sep 3, 2012, at 11:19 AM, mallik arjun <ma...@gmail.com> wrote:
>>
>> > genrally in hadoop map function will be exeucted by all the data nodes
>> on the input data set ,against this how can i do the following.
>> > i have some filter programs , and what i want to do is each data
>> node(slave) has to execute one filter alogrithm  simultaneously, diffent
>> from other data nodes executions.
>> >
>> > thanks in advance.
>> >
>> >
>>
>>
>
>
> --
> Bertrand Dechoux
>

Re: how to execute different tasks on data nodes(simultaneously in hadoop).

Posted by Bertrand Dechoux <de...@gmail.com>.
You can check the value of "map.input.file" in order to apply different
logic for each type of file (in the mapper).
More information about your problem/context would help readers provide a
more extensive reply.

Regards

Bertrand

On Mon, Sep 3, 2012 at 6:25 PM, Michael Segel <mi...@hotmail.com> wrote:

> Not sure what you are trying to do...
>
> You want to pass through the entire data set on all nodes where each node
> runs a single filter?
>
> You're thinking is orthogonal to how Hadoop works.
>
> You would be better off letting each node work on a portion of the data
> which is local to that node running the entire filter set.
>
>
> On Sep 3, 2012, at 11:19 AM, mallik arjun <ma...@gmail.com> wrote:
>
> > genrally in hadoop map function will be exeucted by all the data nodes
> on the input data set ,against this how can i do the following.
> > i have some filter programs , and what i want to do is each data
> node(slave) has to execute one filter alogrithm  simultaneously, diffent
> from other data nodes executions.
> >
> > thanks in advance.
> >
> >
>
>


-- 
Bertrand Dechoux
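
[Editor's note: a minimal sketch of Bertrand's "map.input.file" suggestion
as a Hadoop Streaming mapper in Python. The filter functions and file-name
patterns are hypothetical; in streaming jobs the current split's file name
is exposed to the mapper through the `map_input_file` environment variable
(`mapreduce_map_input_file` in newer releases) rather than read from the
JobConf directly.]

```python
import os

# Hypothetical filters standing in for the poster's filter programs.
def filter_a(line):
    return line.upper()

def filter_b(line):
    return line.lower()

# Assumed file-name pattern -> filter mapping.
FILTERS = {"setA": filter_a, "setB": filter_b}

def pick_filter(input_file):
    """Select the filter for this map task from its input file path."""
    for pattern, fn in FILTERS.items():
        if pattern in input_file:
            return fn
    return filter_a  # default when no pattern matches

def run_mapper(lines, input_file=None):
    # Hadoop Streaming sets map_input_file for each task; allow an
    # explicit override so the logic can be exercised locally.
    if input_file is None:
        input_file = os.environ.get("map_input_file", "")
    chosen = pick_filter(input_file)
    return [chosen(line) for line in lines]
```

Note that this varies the logic per input *file*, not per node: a task processing a split of `setA` applies `filter_a` wherever it happens to be scheduled, which is how Hadoop expects such variation to be expressed.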

Re: how to execute different tasks on data nodes(simultaneously in hadoop).

Posted by Michael Segel <mi...@hotmail.com>.
Not sure what you are trying to do...

You want to pass through the entire data set on all nodes where each node runs a single filter? 

Your thinking is orthogonal to how Hadoop works. 

You would be better off letting each node work on a portion of the data which is local to that node running the entire filter set. 


On Sep 3, 2012, at 11:19 AM, mallik arjun <ma...@gmail.com> wrote:

> genrally in hadoop map function will be exeucted by all the data nodes on the input data set ,against this how can i do the following.
> i have some filter programs , and what i want to do is each data node(slave) has to execute one filter alogrithm  simultaneously, diffent from other data nodes executions.
> 
> thanks in advance.
> 
> 

