Posted to user@hadoop.apache.org by Jonathan Aquilina <ja...@eagleeyet.net> on 2015/02/20 06:40:21 UTC

writing mappers and reducers question

 

Hey guys, is it safe to assume that one needs a single-node setup to
be able to write mappers and reducers for Hadoop?

-- 
Regards,
Jonathan Aquilina
Founder Eagle Eye T
 

Re: writing mappers and reducers question

Posted by Drake민영근 <dr...@nexr.com>.
I suggest Standalone mode for developing a mapper or reducer. But for a
partitioner or combiner, you need to set up Pseudo-Distributed mode.

Drake 민영근 Ph.D
kt NexR

On Fri, Feb 20, 2015 at 3:18 PM, unmesha sreeveni <un...@gmail.com>
wrote:

> You can also write MapReduce jobs in Eclipse for testing purposes. Once
> done, you can create a jar and run it on your single-node or multi-node
> cluster. But please note that when running in an IDE with the Hadoop
> dependencies, there will be no input splits, multiple mappers, etc.
>
>
>
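To make the standalone-mode suggestion concrete: with Hadoop Streaming, the mapper and reducer are plain scripts reading stdin and writing tab-separated key/value lines, so the same logic can be exercised locally before any cluster exists. A minimal word-count sketch (hypothetical script, not from the thread; the in-process `sorted()` call stands in for Hadoop's shuffle):

```python
#!/usr/bin/env python3
# Hypothetical Hadoop Streaming-style word count, runnable locally.
from itertools import groupby

def mapper(lines):
    # Emit "word<TAB>1" for every word, as Streaming expects.
    for line in lines:
        for word in line.split():
            yield f"{word}\t1"

def reducer(lines):
    # Input arrives sorted by key; sum the counts per word.
    pairs = (line.rstrip("\n").split("\t") for line in lines)
    for word, group in groupby(pairs, key=lambda kv: kv[0]):
        yield f"{word}\t{sum(int(v) for _, v in group)}"

if __name__ == "__main__":
    # Simulate `cat input | mapper | sort | reducer` in one process.
    text = ["the cat sat", "the cat"]
    for line in reducer(sorted(mapper(text))):
        print(line)  # cat 2, sat 1, the 2
```

On a real cluster the two halves would run as separate `mapper.py`/`reducer.py` scripts passed to the Streaming jar; the framework supplies the sort between them.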

Re: writing mappers and reducers question

Posted by unmesha sreeveni <un...@gmail.com>.
You can also write MapReduce jobs in Eclipse for testing purposes. Once
done, you can create a jar and run it on your single-node or multi-node
cluster. But please note that when running in an IDE with the Hadoop
dependencies, there will be no input splits, multiple mappers, etc.
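The "no input splits, multiple mappers" point is exactly why in-IDE testing works: with a single mapper, the map and reduce logic can be driven as ordinary functions. A toy harness sketch (hypothetical names, plain Python rather than the Hadoop Java API) that groups mapper output by key and feeds each group to the reducer, ignoring splits, partitioning, and shuffle:

```python
# Hypothetical local harness for unit-testing map/reduce logic.
from collections import defaultdict

def map_fn(record):
    # Example logic: emit (word length, word) pairs.
    for word in record.split():
        yield (len(word), word)

def reduce_fn(key, values):
    # Example logic: count the values seen for each key.
    return len(values)

def run_local(map_fn, reduce_fn, records):
    # Group mapper output by key, then reduce each group.
    groups = defaultdict(list)
    for record in records:
        for key, value in map_fn(record):
            groups[key].append(value)
    return {key: reduce_fn(key, values)
            for key, values in sorted(groups.items())}

print(run_local(map_fn, reduce_fn, ["a bb a ccc"]))  # {1: 2, 2: 1, 3: 1}
```

Once the per-key logic passes tests like this, packaging it as a jar and running on a single-node or multi-node cluster only changes how the framework splits and routes the data, not what the functions compute.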

Re: writing mappers and reducers question

Posted by Shahab Yunus <sh...@gmail.com>.
Nope. You can use the Standalone setup too to test things. Details here:
http://hadoop.apache.org/docs/r2.2.0/hadoop-project-dist/hadoop-common/SingleNodeSetup.html#Standalone_Operation

Regards,
Shahab

On Fri, Feb 20, 2015 at 12:40 AM, Jonathan Aquilina <jaquilina@eagleeyet.net
> wrote:

>  Hey guys, is it safe to assume that one needs a single-node setup to
> be able to write mappers and reducers for Hadoop?
>
>
>
> --
> Regards,
> Jonathan Aquilina
> Founder Eagle Eye T
>
>
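For reference, the linked Standalone Operation page runs one of the bundled example jobs as a single local Java process against local files, roughly as below (a setup sketch only; paths vary by Hadoop version and install layout, and the commands assume an unpacked Hadoop distribution as the working directory):

```shell
# Standalone mode: no daemons, local filesystem, a single JVM.
mkdir input
cp etc/hadoop/*.xml input

# Run the bundled grep example over the local input directory.
bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-*.jar \
    grep input output 'dfs[a-z.]+'

cat output/*
```

No cluster configuration is needed for this; it is the quickest way to run a complete MapReduce job while developing.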
