Posted to mapreduce-user@hadoop.apache.org by Jonathan Poon <jk...@ucdavis.edu> on 2014/03/27 00:34:29 UTC

Hadoop 2.2.0 Distributed Cache

Hi Everyone,

I'm submitting a MapReduce job using the -files option to copy a text file
that contains properties I use for the map and reduce functions.

I'm trying to obtain the local cache files in my mapper function using:

Path[] paths = context.getLocalCacheFiles();

However, I get an error saying getLocalCacheFiles() is undefined.  I've
imported the hadoop-mapreduce-client-core-2.2.0.jar as part of my build
environment in Eclipse.

Any ideas on what could be incorrect?

If I'm incorrectly using the distributed cache, could someone point me to
an example using the distributed cache with Hadoop 2.2.0?

Thanks for your help!

Jonathan
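
[For the -files generic option to take effect, the job has to be submitted through a driver that runs its arguments through GenericOptionsParser, which is what the Tool/ToolRunner pattern does. A minimal sketch of such a driver, assuming a hypothetical ConfigDrivenJob class and jar name:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class ConfigDrivenJob extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        // getConf() carries the settings GenericOptionsParser filled in,
        // including the cache-file entries created by -files.
        Job job = Job.getInstance(getConf(), "config-driven job");
        job.setJarByClass(ConfigDrivenJob.class);
        // Mapper/reducer classes and output types would be set here as usual.
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        // ToolRunner strips the generic options (-files, -libjars, -D ...)
        // before handing the remaining args to run().
        System.exit(ToolRunner.run(new Configuration(), new ConfigDrivenJob(), args));
    }
}

Submission would then look something like

hadoop jar myjob.jar ConfigDrivenJob -files config.txt /input /output

with config.txt standing in for the properties file.]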

Re: Hadoop 2.2.0 Distributed Cache

Posted by Azuryy <az...@gmail.com>.
-files is used by Hive, not MR,
so it cannot be recognized by your MR job.

Sent from my iPhone5s

> On Mar 28, 2014, at 2:31, Jonathan Poon <jk...@ucdavis.edu> wrote:
> 
> Hi Serge,
> 
> I'm using the -files option through the Hadoop CLI.
> 
> The following lines of code work:
> 
> Path[] localPaths = context.getLocalCacheFiles();
> String configFilename = localPaths[0].toString();
> 
> However, context.getLocalCacheFiles() is deprecated.  What is the correct equivalent function in the 2.2.0 API?
> 
> Jonathan
> 
> 
>> On Thu, Mar 27, 2014 at 11:17 AM, Serge Blazhievsky <ha...@gmail.com> wrote:
>> How are you putting files in the distributed cache?
>> 
>> Sent from my iPhone
>> 
>>> On Mar 27, 2014, at 9:20 AM, Jonathan Poon <jk...@ucdavis.edu> wrote:
>>> 
>>> 
>>> Hi Stanley,
>>> 
>>> Sorry about the confusion, but I'm trying to read a text file into my Mapper function.  I am trying to copy the file using the -files option when submitting the Hadoop job.
>>> 
>>> I try to obtain the filename using the following lines of code in my Mapper:
>>> 
>>> URI[] localPaths = context.getCacheFiles();
>>> String configFilename = localPaths[0].toString();
>>> 
>>> However, when I run the JAR in hadoop, I get a NullPointerException.  
>>> 
>>> Error: java.lang.NullPointerException
>>> 
>>> I'm running Hadoop 2.2 in single-node mode.  Not sure if that affects things...
>>> 
>>> 
>>> 
>>> 
>>>> On Wed, Mar 26, 2014 at 8:21 PM, Stanley Shi <ss...@gopivotal.com> wrote:
>>>> Where did you get the error? From the compiler or the runtime?
>>>> 
>>>> Regards,
>>>> Stanley Shi,
>>>> 
>>>> 
>>>> 
>>>>> On Thu, Mar 27, 2014 at 7:34 AM, Jonathan Poon <jk...@ucdavis.edu> wrote:
>>>>> Hi Everyone,
>>>>> 
>>>>> I'm submitting a MapReduce job using the -files option to copy a text file that contains properties I use for the map and reduce functions.  
>>>>> 
>>>>> I'm trying to obtain the local cache files in my mapper function using:
>>>>> 
>>>>> Path[] paths = context.getLocalCacheFiles();
>>>>> 
>>>>> However, I get an error saying getLocalCacheFiles() is undefined.  I've imported the hadoop-mapreduce-client-core-2.2.0.jar as part of my build environment in Eclipse.
>>>>> 
>>>>> Any ideas on what could be incorrect?  
>>>>> 
>>>>> If I'm incorrectly using the distributed cache, could someone point me to an example using the distributed cache with Hadoop 2.2.0?  
>>>>> 
>>>>> Thanks for your help!
>>>>> 
>>>>> Jonathan 
> 

Re: Hadoop 2.2.0 Distributed Cache

Posted by Jonathan Poon <jk...@ucdavis.edu>.
Hi Serge,

I'm using the -files option through the Hadoop CLI.

The following lines of code work:

Path[] localPaths = context.getLocalCacheFiles();
String configFilename = localPaths[0].toString();

However, context.getLocalCacheFiles() is deprecated.  What is the correct
equivalent function in the 2.2.0 API?

Jonathan


On Thu, Mar 27, 2014 at 11:17 AM, Serge Blazhievsky <ha...@gmail.com> wrote:

> How are you putting files in the distributed cache?
>
> Sent from my iPhone
>
> On Mar 27, 2014, at 9:20 AM, Jonathan Poon <jk...@ucdavis.edu> wrote:
>
>
> Hi Stanley,
>
> Sorry about the confusion, but I'm trying to read a text file into my
> Mapper function.  I am trying to copy the file using the -files option when
> submitting the Hadoop job.
>
> I try to obtain the filename using the following lines of code in my
> Mapper:
>
> URI[] localPaths = context.getCacheFiles();
> String configFilename = localPaths[0].toString();
>
> However, when I run the JAR in hadoop, I get a NullPointerException.
>
> Error: java.lang.NullPointerException
>
> I'm running Hadoop 2.2 in single-node mode.  Not sure if that affects
> things...
>
>
>
>
> On Wed, Mar 26, 2014 at 8:21 PM, Stanley Shi <ss...@gopivotal.com> wrote:
>
>> Where did you get the error? From the compiler or the runtime?
>>
>> Regards,
>> Stanley Shi,
>>
>>
>>
>> On Thu, Mar 27, 2014 at 7:34 AM, Jonathan Poon <jk...@ucdavis.edu> wrote:
>>
>>> Hi Everyone,
>>>
>>> I'm submitting a MapReduce job using the -files option to copy a text
>>> file that contains properties I use for the map and reduce functions.
>>>
>>> I'm trying to obtain the local cache files in my mapper function using:
>>>
>>> Path[] paths = context.getLocalCacheFiles();
>>>
>>> However, I get an error saying getLocalCacheFiles() is undefined.  I've
>>> imported the hadoop-mapreduce-client-core-2.2.0.jar as part of my build
>>> environment in Eclipse.
>>>
>>> Any ideas on what could be incorrect?
>>>
>>> If I'm incorrectly using the distributed cache, could someone point me
>>> to an example using the distributed cache with Hadoop 2.2.0?
>>>
>>> Thanks for your help!
>>>
>>> Jonathan
>>>
>>
>>
>
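
[On the deprecation question: in the 2.x mapreduce API the non-deprecated accessor is JobContext.getCacheFiles(), which returns the registered cache entries as a URI[] rather than local Paths. Files shipped with -files are also symlinked into each task's working directory under their base names, so they can simply be opened by name. A minimal mapper sketch under those assumptions; config.txt and the key/value types are illustrative:

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.net.URI;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class ConfigReadingMapper
        extends Mapper<LongWritable, Text, Text, NullWritable> {

    @Override
    protected void setup(Context context)
            throws IOException, InterruptedException {
        // Non-deprecated replacement for getLocalCacheFiles() in 2.x.
        URI[] cacheFiles = context.getCacheFiles();
        if (cacheFiles != null && cacheFiles.length > 0) {
            // The -files entry is symlinked into the task's working
            // directory, so it can be read by its base name.
            BufferedReader reader = new BufferedReader(new FileReader("config.txt"));
            try {
                String line;
                while ((line = reader.readLine()) != null) {
                    // parse the properties line here
                }
            } finally {
                reader.close();
            }
        }
    }

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        context.write(value, NullWritable.get());
    }
}]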

Re: Hadoop 2.2.0 Distributed Cache

Posted by Serge Blazhievsky <ha...@gmail.com>.
How are you putting files in the distributed cache?

Sent from my iPhone

> On Mar 27, 2014, at 9:20 AM, Jonathan Poon <jk...@ucdavis.edu> wrote:
> 
> 
> Hi Stanley,
> 
> Sorry about the confusion, but I'm trying to read a text file into my Mapper function.  I am trying to copy the file using the -files option when submitting the Hadoop job.
> 
> I try to obtain the filename using the following lines of code in my Mapper:
> 
> URI[] localPaths = context.getCacheFiles();
> String configFilename = localPaths[0].toString();
> 
> However, when I run the JAR in hadoop, I get a NullPointerException.  
> 
> Error: java.lang.NullPointerException
> 
> I'm running Hadoop 2.2 in single-node mode.  Not sure if that affects things...
> 
> 
> 
> 
>> On Wed, Mar 26, 2014 at 8:21 PM, Stanley Shi <ss...@gopivotal.com> wrote:
>> Where did you get the error? From the compiler or the runtime?
>> 
>> Regards,
>> Stanley Shi,
>> 
>> 
>> 
>>> On Thu, Mar 27, 2014 at 7:34 AM, Jonathan Poon <jk...@ucdavis.edu> wrote:
>>> Hi Everyone,
>>> 
>>> I'm submitting a MapReduce job using the -files option to copy a text file that contains properties I use for the map and reduce functions.  
>>> 
>>> I'm trying to obtain the local cache files in my mapper function using:
>>> 
>>> Path[] paths = context.getLocalCacheFiles();
>>> 
>>> However, I get an error saying getLocalCacheFiles() is undefined.  I've imported the hadoop-mapreduce-client-core-2.2.0.jar as part of my build environment in Eclipse.
>>> 
>>> Any ideas on what could be incorrect?  
>>> 
>>> If I'm incorrectly using the distributed cache, could someone point me to an example using the distributed cache with Hadoop 2.2.0?  
>>> 
>>> Thanks for your help!
>>> 
>>> Jonathan 
>> 
> 
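
[For context, the two usual ways to put a file in the distributed cache in 2.2.0 are the -files generic option at submission time and the Job API in the driver. A sketch of the programmatic route, with a hypothetical HDFS path:

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

public class CacheSetup {
    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "cache setup example");
        // Registers an already-uploaded HDFS file in the cache; the
        // "#config.txt" fragment names the symlink the tasks will see.
        job.addCacheFile(new URI("/user/jonathan/config.txt#config.txt"));
        // ... remaining job configuration and submission as usual ...
    }
}]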

Re: Hadoop 2.2.0 Distributed Cache

Posted by Jonathan Poon <jk...@ucdavis.edu>.
Hi Stanley,

Sorry about the confusion, but I'm trying to read a text file into my Mapper
function.  I am trying to copy the file using the -files option when
submitting the Hadoop job.

I try to obtain the filename using the following lines of code in my Mapper:

URI[] localPaths = context.getCacheFiles();
String configFilename = localPaths[0].toString();

However, when I run the JAR in hadoop, I get a NullPointerException.

Error: java.lang.NullPointerException

I'm running Hadoop 2.2 in single-node mode.  Not sure if that affects
things...




On Wed, Mar 26, 2014 at 8:21 PM, Stanley Shi <ss...@gopivotal.com> wrote:

> Where did you get the error? From the compiler or the runtime?
>
> Regards,
> Stanley Shi,
>
>
>
> On Thu, Mar 27, 2014 at 7:34 AM, Jonathan Poon <jk...@ucdavis.edu> wrote:
>
>> Hi Everyone,
>>
>> I'm submitting a MapReduce job using the -files option to copy a text
>> file that contains properties I use for the map and reduce functions.
>>
>> I'm trying to obtain the local cache files in my mapper function using:
>>
>> Path[] paths = context.getLocalCacheFiles();
>>
>> However, I get an error saying getLocalCacheFiles() is undefined.  I've
>> imported the hadoop-mapreduce-client-core-2.2.0.jar as part of my build
>> environment in Eclipse.
>>
>> Any ideas on what could be incorrect?
>>
>> If I'm incorrectly using the distributed cache, could someone point me to
>> an example using the distributed cache with Hadoop 2.2.0?
>>
>> Thanks for your help!
>>
>> Jonathan
>>
>
>
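
[One plausible reading of that NullPointerException: getCacheFiles() returns null when no cache files were ever registered with the job, so the failure is localPaths[0] dereferencing a null array. That typically means -files was never parsed, for example because the driver does not route its arguments through ToolRunner/GenericOptionsParser, or because the job was launched with plain java rather than hadoop jar. A defensive check, sketched at the top of the setup() method shown in the earlier mapper sketch:

URI[] cacheFiles = context.getCacheFiles();
// Null here means nothing was registered with the job, not an empty cache.
if (cacheFiles == null || cacheFiles.length == 0) {
    throw new IOException("No cache files registered; was the job submitted "
            + "through a ToolRunner driver with -files config.txt?");
}
String configFilename = cacheFiles[0].toString();]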

Re: Hadoop 2.2.0 Distributed Cache

Posted by Stanley Shi <ss...@gopivotal.com>.
Where did you get the error? From the compiler or the runtime?

Regards,
Stanley Shi,



On Thu, Mar 27, 2014 at 7:34 AM, Jonathan Poon <jk...@ucdavis.edu> wrote:

> Hi Everyone,
>
> I'm submitting a MapReduce job using the -files option to copy a text file
> that contains properties I use for the map and reduce functions.
>
> I'm trying to obtain the local cache files in my mapper function using:
>
> Path[] paths = context.getLocalCacheFiles();
>
> However, I get an error saying getLocalCacheFiles() is undefined.  I've
> imported the hadoop-mapreduce-client-core-2.2.0.jar as part of my build
> environment in Eclipse.
>
> Any ideas on what could be incorrect?
>
> If I'm incorrectly using the distributed cache, could someone point me to
> an example using the distributed cache with Hadoop 2.2.0?
>
> Thanks for your help!
>
> Jonathan
>
