Posted to common-user@hadoop.apache.org by Mark Kerzner <ma...@gmail.com> on 2011/03/19 00:03:01 UTC

running local hadoop job in windows

Hi, guys,

I want to give my users a sense of what my hadoop application can do, and I
am trying to make it run in Windows, with this command

java -jar dist\FreeEed.jar

This command runs my hadoop job locally, and it works on Linux. However, on
Windows I get the error listed below. Since I am running completely locally,
I don't see why Hadoop needs to set permissions on a staging directory at all.
Is there a workaround?

Thank you,
Mark

Error:

11/03/18 17:57:43 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
java.io.IOException: Failed to set permissions of path: file:/tmp/hadoop-Mark/mapred/staging/Mark-1397630897/.staging to 0700
        at org.apache.hadoop.fs.RawLocalFileSystem.checkReturnValue(RawLocalFileSystem.java:526)
        at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:500)
        at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:310)
        at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:189)
        at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:116)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:799)
        at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:793)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Unknown Source)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1063)
        at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:793)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:465)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:495)
        at org.freeeed.main.FreeEedProcess.run(FreeEedProcess.java:66)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
        at org.freeeed.main.FreeEedProcess.main(FreeEedProcess.java:71)
        at org.freeeed.main.FreeEedMain.runProcessing(FreeEedMain.java:88)
        at org.freeeed.main.FreeEedMain.processOptions(FreeEedMain.java:65)
        at org.freeeed.main.FreeEedMain.main(FreeEedMain.java:31)
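
For context: even a purely local run goes through the normal job-submission path visible in the
trace (ToolRunner -> Job.waitForCompletion -> JobClient.submitJobInternal), and that path stages
job files under hadoop.tmp.dir (hence the /tmp/hadoop-Mark path) and tries to set the staging
directory's permissions to 0700. A minimal sketch of a local-mode driver that exercises the same
path is shown below. It assumes Hadoop 0.20.x-era APIs; the class name and job wiring are
illustrative, not FreeEed's actual code.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.conf.Configured;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
    import org.apache.hadoop.util.Tool;
    import org.apache.hadoop.util.ToolRunner;

    // Illustrative local-mode driver (not FreeEed code): an identity, map-only job.
    public class LocalDemoDriver extends Configured implements Tool {
        public int run(String[] args) throws Exception {
            Configuration conf = getConf();
            // "Completely local" to Hadoop means the local job runner plus the
            // local file system; the job is still staged under hadoop.tmp.dir,
            // which is where the failing permission call above happens.
            conf.set("mapred.job.tracker", "local");
            conf.set("fs.default.name", "file:///");

            Job job = new Job(conf, "local demo");
            job.setJarByClass(LocalDemoDriver.class);
            job.setMapperClass(Mapper.class);          // identity mapper
            job.setNumReduceTasks(0);                  // map-only
            job.setOutputKeyClass(LongWritable.class);
            job.setOutputValueClass(Text.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            return job.waitForCompletion(true) ? 0 : 1;
        }

        public static void main(String[] args) throws Exception {
            System.exit(ToolRunner.run(new LocalDemoDriver(), args));
        }
    }

Packaged and run the same way (java -jar with an input and an output path), a driver like this
would hit the same setPermission call on a plain Windows install.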

Re: running local hadoop job in windows

Posted by Mark Kerzner <ma...@gmail.com>.
Guys,

I really appreciate all your answers. I am sure that I could have made
Hadoop run under Windows - all of you have done it - but I may go with
Stephen's advice after all. Since I am doing this for my open source
eDiscovery project, FreeEed <https://github.com/markkerzner/FreeEed>, I
need a lot of Linux utilities anyway, such as readpst, which only runs on
Linux.

Thank you,
Mark


Re: running local hadoop job in windows

Posted by Stephen Boesch <ja...@gmail.com>.
Another approach you may well have already considered, but may want to
reconsider: use the (free version of) VMware Player running on your winXXX
environment ("host") and install the Linux distro of your choice as the
guest OS. You can spin up essentially any number of instances that way,

and not be concerned about the configuration/behavioral discrepancies
between cygwin and native Linux.

If you wish, there are pre-packaged VMware images for the Cloudera, Apache,
and Yahoo! distributions.

Re: running local hadoop job in windows

Posted by Tish Heyssel <ti...@gmail.com>.
Mark,

Make sure you add cygwin/bin to your global PATH variable in Windows too,
and echo the PATH if you're running in a command window to make sure it
shows up there. When running through Eclipse, it should pick up the PATH
variable.

Good luck. It's worth the trouble; this does work.

tish


-- 
Tish Heyssel
Peterson Burnett Technologies
Please use: tishhey@gmail.com
Alternate email: tish@speakeasy.net
pmh@pbtechnologies.com
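
To check that Tish's PATH change actually reached the JVM (Hadoop's shell-based permission calls
resolve chmod from the same PATH the Java process sees), a small, hypothetical helper like the
one below can be run from the same console. It is not part of FreeEed or Hadoop, just a
diagnostic sketch.

    import java.io.File;

    // Diagnostic sketch: report whether a chmod executable is visible on the
    // PATH of this JVM, which is what Hadoop needs when it shells out to chmod.
    public class ChmodOnPathCheck {
        public static void main(String[] args) {
            String path = System.getenv("PATH");
            if (path == null) {
                System.out.println("No PATH variable visible to the JVM");
                return;
            }
            boolean found = false;
            for (String dir : path.split(File.pathSeparator)) {
                for (String name : new String[] {"chmod", "chmod.exe"}) {
                    File candidate = new File(dir, name);
                    if (candidate.isFile()) {
                        System.out.println("Found " + candidate.getAbsolutePath());
                        found = true;
                    }
                }
            }
            if (!found) {
                System.out.println("No chmod on PATH; add the cygwin bin directory and reopen the console");
            }
        }
    }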

Re: running local hadoop job in windows

Posted by Mark Kerzner <ma...@gmail.com>.
I understand now: I need to install cygwin correctly, selecting all the
right options.

Thank you,
Mark


Re: running local hadoop job in windows

Posted by Lance Norskog <go...@gmail.com>.
You're stuck with cygwin! Hadoop insists on running the 'chmod'
program, so you have to have a chmod binary in your search path.

Lance


-- 
Lance Norskog
goksron@gmail.com
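
If installing cygwin is truly off the table for a demo, one workaround that has been used with
this era of Hadoop (a sketch only, untested here, and assuming the 0.20.x API visible in the
stack trace) is to register a local file system whose setPermission is a no-op, so the chmod
shell-out never happens. The class and package names below are made up for illustration.

    package org.example.windows;  // hypothetical package; pick your own

    import java.io.IOException;

    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.fs.RawLocalFileSystem;
    import org.apache.hadoop.fs.permission.FsPermission;

    // Local file system that silently skips permission changes. This drops the
    // 0700 protection on the job staging directory, which is only acceptable
    // for a single-user demo on a Windows box.
    public class NoChmodLocalFileSystem extends RawLocalFileSystem {
        @Override
        public void setPermission(Path p, FsPermission permission) throws IOException {
            // Intentionally empty: the parent implementation shells out to chmod,
            // which a plain Windows install does not have.
        }
    }

Then, before the job is created, point the file:// scheme at it (fs.file.impl is the key that
selects the local file system implementation in 0.20.x/1.x core-default.xml):

    Configuration conf = new Configuration();
    conf.set("fs.file.impl", "org.example.windows.NoChmodLocalFileSystem");

Whether this is enough depends on what else the job touches; later chmod calls elsewhere in the
job client may still fail, so the cygwin route described above is the safer one.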

Re: running local hadoop job in windows

Posted by Mark Kerzner <ma...@gmail.com>.
Now I AM running under cygwin, and I get the same error, as you can see from
the attached screenshot.

Thank you,
Mark


Re: running local hadoop job in windows

Posted by Simon <gs...@gmail.com>.
As far as I know, hadoop currently only runs under *nix-like systems;
correct me if I am wrong. If you want to run it under Windows, you can try
cygwin as the environment.

Thanks
Simon


-- 
Regards,
Simon

Re: running local hadoop job in windows

Posted by Mark Kerzner <ma...@gmail.com>.
No, I hoped that it is not absolutely necessary for that kind of use. I am
not even issuing the "hadoop -jar" command; it is pure "java -jar". It is
true, though, that my Ubuntu box has Hadoop set up, so maybe it is doing a
lot of magic behind my back.

I did not want my inexperienced Windows users to have to install cygwin
just to try the package.

Thank you,
Mark


Re: running local hadoop job in windows

Posted by Stephen Boesch <ja...@gmail.com>.
presumably you ran this under cygwin?
