Posted to user@mahout.apache.org by Wasim <wa...@gmail.com> on 2011/03/11 19:44:05 UTC

File name too long error on linux machine

Hi,

I have executed the synthetic control data example from the Mahout wiki page
on a Linux machine. When I execute the following command to copy data from
Hadoop to my local machine:

$HADOOP_HOME/bin/hadoop fs -get output $MAHOUT_HOME/examples

I get the following error:

get: File name too long

When I executed the following command to list all outputs on Hadoop:

$HADOOP_HOME/bin/hadoop fs -lsr output

there were indeed some very long file names there, such as:

/user/behoover-a001/output/clusteredPoints/_logs/history/localhost_1299864873686_job_201103111834_0013_behoover-a001_KMeans+Driver+running+clusterData+over+input%3A+outp

There are many others like this.

Can anybody please suggest a solution to this problem?
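The limit being hit here is usually the local filesystem's NAME_MAX, not
HDFS: ext3/ext4 allow 255 bytes per path component, while encrypted home
directories (e.g. eCryptfs) allow only about 143, and Hadoop job-history
file names can easily exceed that. A quick way to check, sketched with
standard POSIX tools (the `name` variable simply copies the truncated path
component shown above):

```shell
# Maximum file-name length the destination filesystem accepts
# (255 on ext3/ext4; roughly 143 on eCryptfs-encrypted home directories):
getconf NAME_MAX .

# Length of the offending job-history name quoted above
# (shown truncated in the listing, so the real name is even longer):
name="localhost_1299864873686_job_201103111834_0013_behoover-a001_KMeans+Driver+running+clusterData+over+input%3A+outp"
echo "${#name}"
```

If the name length exceeds NAME_MAX, the `-get` fails on the local side
before HDFS is involved at all.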

-- 
Thank you & Regards
Muhammad Wasimullah Khan
Mobile: +46 72 03 29 205
Alt. Telephone: +92 345 21 98 451
Email: mwkhan@kth.se
Skype: muhammad.wasim.khan

Re: File name too long error on linux machine

Posted by Wasim <wa...@gmail.com>.
Hi,

I have followed exactly the steps given on the Mahout wiki page:
https://cwiki.apache.org/MAHOUT/clustering-of-synthetic-control-data.html

What I have not done is a clean build of Mahout as described in Pre-Prep
step 3, because I already had the job jar. I even used the same data as
given on the wiki page.

Regards,

Wasim

On Sun, Mar 13, 2011 at 12:25 AM, Ted Dunning <te...@gmail.com> wrote:

> Also, that file name looks like you somehow got a lot of text into the
> name.
>
> Can you say more about how you are running that program?

Re: File name too long error on linux machine

Posted by Ted Dunning <te...@gmail.com>.
Also, that file name looks like you somehow got a lot of text into the name.

Can you say more about how you are running that program?

On Sat, Mar 12, 2011 at 1:32 PM, Jeff Eastman <je...@narus.com> wrote:

> Try not getting the _logs directory. All the other names should be fine.

RE: File name too long error on linux machine

Posted by Jeff Eastman <je...@Narus.com>.
Try not getting the _logs directory. All the other names should be fine.
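This advice can be sketched as shell commands (an assumption: the glob
pattern and the Hadoop 0.20-era `fs -rmr` / `fs -lsr` subcommands match the
setup used in this thread):

```shell
# Option 1: remove the _logs bookkeeping directories on HDFS first, so the
# recursive -get never sees the long job-history names.
# (The glob is a guess based on the listing quoted in this thread.)
$HADOOP_HOME/bin/hadoop fs -rmr 'output/*/_logs'
$HADOOP_HOME/bin/hadoop fs -get output $MAHOUT_HOME/examples

# Option 2: list everything and filter out _logs paths, then copy the
# survivors individually.
$HADOOP_HOME/bin/hadoop fs -lsr output | awk '{print $NF}' | grep -v '_logs'
```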

-----Original Message-----
From: Wasim [mailto:wasim.khan@gmail.com] 
Sent: Saturday, March 12, 2011 12:33 PM
To: Ted Dunning
Cc: user@mahout.apache.org
Subject: Re: File name too long error on linux machine

Does anybody have an answer to the above question? I'm stuck. :(


Re: File name too long error on linux machine

Posted by Wasim <wa...@gmail.com>.
Does anybody have an answer to the above question? I'm stuck. :(

On Fri, Mar 11, 2011 at 8:21 PM, Ted Dunning <te...@gmail.com> wrote:

> What kind of machine are you working on?
>
> Which version of Mahout?

Re: File name too long error on linux machine

Posted by Ted Dunning <te...@gmail.com>.
What kind of machine are you working on?

Which version of Mahout?
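Both questions can be answered from the shell. `uname` is standard and
`hadoop version` is a real subcommand; the job-jar path below is an
assumption about where a Mahout checkout puts its build output:

```shell
# Kernel and architecture of the machine:
uname -srm

# Hadoop release in use:
$HADOOP_HOME/bin/hadoop version

# Mahout version, inferred from the job jar's file name
# (hypothetical path; adjust to your checkout):
ls $MAHOUT_HOME/examples/target/mahout-examples-*-job.jar
```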
