Posted to user@spark.apache.org by Mohit Singh <mo...@gmail.com> on 2014/02/21 01:25:58 UTC

file not found

Hi,
  I am trying to read a file from local disk
and just count the number of lines in that file,
but I see this error:
Task 2.0:225 failed 4 times (most recent failure: Exception failure:
java.io.FileNotFoundException: File
file:/home/hadoop/data/backup/data/domain/domainz0
does not exist.)

But the file is there.
Though the file:/ prefix doesn't look right?
Also, if I try to read from HDFS, I get:
Incomplete HDFS URI, no host: hdfs:/user/hadoop/foo.csv
Shouldn't it be hdfs:///user/hadoop/foo.csv?
Am I missing something?
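
For reference, the job is essentially just the following (a sketch of
what I'm running in spark-shell; the path is the one from the error):

    // Read the file from local disk and count its lines.
    val lines = sc.textFile("/home/hadoop/data/backup/data/domain/domainz0")
    println(lines.count())  // the FileNotFoundException surfaces here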

Thanks

-- 
Mohit

"When you want success as badly as you want the air, then you will get it.
There is no other secret of success."
-Socrates

Re: file not found

Posted by jaranda <jo...@bsc.es>.
Thanks for the heads up, I also experienced this issue.




Re: file not found

Posted by Mohit Singh <mo...@gmail.com>.
Got it. That was the fix.
Thanks


On Thu, Feb 20, 2014 at 5:19 PM, Bryn Keller <xo...@xoltar.org> wrote:

> Hi Mohit,
>
> If you're using an HDFS URL, you'll want to use
> hdfs://host:port/path/to/file, e.g. hdfs://master:9000/user/hadoop/foo.csv.
>
> The file:/ pattern is right for files on the local machine, but I'm not
> sure Spark can read from just one machine; the file might need to be
> available on all the other nodes as well - others can probably comment on
> that. HDFS is usually easier to work with for this reason.
>
> Thanks,
> Bryn
>
>
> On Thu, Feb 20, 2014 at 4:25 PM, Mohit Singh <mo...@gmail.com> wrote:
>
>> Hi,
>>   I am trying to read a file from local disk
>> and just count the number of lines in that file,
>> but I see this error:
>> Task 2.0:225 failed 4 times (most recent failure: Exception failure:
>> java.io.FileNotFoundException: File file:/home/hadoop/data/backup/data/domain/domainz0
>> does not exist.)
>>
>> But the file is there.
>> Though the file:/ prefix doesn't look right?
>> Also, if I try to read from HDFS, I get:
>> Incomplete HDFS URI, no host: hdfs:/user/hadoop/foo.csv
>> Shouldn't it be hdfs:///user/hadoop/foo.csv?
>> Am I missing something?
>>
>> Thanks
>>
>> --
>> Mohit
>>
>> "When you want success as badly as you want the air, then you will get
>> it. There is no other secret of success."
>> -Socrates
>>
>
>


-- 
Mohit

"When you want success as badly as you want the air, then you will get it.
There is no other secret of success."
-Socrates

Re: file not found

Posted by Bryn Keller <xo...@xoltar.org>.
Hi Mohit,

If you're using an HDFS URL, you'll want to use
hdfs://host:port/path/to/file, e.g. hdfs://master:9000/user/hadoop/foo.csv.

The file:/ pattern is right for files on the local machine, but I'm not
sure Spark can read from just one machine; the file might need to be
available on all the other nodes as well - others can probably comment on
that. HDFS is usually easier to work with for this reason.
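
For example, from spark-shell (a sketch; master:9000 is a placeholder for
your actual namenode host and port):

    // Fully-qualified HDFS URI: scheme://host:port/path.
    val lines = sc.textFile("hdfs://master:9000/user/hadoop/foo.csv")
    println(lines.count())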

Thanks,
Bryn


On Thu, Feb 20, 2014 at 4:25 PM, Mohit Singh <mo...@gmail.com> wrote:

> Hi,
>   I am trying to read a file from local disk
> and just count the number of lines in that file,
> but I see this error:
> Task 2.0:225 failed 4 times (most recent failure: Exception failure:
> java.io.FileNotFoundException: File file:/home/hadoop/data/backup/data/domain/domainz0
> does not exist.)
>
> But the file is there.
> Though the file:/ prefix doesn't look right?
> Also, if I try to read from HDFS, I get:
> Incomplete HDFS URI, no host: hdfs:/user/hadoop/foo.csv
> Shouldn't it be hdfs:///user/hadoop/foo.csv?
> Am I missing something?
>
> Thanks
>
> --
> Mohit
>
> "When you want success as badly as you want the air, then you will get it.
> There is no other secret of success."
> -Socrates
>