Posted to hdfs-user@hadoop.apache.org by nipun_mlist Assam <ni...@gmail.com> on 2009/09/29 08:16:31 UTC

problem in copying files

Hi All,

I have installed Hadoop 0.20.1 on my system and set it up in a
pseudo-distributed configuration.
Below is the core-site.xml I am using:

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
<property>
  <name>hadoop.tmp.dir</name>
  <value>/tmp/hadoop-${user.name}</value>
</property>
<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:54310</value>
</property>
<property>
  <name>mapred.job.tracker</name>
  <value>hdfs://localhost:54311</value>
</property>
<property>
  <name>dfs.replication</name>
  <value>1</value>
</property>
<property>
  <name>mapred.child.java.opts</name>
  <value>-Xmx512m</value>
</property>
</configuration>
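(With this file the default filesystem is hdfs://localhost:54310. Once the
daemons are up it can be probed directly as a sanity check; a minimal sketch:)

$ hadoop dfs -ls /    # lists the root of the configured hdfs://localhost:54310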


I formatted HDFS with "$ hadoop namenode -format" and got no errors.
I then started all the daemons with "$ start-all.sh", again with no errors.
Environment variables such as HADOOP_HOME are correctly defined.
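(Daemon status can be double-checked with jps from the JDK; in a
pseudo-distributed setup the NameNode, DataNode, SecondaryNameNode,
JobTracker, and TaskTracker should all be listed. A sketch, with
illustrative PIDs:)

$ jps
4201 NameNode
4312 DataNode
4433 SecondaryNameNode
4520 JobTracker
4638 TaskTracker
4701 Jps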

Now I get the following errors:

$ hadoop dfs -ls
ls: Cannot access .: No such file or directory.

$ hadoop dfs -copyFromLocal .vimrc .
$ hadoop dfs -ls
Found 1 items
-rw-r--r--   3 nipunt supergroup       3051 2009-09-28 23:14 /user/nipunt
$ hadoop dfs -copyFromLocal .bashrc .
copyFromLocal: Target  already exists

What is the root cause of this problem, and how can I overcome it?

Re: problem in copying files

Posted by nipun_mlist Assam <ni...@gmail.com>.
>> I recommend that the first command you run after the filesystem is formatted and the daemons are started is one that creates your home directory (before you upload files):
Yes, that worked.
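(A quick way to confirm the home directory now exists, assuming my username
nipunt:)

$ hadoop dfs -ls /user    # should now show the directory /user/nipunt
$ hadoop dfs -copyFromLocal .bashrc .
$ hadoop dfs -ls          # "." now resolves to /user/nipunt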

On Tue, Sep 29, 2009 at 1:11 PM, Dhruba Borthakur <dh...@gmail.com> wrote:
> [quoted reply and original message snipped]

Re: problem in copying files

Posted by Dhruba Borthakur <dh...@gmail.com>.
I recommend that the first command you run after the filesystem is formatted
and the daemons are started is one that creates your home directory (before
you upload files):

$ hadoop dfs -ls
ls: Cannot access .: No such file or directory.
$ hadoop dfs -mkdir /user/nipunt
$ hadoop dfs -copyFromLocal .vimrc .
$ hadoop dfs -ls
Found 1 items
-rw-r--r--   3 nipunt supergroup       3051 2009-09-28 23:14 /user/nipunt/.vimrc
$ hadoop dfs -copyFromLocal .bashrc .
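The listing in the original message also shows the root cause: with no
/user/nipunt directory, the first copyFromLocal created /user/nipunt as a
plain file (the 3051-byte entry), which is why the second copy failed with
"Target already exists". A sketch of the cleanup in that state, assuming the
username nipunt:

$ hadoop dfs -rm /user/nipunt           # remove the file the first copy created
$ hadoop dfs -mkdir /user/nipunt        # recreate it as a directory
$ hadoop dfs -copyFromLocal .vimrc .    # "." now resolves to /user/nipunt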


On Mon, Sep 28, 2009 at 11:16 PM, nipun_mlist Assam <ni...@gmail.com> wrote:

> [original message quoted in full; snipped]



-- 
Connect to me at http://www.facebook.com/dhruba