Posted to common-user@hadoop.apache.org by Nikolay Grebnev <ni...@gmail.com> on 2008/11/29 00:12:12 UTC

configuration questions

Hello,

I have installed Hadoop 0.19.0 on several servers.
Please help me with the configuration of the cluster.

1 How can I configure Hadoop so that it packs all HDFS files into big
files on the local filesystem? By default, every block is stored as a
separate file under
~/data/hadoop/dfs/data/current:
-rw-r--r--  1 hadoop users 715542 2008-11-29 01:01 blk_-8690966142665497288
-rw-r--r--  1 hadoop users     71 2008-11-29 01:16 blk_8457110993060324288_1062.meta
drwxr-xr-x  2 hadoop users  12288 2008-11-29 01:18 subdir0/
drwxr-xr-x  2 hadoop users  12288 2008-11-29 01:18 subdir1/
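
As far as I understand, this directory is controlled by the dfs.data.dir
property in conf/hadoop-site.xml. A minimal sketch of that setting, assuming
the hadoop user's home directory is /home/hadoop (the path is only an
example from my setup):

<configuration>
  <property>
    <name>dfs.data.dir</name>
    <value>/home/hadoop/data/hadoop/dfs/data</value>
    <!-- Local directory where the datanode keeps its block files; the
         blk_* files and subdir*/ entries above live under .../current. -->
  </property>
</configuration>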

2 At present, when one datanode dies, I cannot put any new files into the
cluster. Can I configure the cluster so that it keeps working without
delays when one (or several) datanodes die, and so that datanodes that
come back up eventually receive all the missing data?
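
If I understand correctly, the relevant settings are dfs.replication and
dfs.replication.min in conf/hadoop-site.xml. A minimal sketch of what I
mean (the values are only examples, not my real config):

<configuration>
  <!-- Target number of copies kept for each block. -->
  <property>
    <name>dfs.replication</name>
    <value>3</value>
  </property>
  <!-- Minimum number of replicas that must be written for a write
       to be considered successful. -->
  <property>
    <name>dfs.replication.min</name>
    <value>1</value>
  </property>
</configuration>

My understanding is that a write should succeed as long as at least
dfs.replication.min replicas can be placed, and that under-replicated
blocks are re-replicated in the background when datanodes come back.
Please correct me if that is wrong.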

3 When a datanode loses its network connection, can a local client still
work with the data on that datanode?

Best,
Nik