Posted to hdfs-user@hadoop.apache.org by Scott <sk...@weather.com> on 2010/01/07 17:34:19 UTC
Can't get HDFS to load more than 1Gig total data
Hi, I'm new to Hadoop/HDFS. We have a 4-node test cluster, and I can't
write more than approximately 1 gig of total data. Any additional file
put fails with the following error:
WARN hdfs.DFSClient: DataStreamer Exception:
org.apache.hadoop.ipc.RemoteException: java.io.IOException: File
/user/hadoop/ads3x11-1256301562.log.lzo could only be replicated to 0
nodes, instead of 1
I have checked quotas and found none. I have also tried other users,
including the hadoop user, and get the same result. Any ideas?
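For reference, a put that triggers this error would look something like the following. This is a sketch only; the exact command used isn't shown in the thread, and the filename is taken from the error message above (0.20-era `hadoop fs` CLI syntax). The script guards on the CLI being present so it can run anywhere:

```shell
# Hypothetical reproduction of the failing write. The filename comes from
# the DataStreamer error above; the destination is the user's HDFS home dir.
if command -v hadoop >/dev/null 2>&1; then
  hadoop fs -put ads3x11-1256301562.log.lzo /user/hadoop/
  put_result=attempted
else
  # No Hadoop CLI on this machine; note it and move on.
  echo "hadoop CLI not found; skipping"
  put_result=skipped
fi
```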
Thanks,
Scott
Re: Can't get HDFS to load more than 1Gig total data
Posted by Allen Wittenauer <aw...@linkedin.com>.
On 1/7/10 8:34 AM, "Scott" <sk...@weather.com> wrote:
> WARN hdfs.DFSClient: DataStreamer Exception:
> org.apache.hadoop.ipc.RemoteException: java.io.IOException: File
> /user/hadoop/ads3x11-1256301562.log.lzo could only be replicated to 0
> nodes, instead of 1
This almost always means your HDFS is in safemode and/or has no live
datanodes.
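Both conditions can be checked from the command line. A minimal sketch, assuming the 0.20-era `hadoop dfsadmin` CLI (guarded so it degrades gracefully when no Hadoop install is on the PATH):

```shell
# Check the two usual causes of "could only be replicated to 0 nodes":
# safemode still on, or no live datanodes reporting in.
if command -v hadoop >/dev/null 2>&1; then
  hadoop dfsadmin -safemode get   # prints "Safe mode is ON" or "... OFF"
  hadoop dfsadmin -report         # lists live/dead datanodes and capacity
  checked=yes
else
  echo "hadoop CLI not found; skipping"
  checked=skipped
fi
```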
> I have checked quotas and found none. I have also tried other users,
> including the hadoop user, and get the same result. Any ideas?
How is the namenode heap?
Are you out of physical space?
What does hadoop fsck / say?
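The three questions above map to concrete checks; a sketch, again assuming the 0.20-era CLI names (newer releases spell these `hdfs fsck` and `hdfs dfsadmin`):

```shell
# Diagnostics for filesystem health and disk space.
if command -v hadoop >/dev/null 2>&1; then
  hadoop fsck /            # overall HDFS health, missing/under-replicated blocks
  hadoop dfsadmin -report  # configured vs. remaining capacity per datanode
  diag=ran
else
  echo "hadoop CLI not found; skipping"
  diag=skipped
fi
df -h                      # physical disk space on this node's local volumes
```

Namenode heap usage isn't a CLI check; in that era it was visible on the namenode web UI (typically port 50070).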