Posted to common-user@hadoop.apache.org by Colin Freas <co...@gmail.com> on 2008/04/16 16:35:18 UTC

hdfs "injection" node?

I have a machine that stores a lot of the data I need to put into my
cluster's HDFS.  It's on the same private network as the nodes, but it isn't
a node itself.

What is the easiest way to let that machine write data files directly into
HDFS, without it acting as a datanode that holds replicas?
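To make it concrete, what I'm imagining is the storage machine acting as a
plain DFS client.  A rough sketch of the kind of thing I mean, against the
Java FileSystem API (the namenode host/port and the paths here are made up,
and it assumes a Hadoop install on the machine with a conf pointing at the
cluster's namenode):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsInject {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Hypothetical namenode address; in practice this would come
            // from the hadoop-site.xml copied from the cluster's config.
            conf.set("fs.default.name", "hdfs://namenode-host:9000");

            FileSystem fs = FileSystem.get(conf);
            // Stream a local file straight to the datanodes; the client
            // machine itself never stores a replica.
            fs.copyFromLocalFile(new Path("/local/data/file.dat"),
                                 new Path("/user/colin/file.dat"));
            fs.close();
        }
    }

If a bare client install is enough to do that, the command-line equivalent
(bin/hadoop dfs -put) would suit me just as well.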

I tried an NFS mount, but something, either within Hadoop, NFS, my hardware,
or somewhere else, would make it hang whenever I transferred more than a few
hundred files.

I'm hoping for a more direct solution, like setting up a dummy datanode with
no local storage space.  Just wondering if there's a trick to doing that, or
some other approach I'm missing.

-Colin