Posted to user@hive.apache.org by shaik ahamed <sh...@gmail.com> on 2012/08/13 12:12:38 UTC

loading data in HDFS similar to RAID concept (i.e. I have a 100GB data file to load as 30GB on one node, 40GB on another node and 30GB on a third node)

Hi Users,


                         Is it possible in HDFS to load a 100GB file split
into 30GB, 30GB & 40GB portions across nodes (similar to RAID)? If so,
please let me know how to achieve it.


Thanks in advance


Regards,
shaik.

Re: loading data in HDFS similar to RAID concept (i.e. I have a 100GB data file to load as 30GB on one node, 40GB on another node and 30GB on a third node)

Posted by Bejoy KS <be...@yahoo.com>.
Hi Shaik

AFAIK it is not possible in Hadoop. The HDFS storage model is different from RAID. In HDFS your file is broken down into fixed-size blocks (64MB by default), and each of these blocks is stored on one or more DataNodes in your cluster based on the replication factor. You cannot choose which node holds which portion of the file, nor the size of each portion.
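To make the above concrete, here is a minimal sketch of what you *can* control (assuming a running Hadoop 1.x-era cluster and the `hadoop` CLI on the PATH; the paths and sizes are illustrative): you choose the block size and replication factor per upload, and HDFS decides the placement for you.

```shell
# Upload a file with an explicit block size and replication factor.
# dfs.block.size was the Hadoop 1.x property name (dfs.blocksize in 2.x);
# 134217728 bytes = 128MB. HDFS splits the file into blocks of this size
# and places them on DataNodes itself -- per-node portions like
# 30GB/40GB/30GB cannot be requested.
hadoop fs -D dfs.block.size=134217728 -D dfs.replication=2 \
    -put /local/data/bigfile.dat /user/shaik/bigfile.dat

# See where the blocks actually landed:
hadoop fsck /user/shaik/bigfile.dat -files -blocks -locations
```

The fsck report lists every block of the file together with the DataNodes holding its replicas, which is the closest you can get to inspecting the "portions" on each node.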


Regards
Bejoy KS

Sent from handheld, please excuse typos.

-----Original Message-----
From: shaik ahamed <sh...@gmail.com>
Date: Mon, 13 Aug 2012 15:42:38 
To: <us...@hive.apache.org>
Reply-To: user@hive.apache.org
Subject: loading data in HDFS similar to RAID concept (i.e. I have a 100GB data
 file to load as 30GB on one node, 40GB on another node and 30GB on a third node)
