Posted to user@hive.apache.org by 王洋 <wa...@gmail.com> on 2014/08/28 16:45:04 UTC

How to merge the output files when inserting data into a table

Hello,
     I have two tables: one is stored as textfile, for loading data from
local files, and the other is stored as rcfile, to compress the data.
     When I execute the following SQL twice:

     insert into table tablename1 select * from tablename2

     I find that there are two files in the warehouse directory:

      hadoop fs -ls /user/hive/warehouse/tablename1/
      Found 2 items
     -rwxrwxrwt   3 root supergroup        423 2014-08-28 21:09 /user/hive/warehouse/tablename1/000000_0
     -rwxrwxrwt   3 root supergroup        423 2014-08-28 21:11 /user/hive/warehouse/tablename1/000000_0_copy_1

   I need to merge these two files to reduce the number of small files. Is
there some way to merge them?
   Thank you!
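
For context, a minimal sketch (assuming a MapReduce-based Hive of roughly this era) of the small-file merge settings; they reduce the number of files a single INSERT writes, but they do not combine files already left behind by two separate INSERT runs:

     -- Sketch only: ask Hive to run a merge step at the end of the job when the
     -- average output file size is small. These apply per INSERT; they do not
     -- touch the 000000_0 / 000000_0_copy_1 files written by the two earlier runs.
     SET hive.merge.mapfiles=true;
     SET hive.merge.mapredfiles=true;
     SET hive.merge.smallfiles.avgsize=16000000;
     SET hive.merge.size.per.task=256000000;

     insert into table tablename1 select * from tablename2;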

Re: How to merge the output files when inserting data into a table

Posted by 王洋 <wa...@gmail.com>.
It works when I use "alter table tablename concatenate" after the insert.
Thanks.
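
A minimal sketch of that command, assuming the insert destination is the RCFile table tablename1 from the listing above; the partition spec in the comment is only a hypothetical example:

     -- Merge the small files of an RCFile table in place.
     ALTER TABLE tablename1 CONCATENATE;

     -- For a partitioned table, concatenate one partition at a time, e.g.:
     -- ALTER TABLE tablename1 PARTITION (dt='2014-08-28') CONCATENATE;

     -- Re-check the file count afterwards:
     -- hadoop fs -ls /user/hive/warehouse/tablename1/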



Re: How to merge the output files when inserting data into a table

Posted by Prasanth Jayachandran <pj...@hortonworks.com>.
Hi

If it's an RCFile table, you can use "alter table tablename concatenate" to merge them into one file. For text files, you may have to reload these two files into another table with an "ORDER BY"; this forces a single reducer to produce a total ordering and thereby generates one output file. But remember that if the data size is huge, the ORDER BY will take a very long time.

Thanks
Prasanth
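
A minimal sketch of the text-file approach described above; the table name tablename1_merged and the ordering column id are made-up placeholders:

     -- Create an empty copy of the text table with the same schema and storage format.
     CREATE TABLE tablename1_merged LIKE tablename1;

     -- ORDER BY forces a total ordering, which Hive runs with a single reducer,
     -- so the reload produces one output file. On large data that single reducer
     -- becomes the bottleneck, as noted in the reply above.
     INSERT OVERWRITE TABLE tablename1_merged
     SELECT * FROM tablename1
     ORDER BY id;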



Re: How to merge the output files when inserting data into a table

Posted by 王洋 <wa...@gmail.com>.
I collect the data from a real-time system; we don't have another database.


2014-08-29 0:59 GMT+08:00 Venkat V <ve...@gmail.com>:

> Have you tried sqoop merge?

Re: How to merge the output files when inserting data into a table

Posted by Venkat V <ve...@gmail.com>.
Have you tried sqoop merge?





-- 
Venkat V