Posted to issues@hive.apache.org by "Dan Gustafsson (JIRA)" <ji...@apache.org> on 2016/05/31 16:58:13 UTC

[jira] [Comment Edited] (HIVE-6589) Automatically add partitions for external tables

    [ https://issues.apache.org/jira/browse/HIVE-6589?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15308032#comment-15308032 ] 

Dan Gustafsson edited comment on HIVE-6589 at 5/31/16 4:57 PM:
---------------------------------------------------------------

This is a nice-to-have, but would really simplify loading of data. 

The solution given by the blogs is:

MSCK REPAIR TABLE your_table_name;

This seems to identify and add the missing partitions, provided they were created as folders following the "partition_column=partition_value" naming structure.
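
A minimal sketch of that approach, assuming a hypothetical external table named my_logs whose HDFS folders already use the key=value naming:

{code}
-- Hypothetical external table; folders under /data/my_logs must
-- already be named dt=<value> for MSCK to pick them up.
CREATE EXTERNAL TABLE my_logs (
  line STRING
)
PARTITIONED BY (dt STRING)
LOCATION '/data/my_logs';

-- Scans the table location and adds any dt=<value> folders that
-- are missing from the metastore.
MSCK REPAIR TABLE my_logs;

-- Confirm what was discovered.
SHOW PARTITIONS my_logs;
{code}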


was (Author: dan0704090017@hotmail.com):
This is a nice-to-have, but would really simplify loading of data. Any solution in sight (that I might have missed)?

> Automatically add partitions for external tables
> ------------------------------------------------
>
>                 Key: HIVE-6589
>                 URL: https://issues.apache.org/jira/browse/HIVE-6589
>             Project: Hive
>          Issue Type: New Feature
>    Affects Versions: 0.14.0
>            Reporter: Ken Dallmeyer
>            Assignee: Dharmendra Pratap Singh
>
> I have a data stream being loaded into Hadoop via Flume. It loads into a date partition folder in HDFS.  The path looks like this:
> {code}/flume/my_data/YYYY/MM/DD/HH
> /flume/my_data/2014/03/02/01
> /flume/my_data/2014/03/02/02
> /flume/my_data/2014/03/02/03{code}
> On top of it I create an EXTERNAL Hive table for querying. As of now, I have to manually add partitions. What I want is for Hive to "discover" those partitions automatically for EXTERNAL tables. Additionally, I would like to specify a partition pattern so that when I query, Hive knows to use that pattern to find the HDFS folders.
> So something like this:
> {code}CREATE EXTERNAL TABLE my_data (
>   col1 STRING,
>   col2 INT
> )
> PARTITIONED BY (
>   dt STRING,
>   hour STRING
> )
> LOCATION 
>   '/flume/my_data'
> TBLPROPERTIES (
>   'hive.partition.spec' = 'dt=$Y-$M-$D, hour=$H',
>   'hive.partition.spec.location' = '$Y/$M/$D/$H'
> );
> {code}
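> For contrast, a minimal sketch of the manual step this would replace, using the table and paths above: because the folders are laid out as YYYY/MM/DD/HH rather than dt=.../hour=..., MSCK REPAIR TABLE cannot discover them, so today each folder has to be registered explicitly:
> {code}ALTER TABLE my_data ADD IF NOT EXISTS
>   PARTITION (dt='2014-03-02', hour='01')
>   LOCATION '/flume/my_data/2014/03/02/01';
> {code}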



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)