Posted to issues@hive.apache.org by "Dharmendra Pratap Singh (JIRA)" <ji...@apache.org> on 2016/01/12 10:34:39 UTC

[jira] [Updated] (HIVE-6589) Automatically add partitions for external tables

     [ https://issues.apache.org/jira/browse/HIVE-6589?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dharmendra Pratap Singh updated HIVE-6589:
------------------------------------------
    Affects Version/s:     (was: 0.10.0)
                       0.14.0

> Automatically add partitions for external tables
> ------------------------------------------------
>
>                 Key: HIVE-6589
>                 URL: https://issues.apache.org/jira/browse/HIVE-6589
>             Project: Hive
>          Issue Type: New Feature
>    Affects Versions: 0.14.0
>            Reporter: Ken Dallmeyer
>            Assignee: Dharmendra Pratap Singh
>
> I have a data stream being loaded into Hadoop via Flume. It loads into a date partition folder in HDFS.  The path looks like this:
> {code}/flume/my_data/YYYY/MM/DD/HH
> /flume/my_data/2014/03/02/01
> /flume/my_data/2014/03/02/02
> /flume/my_data/2014/03/02/03{code}
> On top of it I create an EXTERNAL Hive table for querying.  As of now, I have to add partitions manually.  What I want is for Hive to "discover" those partitions automatically for EXTERNAL tables.  Additionally, I would like to specify a partition pattern so that when I query, Hive will know to use that pattern to find the matching HDFS folder.
> So something like this:
> {code}CREATE EXTERNAL TABLE my_data (
>   col1 STRING,
>   col2 INT
> )
> PARTITIONED BY (
>   dt STRING,
>   hour STRING
> )
> LOCATION 
>   '/flume/mydata'
> TBLPROPERTIES (
>   'hive.partition.spec' = 'dt=$Y-$M-$D, hour=$H',
>   'hive.partition.spec.location' = '$Y/$M/$D/$H'
> );
> {code}
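> Until such a feature exists, one workaround (a sketch only; the table, partition columns, and paths match the example above) is to add each partition explicitly, pointing it at the corresponding Flume directory:
> {code}-- Hypothetical invocation matching the layout above; run once per new hour
> ALTER TABLE my_data ADD IF NOT EXISTS
>   PARTITION (dt='2014-03-02', hour='01')
>   LOCATION '/flume/my_data/2014/03/02/01';{code}
> Alternatively, if the directories followed Hive's default {code}dt=2014-03-02/hour=01{code} naming convention, then {code}MSCK REPAIR TABLE my_data;{code} would discover new partitions in one command; it does not help with a custom path pattern like the one above, which is exactly why a configurable partition spec is being requested.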



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)