Posted to user@hive.apache.org by Xun TANG <ta...@gmail.com> on 2013/04/23 06:48:03 UTC

Table present in HDFS but 'show table' Returns Empty

Hi guys,

I've created Hive tables via 'hive -f' or the Hive interactive shell. Data is
loaded into the tables right afterwards, so the tables are not empty. However,
when I log out and log back in to the Hive shell, 'show tables' returns no
tables, while 'hadoop fs -ls' shows the HDFS files are still where they were.

So my questions are:
1. What triggered Hive to 'forget' the tables?
2. How can I make Hive remember the tables permanently (or at least for the
lifetime of the Hadoop daemons) until they are dropped?

This problem has puzzled me for a while, and I could not find a similar
question/answer online. Did I miss some configuration?


Thanks,
Alice

Re: Table present in HDFS but 'show table' Returns Empty

Posted by Xun TANG <ta...@gmail.com>.
That's exactly why! Thank you so much.

Alice


On Mon, Apr 22, 2013 at 11:12 PM, Ramki Palle <ra...@gmail.com> wrote:

> Maybe you are using Derby as your metastore. Derby creates the metastore in
> the current directory from which you started your Hive session; you may
> have started your Hive session from a different directory the next time.
>
> Please either use MySQL as your metastore, or set a fixed metastore
> directory in your config file if you continue to use Derby.
>
> Regards,
> Ramki.

Re: Table present in HDFS but 'show table' Returns Empty

Posted by Ramki Palle <ra...@gmail.com>.
Maybe you are using Derby as your metastore. Derby creates the metastore in
the current directory from which you started your Hive session; you may
have started your Hive session from a different directory the next time.

Please either use MySQL as your metastore, or set a fixed metastore
directory in your config file if you continue to use Derby.

Regards,
Ramki.
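
The fix Ramki describes can be sketched in hive-site.xml. The property names
below are Hive's standard metastore connection settings; the path, host, and
credentials are placeholders to replace with your own:

```xml
<!-- hive-site.xml: pin the Derby metastore to an absolute path so every
     session finds the same database regardless of the working directory.
     /home/alice/hive is a placeholder path. -->
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:derby:;databaseName=/home/alice/hive/metastore_db;create=true</value>
</property>

<!-- Alternatively, switch to MySQL (placeholder host/user/password): -->
<!--
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://localhost:3306/hivemeta?createDatabaseIfNotExist=true</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hiveuser</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>hivepassword</value>
</property>
-->
```

With the Derby URL pinned to an absolute path like this, 'show tables' returns
the same list no matter which directory the Hive shell is started from.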
