Posted to user@spark.apache.org by Arun Luthra <ar...@gmail.com> on 2015/07/07 20:07:56 UTC
How to change hive database?
https://spark.apache.org/docs/latest/api/scala/index.html#org.apache.spark.sql.hive.HiveContext
I'm getting org.apache.spark.sql.catalyst.analysis.NoSuchTableException
from:
val dataframe = hiveContext.table("other_db.mytable")
Do I have to change the current database to access it? Is that possible? I'm
guessing the "database.table" syntax I used in hiveContext.table isn't
recognized.
I have no problems accessing tables in the database called "default".
I can list tables in "other_db" with hiveContext.tableNames("other_db")
Using Spark 1.4.0.
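For reference, a minimal sketch of two workarounds that apply on Spark 1.4,
where hiveContext.table() does not accept a "database.table" qualified name:
switch the session's current database with a USE statement and then pass the
bare table name, or keep the current database and qualify the name inside a
SQL query, where db.table syntax is accepted. This assumes a Hive metastore
containing other_db.mytable (names taken from the message above), so it only
runs inside a Spark deployment, e.g. spark-shell:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

// Assumes a Hive metastore with a table `mytable` in database `other_db`
// (names from the message above); in spark-shell, reuse the provided `sc`.
val sc = new SparkContext(new SparkConf().setAppName("example").setMaster("local[*]"))
val hiveContext = new HiveContext(sc)

// Option 1: switch the session's current database, then use the bare name.
hiveContext.sql("USE other_db")
val df1 = hiveContext.table("mytable")

// Option 2: stay in the current database and qualify the table inside a
// SQL query, where the db.table syntax is understood.
val df2 = hiveContext.sql("SELECT * FROM other_db.mytable")
```

Note that hiveContext.tableNames("other_db") works in either case, which is
consistent with the behavior described above: the database exists and is
listable, only the qualified-name lookup through table() fails.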
Re: How to change hive database?
Posted by Arun Luthra <ar...@gmail.com>.
Thanks, it works.
On Tue, Jul 7, 2015 at 11:15 AM, Ted Yu <yu...@gmail.com> wrote:
> See this thread http://search-hadoop.com/m/q3RTt0NFls1XATV02
>
> Cheers
>
Re: How to change hive database?
Posted by Ted Yu <yu...@gmail.com>.
See this thread http://search-hadoop.com/m/q3RTt0NFls1XATV02
Cheers