Posted to user@spark.apache.org by Stephen Boesch <ja...@gmail.com> on 2014/06/12 02:37:07 UTC

Hive classes for Catalyst

Hi,
  The Catalyst documentation describes using HiveContext; however, the
Scala classes do not appear to exist in master or the 1.0.0 branch.  What is
the replacement/equivalent in master?

Package does not exist:
org.apache.spark.sql.hive

Here is code from the SQL on Spark meetup slides that references that
package and its classes:

val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
import hiveContext._

hql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")

Re: Hive classes for Catalyst

Posted by Stephen Boesch <ja...@gmail.com>.
Thanks for the (super) quick replies.  My bad - I was looking under
spark/sql/*catalyst* instead of spark/sql/hive


2014-06-11 17:40 GMT-07:00 Mark Hamstra <ma...@clearstorydata.com>:

> And the code is right here:
> https://github.com/apache/spark/blob/master/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveContext.scala
>
>
> On Wed, Jun 11, 2014 at 5:38 PM, Michael Armbrust <mi...@databricks.com>
> wrote:
>
>> You will need to compile spark with SPARK_HIVE=true.

Re: Hive classes for Catalyst

Posted by Mark Hamstra <ma...@clearstorydata.com>.
And the code is right here:
https://github.com/apache/spark/blob/master/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveContext.scala


On Wed, Jun 11, 2014 at 5:38 PM, Michael Armbrust <mi...@databricks.com>
wrote:

> You will need to compile spark with SPARK_HIVE=true.

Re: Hive classes for Catalyst

Posted by Michael Armbrust <mi...@databricks.com>.
You will need to compile Spark with SPARK_HIVE=true.
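For context, in Spark 1.0.x Hive support was a build-time option rather than a separate dependency. A sketch of the build invocation, assuming the sbt launcher script shipped in the Spark source tree (check the build instructions for your checkout before relying on it):

```shell
# Hedged sketch: build Spark 1.0.x with Hive support, run from the Spark source root.
# Setting SPARK_HIVE=true compiles the sql/hive module into the assembly jar,
# which is what puts org.apache.spark.sql.hive.HiveContext on the classpath.
SPARK_HIVE=true sbt/sbt assembly
```

Without that flag, the assembly is built without the sql/hive module, which is why the org.apache.spark.sql.hive package appears to be missing at runtime even though the source exists under sql/hive in the repository.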

