Posted to user@spark.apache.org by sandeep vura <sa...@gmail.com> on 2015/03/13 16:30:21 UTC

Errors in SPARK

Hi Sparkers,

Can anyone please check the error below and suggest a solution? I am
using Hive 0.13 and Spark 1.2.1.

Step 1: Installed Hive 0.13 with a local metastore (MySQL database)
Step 2: Hive runs without any errors; I am able to create tables and
load data into Hive tables
Step 3: Copied hive-site.xml into the spark/conf directory
Step 4: Copied core-site.xml into the spark/conf directory
Step 5: Started the Spark shell
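For reference, the metastore connection that Spark picks up from the copied
hive-site.xml would normally carry JDBC properties along these lines (host,
database name, and credentials below are illustrative placeholders, not
values from this thread):

```xml
<!-- Illustrative hive-site.xml fragment for a MySQL-backed local metastore.
     Host, database name, user, and password are placeholders. -->
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://localhost:3306/metastore?createDatabaseIfNotExist=true</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hiveuser</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>hivepass</value>
</property>
```

If Spark cannot reach the database named in ConnectionURL, or cannot load the
driver class named in ConnectionDriverName, instantiating the metastore
client fails.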

Please see the error below.

scala> val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
sqlContext: org.apache.spark.sql.hive.HiveContext = org.apache.spark.sql.hive.HiveContext@2821ec0c

scala> sqlContext.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:346)
        at org.apache.spark.sql.hive.HiveContext$$anonfun$4.apply(HiveContext.scala:235)
        at org.apache.spark.sql.hive.HiveContext$$anonfun$4.apply(HiveContext.scala:231)
        at scala.Option.orElse(Option.scala:257)
        at org.apache.spark.sql.hive.HiveContext.x$3$lzycompute(HiveContext.scala:231)
        at org.apache.spark.sql.hive.HiveContext.x$3(HiveContext.scala:229)
        at org.apache.spark.sql.hive.HiveContext.hiveconf$lzycompute(HiveContext.scala:229)
        at org.apache.spark.sql.hive.HiveContext.hiveconf(HiveContext.scala:229)
        at org.apache.spark.sql.hive.HiveMetastoreCatalog.<init>(HiveMetastoreCatalog.scala:55)
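A note on reading the trace: the "Unable to instantiate" RuntimeException is
doubly wrapped, and the root cause (often a missing JDBC driver class or an
unreachable database) is further down than what the shell shows here. Walking
the cause chain exposes it. This is a generic sketch, not code from the
thread, and the ClassNotFoundException below is only a hypothetical example
of a root cause:

```scala
// Follow getCause until it stops changing; the last throwable is the root.
def rootCause(t: Throwable): Throwable =
  if (t.getCause == null || t.getCause == t) t else rootCause(t.getCause)

// Simulated shape of the error above: RuntimeException wrapping a
// RuntimeException wrapping the real failure.
val inner   = new ClassNotFoundException("com.mysql.jdbc.Driver")
val wrapped = new RuntimeException(new RuntimeException(inner))
println(rootCause(wrapped))  // the ClassNotFoundException at the bottom
```

In a live spark-shell session, calling rootCause on the caught exception (or
checking the full log output) shows which metastore step actually failed.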

Regards,
Sandeep.v

Re: Errors in SPARK

Posted by Denny Lee <de...@gmail.com>.
The error you're seeing typically means that you cannot connect to the Hive
metastore itself.  Some quick thoughts:
- If you run "show tables" (instead of the CREATE TABLE statement), do you
still get the same error?

- To confirm, is the Hive metastore (MySQL database) up and running?

- Did you download or build your version of Spark?
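The first check can be tried as a minimal spark-shell probe (a sketch,
assuming the same sqlContext as in the original post): SHOW TABLES only reads
from the metastore, so it separates a connection problem from anything
specific to the CREATE TABLE statement.

```scala
// Read-only metastore probe. If this fails with the same
// "Unable to instantiate ... HiveMetaStoreClient" error, the metastore
// connection itself is broken, not the CREATE TABLE statement.
val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
sqlContext.sql("SHOW TABLES").collect().foreach(println)
```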





Re: Errors in SPARK

Posted by sandeep vura <sa...@gmail.com>.
Hi Denny,

Still facing the same issue. Please find the errors below.

scala> val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
sqlContext: org.apache.spark.sql.hive.HiveContext = org.apache.spark.sql.hive.HiveContext@4e4f880c

scala> sqlContext.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

Cheers,
Sandeep.v


Re: Errors in SPARK

Posted by sandeep vura <sa...@gmail.com>.
No, I am just running the ./spark-shell command in the terminal. I will try
the above command.


Re: Errors in SPARK

Posted by Denny Lee <de...@gmail.com>.
Did you include the MySQL connector jar on the classpath so that spark-shell
/ Hive can connect to the metastore?

For example, when I run my spark-shell instance in standalone mode, I use:
./spark-shell --master spark://servername:7077 --driver-class-path
/lib/mysql-connector-java-5.1.27.jar
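Once the shell is started with the connector on the driver classpath, a quick
way to confirm the jar is actually visible is to load the driver class by
name (the jar path in the comment is illustrative; use your local copy):

```scala
// Run inside a spark-shell started with the MySQL connector on the driver
// classpath, e.g.:
//   ./spark-shell --driver-class-path /path/to/mysql-connector-java-5.1.27.jar
//
// Class.forName succeeds silently when the jar is on the classpath.
// If it throws ClassNotFoundException, the connector is missing, and the
// Hive metastore client cannot be instantiated for the same reason.
Class.forName("com.mysql.jdbc.Driver")
```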


