Posted to user@spark.apache.org by tridib <tr...@live.com> on 2014/11/06 08:53:39 UTC

Unable to use HiveContext in spark-shell

I am connecting to a remote master using the Spark shell, and I get the
following error while trying to instantiate a HiveContext.

scala> val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)

error: bad symbolic reference. A signature in HiveContext.class refers to term hive
in package org.apache.hadoop which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling HiveContext.class.
error:
     while compiling: <console>
        during phase: erasure
     library version: version 2.10.4
    compiler version: version 2.10.4
  reconstructed args:

  last tree to typer: Apply(value $outer)
              symbol: value $outer (flags: <method> <synthetic> <stable> <expandedname> <triedcooking>)
   symbol definition: val $outer(): $iwC.$iwC.type
                 tpe: $iwC.$iwC.type
       symbol owners: value $outer -> class $iwC -> class $iwC -> class $iwC -> class $read -> package $line5
      context owners: class $iwC -> class $iwC -> class $iwC -> class $iwC -> class $read -> package $line5

== Enclosing template or block ==

ClassDef( // class $iwC extends Serializable
  0
  "$iwC"
  []
  Template( // val <local $iwC>: <notype>, tree.tpe=$iwC
    "java.lang.Object", "scala.Serializable" // parents
    ValDef(
      private
      "_"
      <tpt>
      <empty>
    )
    // 5 statements
    DefDef( // def <init>(arg$outer: $iwC.$iwC.$iwC.type): $iwC
      <method> <triedcooking>
      "<init>"
      []
      // 1 parameter list
      ValDef( // $outer: $iwC.$iwC.$iwC.type
        "$outer"
        <tpt> // tree.tpe=$iwC.$iwC.$iwC.type
        <empty>
      )
      <tpt> // tree.tpe=$iwC
      Block( // tree.tpe=Unit
        Apply( // def <init>(): Object in class Object, tree.tpe=Object
          $iwC.super."<init>" // def <init>(): Object in class Object, tree.tpe=()Object
          Nil
        )
        ()
      )
    )
    ValDef( // private[this] val sqlContext: org.apache.spark.sql.hive.HiveContext
      private <local> <triedcooking>
      "sqlContext "
      <tpt> // tree.tpe=org.apache.spark.sql.hive.HiveContext
      Apply( // def <init>(sc: org.apache.spark.SparkContext): org.apache.spark.sql.hive.HiveContext in class HiveContext, tree.tpe=org.apache.spark.sql.hive.HiveContext
        new org.apache.spark.sql.hive.HiveContext."<init>" // def <init>(sc: org.apache.spark.SparkContext): org.apache.spark.sql.hive.HiveContext in class HiveContext, tree.tpe=(sc: org.apache.spark.SparkContext)org.apache.spark.sql.hive.HiveContext
        Apply( // val sc(): org.apache.spark.SparkContext, tree.tpe=org.apache.spark.SparkContext
          $iwC.this.$line5$$read$$iwC$$iwC$$iwC$$iwC$$$outer().$line5$$read$$iwC$$iwC$$iwC$$$outer().$line5$$read$$iwC$$iwC$$$outer().$VAL1().$iw().$iw()."sc" // val sc(): org.apache.spark.SparkContext, tree.tpe=()org.apache.spark.SparkContext
          Nil
        )
      )
    )
    DefDef( // val sqlContext(): org.apache.spark.sql.hive.HiveContext
      <method> <stable> <accessor>
      "sqlContext"
      []
      List(Nil)
      <tpt> // tree.tpe=org.apache.spark.sql.hive.HiveContext
      $iwC.this."sqlContext " // private[this] val sqlContext: org.apache.spark.sql.hive.HiveContext, tree.tpe=org.apache.spark.sql.hive.HiveContext
    )
    ValDef( // protected val $outer: $iwC.$iwC.$iwC.type
      protected <synthetic> <paramaccessor> <triedcooking>
      "$outer "
      <tpt> // tree.tpe=$iwC.$iwC.$iwC.type
      <empty>
    )
    DefDef( // val $outer(): $iwC.$iwC.$iwC.type
      <method> <synthetic> <stable> <expandedname> <triedcooking>
      "$line5$$read$$iwC$$iwC$$iwC$$iwC$$$outer"
      []
      List(Nil)
      <tpt> // tree.tpe=Any
      $iwC.this."$outer " // protected val $outer: $iwC.$iwC.$iwC.type, tree.tpe=$iwC.$iwC.$iwC.type
    )
  )
)

== Expanded type of tree ==

ThisType(class $iwC)

uncaught exception during compilation: scala.reflect.internal.Types$TypeError
scala.reflect.internal.Types$TypeError: bad symbolic reference. A signature in HiveContext.class refers to term conf
in value org.apache.hadoop.hive which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling HiveContext.class.
That entry seems to have slain the compiler.  Shall I replay
your session? I can re-run each line except the last one.
[y/n]


Thanks
Tridib





Re: Unable to use HiveContext in spark-shell

Posted by tridib <tr...@live.com>.
I built spark-1.1.0 on a fresh machine, and the issue is gone. Thank you all
for your help.
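
In case it helps anyone else, here is a quick sanity check from the shell that
the Hive classes are actually on the classpath (a minimal sketch;
org.apache.hadoop.hive.conf.HiveConf is the class the "term conf" error above
points at):

scala> Class.forName("org.apache.hadoop.hive.conf.HiveConf")
// succeeds on an assembly built with -Phive; throws
// ClassNotFoundException if the Hive classes are missing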

Thanks & Regards
Tridib





Re: Unable to use HiveContext in spark-shell

Posted by tridib <tr...@live.com>.
Yes, I have the org.apache.hadoop.hive package in the spark assembly.





Re: Unable to use HiveContext in spark-shell

Posted by Terry Siu <Te...@smartfocus.com>.
Those are the same options I used, except that I added --tgz to package it, and I built off of the master branch. Unfortunately, my only guess is that these errors stem from your build environment. In your Spark assembly, do you have any classes that belong to the org.apache.hadoop.hive package?
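
For example, one quick way to look inside the assembly from a Scala REPL (a rough sketch; the jar path below is hypothetical, so adjust it to your layout):

scala> import scala.collection.JavaConverters._
scala> val jar = new java.util.jar.JarFile("lib/spark-assembly-1.1.0-hadoop2.4.0.jar") // hypothetical path
scala> jar.entries.asScala.exists(_.getName.startsWith("org/apache/hadoop/hive/"))
// should evaluate to true if the Hive classes were packaged into the assembly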


From: Tridib Samanta <tr...@live.com>
Date: Thursday, November 6, 2014 at 9:49 AM
To: Terry Siu <te...@smartfocus.com>, "user@spark.incubator.apache.org" <us...@spark.incubator.apache.org>
Subject: RE: Unable to use HiveContext in spark-shell

I am using spark 1.1.0.
I built it using:
./make-distribution.sh -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -DskipTests

My ultimate goal is to execute a query on a Parquet file with a nested structure and cast a date string to Date; this is required to calculate the age of a Person entity.
But I cannot even get past this line:
val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
I made sure that the org.apache.hadoop package is in the spark assembly jar.
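
(For context, once HiveContext works, the query I have in mind would look roughly like the sketch below; the file and column names are made up, birth_date is assumed to be a 'yyyy-MM-dd' string, and the age is computed with stock Hive UDFs:)

scala> val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
scala> sqlContext.parquetFile("people.parquet").registerTempTable("people") // hypothetical file
scala> sqlContext.sql("SELECT name, datediff(to_date(from_unixtime(unix_timestamp())), birth_date) / 365.0 AS age FROM people").collect().foreach(println)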

Re-attaching the stack trace for quick reference.

scala> val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)

[stack trace omitted; identical to the one in the original post above]


Thanks
Tridib

> From: Terry.Siu@smartfocus.com
> To: tridib.samanta@live.com; user@spark.incubator.apache.org
> Subject: Re: Unable to use HiveContext in spark-shell
> Date: Thu, 6 Nov 2014 17:38:51 +0000
>
> What version of Spark are you using? Did you compile your Spark version
> and if so, what compile options did you use?
>
> On 11/6/14, 9:22 AM, "tridib" <tr...@live.com> wrote:
>
> >Help please!
> >




Re: Unable to use HiveContext in spark-shell

Posted by Terry Siu <Te...@smartfocus.com>.
What version of Spark are you using? Did you compile your Spark version
and if so, what compile options did you use?

On 11/6/14, 9:22 AM, "tridib" <tr...@live.com> wrote:

>Help please!
>




Re: Unable to use HiveContext in spark-shell

Posted by Jimmy McErlain <ji...@sellpoints.com>.
Can you be more specific? What versions of Spark, Hive, Hadoop, etc. are you on?
What are you trying to do, and what issues are you seeing?
J




JIMMY MCERLAIN

DATA SCIENTIST (NERD)

. . . . . . . . . . . . . . . . . .

IF WE CAN’T DOUBLE YOUR SALES,

ONE OF US IS IN THE WRONG BUSINESS.

E: jimmy@sellpoints.com

M: 510.303.7751

On Thu, Nov 6, 2014 at 9:22 AM, tridib <tr...@live.com> wrote:

> Help please!

Re: Unable to use HiveContext in spark-shell

Posted by tridib <tr...@live.com>.
Help please!


