Posted to user@spark.apache.org by Benjamin Kim <bb...@gmail.com> on 2017/02/02 18:28:52 UTC

Re: HBase Spark

Elek,

Can you give me some sample code? I can’t get mine to work.

import org.apache.spark.sql.{SQLContext, _}
import org.apache.spark.sql.execution.datasources.hbase._
import org.apache.spark.{SparkConf, SparkContext}

// Catalog mapping the HBase table ben:dmp_test into DataFrame columns:
// col0 is the row key; col1 reads google_gid from column family "d".
def cat = s"""{
    |"table":{"namespace":"ben", "name":"dmp_test", "tableCoder":"PrimitiveType"},
    |"rowkey":"key",
    |"columns":{
        |"col0":{"cf":"rowkey", "col":"key", "type":"string"},
        |"col1":{"cf":"d", "col":"google_gid", "type":"string"}
    |}
|}""".stripMargin

import sqlContext.implicits._

// Build a DataFrame over HBase via the shc data source, using the catalog above.
def withCatalog(cat: String): DataFrame = {
    sqlContext
        .read
        .options(Map(HBaseTableCatalog.tableCatalog -> cat))
        .format("org.apache.spark.sql.execution.datasources.hbase")
        .load()
}

val df = withCatalog(cat)
df.show

It gives me this error.

java.lang.NoSuchMethodError: scala.runtime.ObjectRef.create(Ljava/lang/Object;)Lscala/runtime/ObjectRef;
	at org.apache.spark.sql.execution.datasources.hbase.HBaseTableCatalog$.apply(HBaseTableCatalog.scala:232)
	at org.apache.spark.sql.execution.datasources.hbase.HBaseRelation.<init>(HBaseRelation.scala:77)
	at org.apache.spark.sql.execution.datasources.hbase.DefaultSource.createRelation(HBaseRelation.scala:51)
	at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:158)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:119)

If you can help, I would be grateful.

Cheers,
Ben


> On Jan 31, 2017, at 1:02 PM, Marton, Elek <hd...@anzix.net> wrote:
> 
> 
> I tested this one with hbase 1.2.4:
> 
> https://github.com/hortonworks-spark/shc
> 
> Marton
> 
> On 01/31/2017 09:17 PM, Benjamin Kim wrote:
>> Does anyone know how to backport the HBase Spark module to HBase 1.2.0? I tried to build it from source, but I cannot get it to work.
>> 
>> Thanks,
>> Ben


Re: HBase Spark

Posted by Asher Krim <ak...@hubspot.com>.
You can see in the tree what's pulling in 2.11. Your options then are to
shade those dependencies and add an explicit dependency on Scala 2.10.5 in
your pom, or to upgrade your project to 2.11 (which will require using a
2.11 build of Spark).
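
As a minimal sketch, pinning the runtime explicitly might look like this
(these are the standard scala-library coordinates; adapt to your pom):

<!-- Illustrative only: an explicit dependency so Maven's nearest-wins
     conflict resolution picks the 2.10 runtime over a transitive 2.11 one -->
<dependency>
  <groupId>org.scala-lang</groupId>
  <artifactId>scala-library</artifactId>
  <version>2.10.5</version>
</dependency>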


On Fri, Feb 3, 2017 at 2:03 PM, Benjamin Kim <bb...@gmail.com> wrote:

> Asher,
>
> You’re right. I don’t see anything but 2.11 being pulled in. Do you know
> where I can change this?
>
> Cheers,
> Ben

Re: HBase Spark

Posted by Benjamin Kim <bb...@gmail.com>.
Asher,

You’re right. I don’t see anything but 2.11 being pulled in. Do you know where I can change this?

Cheers,
Ben


> On Feb 3, 2017, at 10:50 AM, Asher Krim <ak...@hubspot.com> wrote:
> 
> Sorry for my persistence, but did you actually run "mvn dependency:tree -Dverbose=true"? And did you see only scala 2.10.5 being pulled in?


Re: HBase Spark

Posted by Asher Krim <ak...@hubspot.com>.
Sorry for my persistence, but did you actually run "mvn dependency:tree
-Dverbose=true"? And did you see only scala 2.10.5 being pulled in?
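
A filtered run along these lines makes the Scala artifacts easy to spot
(-Dincludes is a standard option of the dependency:tree goal; a
groupId-only pattern like the one below is just an example):

# Show every path that pulls in an org.scala-lang artifact, including
# those normally hidden by conflict resolution
mvn dependency:tree -Dverbose=true -Dincludes=org.scala-lang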

On Fri, Feb 3, 2017 at 12:33 PM, Benjamin Kim <bb...@gmail.com> wrote:

> Asher,
>
> It’s still the same. Do you have any other ideas?
>
> Cheers,
> Ben

Re: HBase Spark

Posted by Benjamin Kim <bb...@gmail.com>.
Asher,

It’s still the same. Do you have any other ideas?

Cheers,
Ben


> On Feb 3, 2017, at 8:16 AM, Asher Krim <ak...@hubspot.com> wrote:
> 
> Did you check the actual maven dep tree? Something might be pulling in a different version. Also, if you're seeing this locally, you might want to check which version of the scala sdk your IDE is using
> 
> Asher Krim
> Senior Software Engineer


Re: HBase Spark

Posted by Benjamin Kim <bb...@gmail.com>.
I'll clean up any .m2 and .ivy directories and try again.

I ran this on our lab cluster for testing.
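
For the record, these are the cache paths I mean (the standard Maven and
Ivy locations; adjust if your build redirects them elsewhere):

# Drop cached Scala artifacts so the next build re-resolves them fresh
rm -rf ~/.m2/repository/org/scala-lang
rm -rf ~/.ivy2/cache/org.scala-lang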

Cheers,
Ben


On Fri, Feb 3, 2017 at 8:16 AM Asher Krim <ak...@hubspot.com> wrote:

> Did you check the actual maven dep tree? Something might be pulling in a
> different version. Also, if you're seeing this locally, you might want to
> check which version of the scala sdk your IDE is using
>
> Asher Krim
> Senior Software Engineer

Re: HBase Spark

Posted by Benjamin Kim <bb...@gmail.com>.
Asher,

I found a profile for Scala 2.11 in the pom and removed it; now it brings in 2.10. I ran some code and got further, but now I get the error below when I do a “df.show”.

java.lang.AbstractMethodError
	at org.apache.spark.Logging$class.log(Logging.scala:50)
	at org.apache.spark.sql.execution.datasources.hbase.HBaseFilter$.log(HBaseFilter.scala:122)
	at org.apache.spark.sql.execution.datasources.hbase.HBaseFilter$.buildFilters(HBaseFilter.scala:125)
	at org.apache.spark.sql.execution.datasources.hbase.HBaseTableScanRDD.getPartitions(HBaseTableScan.scala:59)
	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
	at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
	at scala.Option.getOrElse(Option.scala:120)
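
In case it helps with diagnosis: my understanding is that an
AbstractMethodError coming out of org.apache.spark.Logging usually means
the connector jar was compiled against a different Spark release than the
one it runs on. A minimal check of the versions the shell actually sees
(assuming an active spark-shell session with a SparkContext named sc):

// Print the runtime versions in play
println(sc.version)                          // Spark version of the running shell
println(scala.util.Properties.versionString) // Scala version of the REPL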

Thanks for all your help.

Cheers,
Ben


> On Feb 3, 2017, at 8:16 AM, Asher Krim <ak...@hubspot.com> wrote:
> 
> Did you check the actual maven dep tree? Something might be pulling in a different version. Also, if you're seeing this locally, you might want to check which version of the scala sdk your IDE is using
> 
> Asher Krim
> Senior Software Engineer


Re: HBase Spark

Posted by Asher Krim <ak...@hubspot.com>.
Did you check the actual maven dep tree? Something might be pulling in a
different version. Also, if you're seeing this locally, you might want to
check which version of the Scala SDK your IDE is using.

Asher Krim
Senior Software Engineer

On Thu, Feb 2, 2017 at 5:43 PM, Benjamin Kim <bb...@gmail.com> wrote:

> Hi Asher,
>
> I modified the pom to be the same Spark (1.6.0), HBase (1.2.0), and Java
> (1.8) version as our installation. The Scala (2.10.5) version is already
> the same as ours. But I’m still getting the same error. Can you think of
> anything else?
>
> Cheers,
> Ben

Re: HBase Spark

Posted by Benjamin Kim <bb...@gmail.com>.
Hi Asher,

I modified the pom to use the same Spark (1.6.0), HBase (1.2.0), and Java (1.8) versions as our installation. The Scala version (2.10.5) already matches ours. But I’m still getting the same error. Can you think of anything else?
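
Concretely, the version pins as they would look in the pom (the property
names here are illustrative, not necessarily the ones shc actually uses):

<!-- Illustrative property pins matching our cluster -->
<properties>
  <spark.version>1.6.0</spark.version>
  <hbase.version>1.2.0</hbase.version>
  <java.version>1.8</java.version>
  <scala.version>2.10.5</scala.version>
</properties>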

Cheers,
Ben


> On Feb 2, 2017, at 11:06 AM, Asher Krim <ak...@hubspot.com> wrote:
> 
> Ben,
> 
> That looks like a scala version mismatch. Have you checked your dep tree?
> 
> Asher Krim
> Senior Software Engineer


Re: HBase Spark

Posted by Asher Krim <ak...@hubspot.com>.
Ben,

That looks like a scala version mismatch. Have you checked your dep tree?
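
(For what it's worth, scala.runtime.ObjectRef.create only exists as of
Scala 2.11, so this exact NoSuchMethodError is what a jar built for 2.11
throws on a 2.10 runtime.) A quick sketch of how to confirm what the
shell itself is running:

// In spark-shell, print the Scala version of the running REPL
scala.util.Properties.versionString   // e.g. "version 2.10.5"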

Asher Krim
Senior Software Engineer

On Thu, Feb 2, 2017 at 1:28 PM, Benjamin Kim <bb...@gmail.com> wrote:

> Elek,
>
> Can you give me some sample code? I can’t get mine to work.