Posted to user@spark.apache.org by Nan Zhu <zh...@gmail.com> on 2014/05/18 22:18:49 UTC

IllegalAccessError when writing to HBase?

Hi, all 

I tried to write data to HBase from a Spark 1.0 RC8 application.

The application is terminated by a java.lang.IllegalAccessError; the HBase shell works fine, and the same application runs correctly against a standalone HBase deployment. The stack trace:

java.lang.IllegalAccessError: com/google/protobuf/HBaseZeroCopyByteString 
at org.apache.hadoop.hbase.protobuf.RequestConverter.buildRegionSpecifier(RequestConverter.java:930)
at org.apache.hadoop.hbase.protobuf.RequestConverter.buildGetRowOrBeforeRequest(RequestConverter.java:133)
at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRowOrBefore(ProtobufUtil.java:1466)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegionInMeta(HConnectionManager.java:1236)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:1110)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.locateRegion(HConnectionManager.java:1067)
at org.apache.hadoop.hbase.client.AsyncProcess.findDestLocation(AsyncProcess.java:356)
at org.apache.hadoop.hbase.client.AsyncProcess.submit(AsyncProcess.java:301)
at org.apache.hadoop.hbase.client.HTable.backgroundFlushCommits(HTable.java:955)
at org.apache.hadoop.hbase.client.HTable.flushCommits(HTable.java:1239)
at org.apache.hadoop.hbase.client.HTable.close(HTable.java:1276)
at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.close(TableOutputFormat.java:112)
at org.apache.spark.rdd.PairRDDFunctions.org$apache$spark$rdd$PairRDDFunctions$$writeShard$1(PairRDDFunctions.scala:720)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.apply(PairRDDFunctions.scala:730)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsNewAPIHadoopDataset$1.apply(PairRDDFunctions.scala:730)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:111)
at org.apache.spark.scheduler.Task.run(Task.scala:51)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:187)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:744)


Can anyone give a hint about this issue?

Best, 

-- 
Nan Zhu


Re: IllegalAccessError when writing to HBase?

Posted by Nan Zhu <zh...@gmail.com>.
I tried HBase 0.96.2, 0.98.1, and 0.98.2.

The HDFS version is 2.3.

-- 
Nan Zhu

On Sunday, May 18, 2014 at 4:18 PM, Nan Zhu wrote: 
> Hi, all 
> 
> I tried to write data to HBase from a Spark 1.0 RC8 application.
> 
> The application is terminated by a java.lang.IllegalAccessError; the HBase shell works fine, and the same application runs correctly against a standalone HBase deployment. The stack trace:
> 
> java.lang.IllegalAccessError: com/google/protobuf/HBaseZeroCopyByteString 
> [...]
> 
> 
> Can anyone give a hint about this issue?
> 
> Best, 
> 
> -- 
> Nan Zhu
>