Posted to user@hbase.apache.org by Kyle Lin <ky...@gmail.com> on 2013/12/06 10:56:38 UTC

Pig cannot load data using hbasestorage

Hey there

    First, my environment: Hortonworks HDP2 (HBase 0.95.2.2.0.5.0-64, Pig
0.11.1).

    I use Pig to load data from HBase, and I get this exception:
java.lang.ClassNotFoundException:
org.apache.hadoop.hbase.filter.WritableByteArrayComparable.

    My script is as follows:
samples = LOAD 'hbase://test' using
  org.apache.pig.backend.hadoop.hbase.HBaseStorage('cf:name cf:phone
cf:city cf:address')
  as (name, phone, city, address);
dump samples;

    After googling, I found advice saying you need to set PIG_CLASSPATH first. So I
tried to add the target jar to PIG_CLASSPATH, but I cannot
find org.apache.hadoop.hbase.filter.WritableByteArrayComparable in any
HBase jar.
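For reference, one common way to populate PIG_CLASSPATH is to join every jar in the HBase lib directory before invoking pig. This is only a sketch: the join_jars helper is an illustrative name, and the /usr/lib/hbase/lib path is an assumption based on a typical HDP layout.

```shell
# join_jars DIR: print a colon-separated list of every jar in DIR.
# Illustrative helper, not a standard command.
join_jars() {
  _cp=""
  for _jar in "$1"/*.jar; do
    [ -e "$_jar" ] || continue      # glob matched nothing
    _cp="$_cp${_cp:+:}$_jar"
  done
  printf '%s\n' "$_cp"
}

# Assumed HDP layout; adjust the path to your installation.
PIG_CLASSPATH="$(join_jars /usr/lib/hbase/lib)"
export PIG_CLASSPATH
```

With PIG_CLASSPATH exported, a subsequent "pig -f script.pig" picks the jars up at launch.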


Kyle

Re: Pig cannot load data using hbasestorage

Posted by Kyle Lin <ky...@gmail.com>.
Hey guys

    My script works now. I've changed it as below to register enough jar
files.

REGISTER /usr/lib/hbase/lib/zookeeper.jar;
REGISTER /usr/lib/hbase/lib/*.jar;
REGISTER /usr/lib/hadoop/*.jar;
samples = LOAD 'hbase://test' using
  org.apache.pig.backend.hadoop.hbase.HBaseStorage('cf:name cf:phone
cf:city cf:address')
  as (name:chararray, phone:chararray, city:chararray, address:chararray);
dump samples;
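A quick way to sanity-check scripts like this is to see which files each REGISTER glob actually matches, since a wildcard that matches nothing registers nothing. A sketch; show_glob is an illustrative helper, and the paths are the ones used in the script above:

```shell
# show_glob PATTERN: list the files a REGISTER-style glob would pick up.
# $1 is deliberately left unquoted so the shell expands the wildcard.
show_glob() {
  printf '%s ->\n' "$1"
  ls -1 $1 2>/dev/null || printf '  (no match)\n'
}

show_glob '/usr/lib/hbase/lib/*.jar'
show_glob '/usr/lib/hadoop/*.jar'
```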


Kyle



2013/12/9 Kyle Lin <ky...@gmail.com>

> Hey
>
>     Thanks for your help. But the story does not end there...
>
>     Finally, I used the latest Hortonworks Sandbox 2 as my testing
> environment (Pig 0.12.0, HBase 0.96.0). But I got another problem.
>
>     My pig script is below (I run it with "pig -f xx.pig"):
>
> REGISTER /usr/lib/hbase/lib/zookeeper.jar;
> REGISTER /usr/lib/hbase/lib/hbase-*.jar;
> REGISTER /usr/lib/hadoop/hadoop*.jar;
> samples = LOAD 'hbase://test' using
>   org.apache.pig.backend.hadoop.hbase.HBaseStorage('cf:name cf:phone
> cf:city cf:address')
>   as (name, phone, city, address);
> dump samples;
>
>     The stack trace:
>
> ERROR 1066: Unable to open iterator for alias samples. Backend error :
> java.io.IOException: Cannot create a record reader because of a previous
> error. Please look at the previous logs lines from the task's full log for
> more details.
>
> org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1066: Unable to
> open iterator for alias samples. Backend error : java.io.IOException:
> Cannot create a record reader because of a previous error. Please look at
> the previous logs lines from the task's full log for more details.
>         at org.apache.pig.PigServer.openIterator(PigServer.java:870)
>         at org.apache.pig.tools.grunt.GruntParser.processDump(GruntParser.java:774)
>         at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:372)
>         at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:198)
>         at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:173)
>         at org.apache.pig.tools.grunt.Grunt.exec(Grunt.java:84)
>         at org.apache.pig.Main.run(Main.java:478)
>         at org.apache.pig.Main.main(Main.java:156)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
> Caused by: java.lang.RuntimeException: java.io.IOException: Cannot create
> a record reader because of a previous error. Please look at the previous
> logs lines from the task's full log for more details.
>         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigRecordReader.initNextRecordReader(PigRecordReader.java:266)
>         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigRecordReader.<init>(PigRecordReader.java:123)
>         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.createRecordReader(PigInputFormat.java:123)
>         at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.<init>(MapTask.java:491)
>         at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:734)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:339)
>         at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>         at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)
> Caused by: java.io.IOException: Cannot create a record reader because of a
> previous error. Please look at the previous logs lines from the task's full
> log for more details.
>         at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.createRecordReader(TableInputFormatBase.java:119)
>         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigRecordReader.initNextRecordReader(PigRecordReader.java:256)
>
>
> Kyle
>
>
>
> 2013/12/9 Ted Yu <yu...@gmail.com>
>
>> Please use this link to install:
>> http://hortonworks.com/products/hdp-2/#install
>>
>> Cheers
>>
>>
>> On Mon, Dec 9, 2013 at 10:06 AM, Kyle Lin <ky...@gmail.com> wrote:
>>
>> > Hello Ted
>> >
>> >     Actually I used Hortonworks Sandbox 2.0 Beta. Should I get rid of
>> > this problem by using Ambari to install HDP2?
>> >
>> > Kyle
>> >
>> >
>> > 2013/12/9 Ted Yu <yu...@gmail.com>
>> >
>> > > Kyle:
>> > > According to http://hortonworks.com/products/hdp-2/ , Pig 0.12 should
>> > > be used; it does not have this problem.
>> > >
>> > > Did you happen to use HDP 2 Beta ?
>> > >
>> > > Cheers
>> > >
>> > >
>> > > On Mon, Dec 9, 2013 at 9:49 AM, Kyle Lin <ky...@gmail.com>
>> wrote:
>> > >
>> > > > Hello Ted
>> > > >
>> > > >     Below is the stack trace. Could it be that HBase has removed
>> > > > WritableByteArrayComparable, but Pig has not been updated for this
>> > > > change?
>> > > >
>> > > >
>> > > > Pig Stack Trace
>> > > > ---------------
>> > > > ERROR 2998: Unhandled internal error.
>> > > > org/apache/hadoop/hbase/filter/WritableByteArrayComparable
>> > > >
>> > > > java.lang.NoClassDefFoundError:
>> > > > org/apache/hadoop/hbase/filter/WritableByteArrayComparable
>> > > >         at java.lang.Class.forName0(Native Method)
>> > > >         at java.lang.Class.forName(Class.java:247)
>> > > >         at org.apache.pig.impl.PigContext.resolveClassName(PigContext.java:510)
>> > > >         at org.apache.pig.parser.LogicalPlanBuilder.validateFuncSpec(LogicalPlanBuilder.java:1220)
>> > > >         at org.apache.pig.parser.LogicalPlanBuilder.buildFuncSpec(LogicalPlanBuilder.java:1208)
>> > > >         at org.apache.pig.parser.LogicalPlanGenerator.func_clause(LogicalPlanGenerator.java:4849)
>> > > >         at org.apache.pig.parser.LogicalPlanGenerator.load_clause(LogicalPlanGenerator.java:3206)
>> > > >         at org.apache.pig.parser.LogicalPlanGenerator.op_clause(LogicalPlanGenerator.java:1338)
>> > > >         at org.apache.pig.parser.LogicalPlanGenerator.general_statement(LogicalPlanGenerator.java:822)
>> > > >         at org.apache.pig.parser.LogicalPlanGenerator.statement(LogicalPlanGenerator.java:540)
>> > > >         at org.apache.pig.parser.LogicalPlanGenerator.query(LogicalPlanGenerator.java:415)
>> > > >         at org.apache.pig.parser.QueryParserDriver.parse(QueryParserDriver.java:181)
>> > > >         at org.apache.pig.PigServer$Graph.validateQuery(PigServer.java:1633)
>> > > >         at org.apache.pig.PigServer$Graph.registerQuery(PigServer.java:1606)
>> > > >         at org.apache.pig.PigServer.registerQuery(PigServer.java:565)
>> > > >         at org.apache.pig.tools.grunt.GruntParser.processPig(GruntParser.java:1032)
>> > > >         at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:499)
>> > > >         at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:194)
>> > > >         at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:170)
>> > > >         at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:69)
>> > > >         at org.apache.pig.Main.run(Main.java:543)
>> > > >         at org.apache.pig.Main.main(Main.java:158)
>> > > >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> > > >         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> > > >         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>> > > >         at java.lang.reflect.Method.invoke(Method.java:597)
>> > > >         at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
>> > > > Caused by: java.lang.ClassNotFoundException:
>> > > > org.apache.hadoop.hbase.filter.WritableByteArrayComparable
>> > > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>> > > >         at java.security.AccessController.doPrivileged(Native Method)
>> > > >         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>> > > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>> > > >         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>> > > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>> > > >         ... 27 more
>> > > >
>> > > >
>> > > >
>> > > > Kyle
>> > > >
>> > > >
>> > > > 2013/12/7 Ted Yu <yu...@gmail.com>
>> > > >
>> > > > > bq. HDP2(HBase 0.95.2.2.0.5.0-64
>> > > > >
>> > > > > HDP2 goes with 0.96.0
>> > > > >
>> > > > > bq. java.lang.ClassNotFoundException:
>> org.apache.hadoop.hbase.filter.
>> > > > > WritableByteArrayComparable.
>> > > > >
>> > > > > Can you show us the stack trace ?
>> > > > > WritableByteArrayComparable doesn't exist in 0.96 and later
>> branches.
>> > > > >
>> > > > > Cheers
>> > > > >
>> > > > >
>> > > > > On Sat, Dec 7, 2013 at 4:22 AM, Rohini Palaniswamy
>> > > > > <ro...@gmail.com>wrote:
>> > > > >
>> > > > > > Do a register of your hbase and zookeeper jars in the pig script.
>> > > > > >
>> > > > > > -Rohini

Re: Pig cannot load data using hbasestorage

Posted by Kyle Lin <ky...@gmail.com>.
Hello Ted

    As the Pig log said "Please look at the previous logs lines from the task's
full log for more details", I tried to re-run the scenario to reproduce the same
error. After running the broken script, I looked through all the logs under
/var/log/hadoop-mapreduce, /var/log/hadoop-yarn....., but could not find any
related errors in them.

    No log said I was missing any classes, yet when I include more jars, the
script works fine.
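A quick way to sweep log directories like those for class-loading failures is sketched below; find_class_errors is an illustrative helper name, and the directory paths are the ones mentioned above:

```shell
# find_class_errors DIR...: print files that mention a class-loading failure.
# Illustrative helper; adjust the directories to your cluster's log layout.
find_class_errors() {
  for dir in "$@"; do
    [ -d "$dir" ] || continue
    grep -r -l -e ClassNotFoundException -e NoClassDefFoundError "$dir" 2>/dev/null || true
  done
  return 0
}

find_class_errors /var/log/hadoop-mapreduce /var/log/hadoop-yarn
```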

Kyle


2013/12/9 Ted Yu <yu...@gmail.com>

> bq. Please look at the previous logs lines from the task's full log for
> more details.
>
> Do you still keep the full log?
> If so, can you pastebin the full log with the details?
>
> Cheers
>

Re: Pig cannot load data using hbasestorage

Posted by Ted Yu <yu...@gmail.com>.
bq. Please look at the previous logs lines from the task's full log for
more details.

Do you still keep the full log?
If so, can you pastebin the full log with the details?

Cheers


> > org.apache.hadoop.hbase.filter.
> > > > > > WritableByteArrayComparable.
> > > > > >
> > > > > > Can you show us the stack trace ?
> > > > > > WritableByteArrayComparable doesn't exist in 0.96 and later
> > branches.
> > > > > >
> > > > > > Cheers
> > > > > >
> > > > > >
> > > > > > On Sat, Dec 7, 2013 at 4:22 AM, Rohini Palaniswamy
> > > > > > <ro...@gmail.com>wrote:
> > > > > >
> > > > > > > Do a register of your hbase and zookeeper jars in the pig
> script.
> > > > > > >
> > > > > > > -Rohini
> > > > > > >
> > > > > > >
> > > > > > > On Fri, Dec 6, 2013 at 1:56 AM, Kyle Lin <
> kylelin2000@gmail.com>
> > > > > wrote:
> > > > > > >
> > > > > > > > Hey there
> > > > > > > >
> > > > > > > >     First, my Environment: Hortonworks HDP2(HBase
> > > > 0.95.2.2.0.5.0-64,
> > > > > > Pig
> > > > > > > > 0.11.1).
> > > > > > > >
> > > > > > > >     I use pig to load data from hbase, then got Exception
> > Message
> > > > of
> > > > > > > > java.lang.ClassNotFoundException:
> > > > > > > > org.apache.hadoop.hbase.filter.WritableByteArrayComparable.
> > > > > > > >
> > > > > > > >     My script is like below:
> > > > > > > > samples = LOAD 'hbase://test' using
> > > > > > > >   org.apache.pig.backend.hadoop.hbase.HBaseStorage('cf:name
> > > > cf:phone
> > > > > > > > cf:city cf:address')
> > > > > > > >   as (name, phone, city, address);
> > > > > > > > dump samples;
> > > > > > > >
> > > > > > > >     After googling, people said you need to set PIG_CLASSPATH
> > > > first.
> > > > > > So I
> > > > > > > > try to add the target jar in PIG_CLASSPATH, but cannot
> > > > > > > > find
> org.apache.hadoop.hbase.filter.WritableByteArrayComparable
> > > in
> > > > > any
> > > > > > > > hbase jars.
> > > > > > > >
> > > > > > > >
> > > > > > > > Kyle
> > > > > > > >
> > > > > > >
> > > > > >
> > > > >
> > > >
> > >
> >
>

Re: Pig cannot load data using hbasestorage

Posted by Kyle Lin <ky...@gmail.com>.
Hey

    Thanks for your help. But the story does not end there...

    Finally, I used the latest Hortonworks Sandbox 2 as my testing
environment (Pig 0.12.0, HBase 0.96.0). But I got another problem.

    My pig script is below (I type "pig -f xx.pig" to run it)

REGISTER /usr/lib/hbase/lib/zookeeper.jar;
REGISTER /usr/lib/hbase/lib/hbase-*.jar;
REGISTER /usr/lib/hadoop/hadoop*.jar
samples = LOAD 'hbase://test' using
  org.apache.pig.backend.hadoop.hbase.HBaseStorage('cf:name cf:phone
cf:city cf:address')
  as (name, phone, city, address);
dump samples;

    The stack trace

ERROR 1066: Unable to open iterator for alias samples. Backend error :
java.io.IOException: Cannot create a record reader because of a previous
error. Please look at the previous logs lines from the task's full log for
more details.

org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1066: Unable to
open iterator for alias samples. Backend error : java.io.IOException:
Cannot create a record reader because of a previous error. Please look at
the previous logs lines from the task's full log for more details.
        at org.apache.pig.PigServer.openIterator(PigServer.java:870)
        at
org.apache.pig.tools.grunt.GruntParser.processDump(GruntParser.java:774)
        at
org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:372)
        at
org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:198)
        at
org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:173)
        at org.apache.pig.tools.grunt.Grunt.exec(Grunt.java:84)
        at org.apache.pig.Main.run(Main.java:478)
        at org.apache.pig.Main.main(Main.java:156)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: java.lang.RuntimeException: java.io.IOException: Cannot create a
record reader because of a previous error. Please look at the previous logs
lines from the task's full log for more details.
        at
org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigRecordReader.initNextRecordReader(PigRecordReader.java:266)
        at
org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigRecordReader.<init>(PigRecordReader.java:123)
        at
org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigInputFormat.createRecordReader(PigInputFormat.java:123)
        at
org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.<init>(MapTask.java:491)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:734)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:339)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)
Caused by: java.io.IOException: Cannot create a record reader because of a
previous error. Please look at the previous logs lines from the task's full
log for more details.
        at
org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.createRecordReader(TableInputFormatBase.java:119)
        at
org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigRecordReader.initNextRecordReader(PigRecordReader.java:256)


Kyle
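
One earlier dead end in this thread was searching the hbase jars by hand for
org.apache.hadoop.hbase.filter.WritableByteArrayComparable. That kind of check
can be scripted instead of done by eye; the sketch below is illustrative (the
jar name and directory are made-up examples, not real HDP paths) and uses only
the Python standard library:

```python
import os
import tempfile
import zipfile

def find_class_in_jars(jar_paths, class_name):
    """Return the jars whose entries include the given fully-qualified class.

    class_name uses dotted form, e.g.
    'org.apache.hadoop.hbase.filter.WritableByteArrayComparable'.
    """
    entry = class_name.replace(".", "/") + ".class"
    hits = []
    for jar in jar_paths:
        with zipfile.ZipFile(jar) as zf:
            if entry in zf.namelist():
                hits.append(jar)
    return hits

# Demo with a synthetic jar so the snippet is self-contained; in practice
# you would pass something like glob.glob('/usr/lib/hbase/lib/*.jar').
tmp = tempfile.mkdtemp()
jar = os.path.join(tmp, "fake-hbase.jar")
with zipfile.ZipFile(jar, "w") as zf:
    zf.writestr(
        "org/apache/hadoop/hbase/filter/WritableByteArrayComparable.class", b"")

print(find_class_in_jars(
    [jar], "org.apache.hadoop.hbase.filter.WritableByteArrayComparable"))
```

Running it over the real HBase 0.96 lib directory would return no hits for
WritableByteArrayComparable, which is consistent with the class having been
removed in that release.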



2013/12/9 Ted Yu <yu...@gmail.com>

> Please use this link to install:
> http://hortonworks.com/products/hdp-2/#install
>
> Cheers
>
>
> On Mon, Dec 9, 2013 at 10:06 AM, Kyle Lin <ky...@gmail.com> wrote:
>
> > Hello Ted
> >
> >     Actually I used Hortonworks sandbox 2.0 Beta. Should I get rid off
> this
> > problem by using ambari to install HDP2?
> >
> > Kyle
> >
> >
> > 2013/12/9 Ted Yu <yu...@gmail.com>
> >
> > > Kyle:
> > > According to http://hortonworks.com/products/hdp-2/ , PIG 0.12 should
> be
> > > used where there is no such problem.
> > >
> > > Did you happen to use HDP 2 Beta ?
> > >
> > > Cheers
> > >
> > >
> > > On Mon, Dec 9, 2013 at 9:49 AM, Kyle Lin <ky...@gmail.com>
> wrote:
> > >
> > > > Hello Ted
> > > >
> > > >     Below is the stack trace. May I say that because HBase having
> > > > removed WritableByteArrayComparable, but pig is not upgrade for this
> > > > change?
> > > >
> > > >
> > > > Pig Stack Trace
> > > > ---------------
> > > > ERROR 2998: Unhandled internal error.
> > > > org/apache/hadoop/hbase/filter/WritableByteArrayComparable
> > > >
> > > > java.lang.NoClassDefFoundError:
> > > > org/apache/hadoop/hbase/filter/WritableByteArrayComparable
> > > >         at java.lang.Class.forName0(Native Method)
> > > >         at java.lang.Class.forName(Class.java:247)
> > > >         at
> > > > org.apache.pig.impl.PigContext.resolveClassName(PigContext.java:510)
> > > >         at
> > > >
> > > >
> > >
> >
> org.apache.pig.parser.LogicalPlanBuilder.validateFuncSpec(LogicalPlanBuilder.java:1220)
> > > >         at
> > > >
> > > >
> > >
> >
> org.apache.pig.parser.LogicalPlanBuilder.buildFuncSpec(LogicalPlanBuilder.java:1208)
> > > >         at
> > > >
> > > >
> > >
> >
> org.apache.pig.parser.LogicalPlanGenerator.func_clause(LogicalPlanGenerator.java:4849)
> > > >         at
> > > >
> > > >
> > >
> >
> org.apache.pig.parser.LogicalPlanGenerator.load_clause(LogicalPlanGenerator.java:3206)
> > > >         at
> > > >
> > > >
> > >
> >
> org.apache.pig.parser.LogicalPlanGenerator.op_clause(LogicalPlanGenerator.java:1338)
> > > >         at
> > > >
> > > >
> > >
> >
> org.apache.pig.parser.LogicalPlanGenerator.general_statement(LogicalPlanGenerator.java:822)
> > > >         at
> > > >
> > > >
> > >
> >
> org.apache.pig.parser.LogicalPlanGenerator.statement(LogicalPlanGenerator.java:540)
> > > >         at
> > > >
> > > >
> > >
> >
> org.apache.pig.parser.LogicalPlanGenerator.query(LogicalPlanGenerator.java:415)
> > > >         at
> > > >
> > org.apache.pig.parser.QueryParserDriver.parse(QueryParserDriver.java:181)
> > > >         at
> > > > org.apache.pig.PigServer$Graph.validateQuery(PigServer.java:1633)
> > > >         at
> > > > org.apache.pig.PigServer$Graph.registerQuery(PigServer.java:1606)
> > > >         at org.apache.pig.PigServer.registerQuery(PigServer.java:565)
> > > >         at
> > > >
> > org.apache.pig.tools.grunt.GruntParser.processPig(GruntParser.java:1032)
> > > >         at
> > > >
> > > >
> > >
> >
> org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:499)
> > > >         at
> > > >
> > > >
> > >
> >
> org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:194)
> > > >         at
> > > >
> > > >
> > >
> >
> org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:170)
> > > >         at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:69)
> > > >         at org.apache.pig.Main.run(Main.java:543)
> > > >         at org.apache.pig.Main.main(Main.java:158)
> > > >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native
> Method)
> > > >         at
> > > >
> > > >
> > >
> >
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> > > >         at
> > > >
> > > >
> > >
> >
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> > > >         at java.lang.reflect.Method.invoke(Method.java:597)
> > > >         at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
> > > > Caused by: java.lang.ClassNotFoundException:
> > > > org.apache.hadoop.hbase.filter.WritableByteArrayComparable
> > > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
> > > >         at java.security.AccessController.doPrivileged(Native Method)
> > > >         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> > > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> > > >         at
> > sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
> > > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> > > >         ... 27 more
> > > >
> > > >
> > > >
> > > > Kyle
> > > >
> > > >
> > > > 2013/12/7 Ted Yu <yu...@gmail.com>
> > > >
> > > > > bq. HDP2(HBase 0.95.2.2.0.5.0-64
> > > > >
> > > > > HDP2 goes with 0.96.0
> > > > >
> > > > > bq. java.lang.ClassNotFoundException:
> org.apache.hadoop.hbase.filter.
> > > > > WritableByteArrayComparable.
> > > > >
> > > > > Can you show us the stack trace ?
> > > > > WritableByteArrayComparable doesn't exist in 0.96 and later
> branches.
> > > > >
> > > > > Cheers
> > > > >
> > > > >
> > > > > On Sat, Dec 7, 2013 at 4:22 AM, Rohini Palaniswamy
> > > > > <ro...@gmail.com>wrote:
> > > > >
> > > > > > Do a register of your hbase and zookeeper jars in the pig script.
> > > > > >
> > > > > > -Rohini
> > > > > >
> > > > > >
> > > > > > On Fri, Dec 6, 2013 at 1:56 AM, Kyle Lin <ky...@gmail.com>
> > > > wrote:
> > > > > >
> > > > > > > Hey there
> > > > > > >
> > > > > > >     First, my Environment: Hortonworks HDP2(HBase
> > > 0.95.2.2.0.5.0-64,
> > > > > Pig
> > > > > > > 0.11.1).
> > > > > > >
> > > > > > >     I use pig to load data from hbase, then got Exception
> Message
> > > of
> > > > > > > java.lang.ClassNotFoundException:
> > > > > > > org.apache.hadoop.hbase.filter.WritableByteArrayComparable.
> > > > > > >
> > > > > > >     My script is like below:
> > > > > > > samples = LOAD 'hbase://test' using
> > > > > > >   org.apache.pig.backend.hadoop.hbase.HBaseStorage('cf:name
> > > cf:phone
> > > > > > > cf:city cf:address')
> > > > > > >   as (name, phone, city, address);
> > > > > > > dump samples;
> > > > > > >
> > > > > > >     After googling, people said you need to set PIG_CLASSPATH
> > > first.
> > > > > So I
> > > > > > > try to add the target jar in PIG_CLASSPATH, but cannot
> > > > > > > find org.apache.hadoop.hbase.filter.WritableByteArrayComparable
> > in
> > > > any
> > > > > > > hbase jars.
> > > > > > >
> > > > > > >
> > > > > > > Kyle
> > > > > > >
> > > > > >
> > > > >
> > > >
> > >
> >
>

Re: Pig cannot load data using hbasestorage

Posted by Ted Yu <yu...@gmail.com>.
Please use this link to install:
http://hortonworks.com/products/hdp-2/#install

Cheers


On Mon, Dec 9, 2013 at 10:06 AM, Kyle Lin <ky...@gmail.com> wrote:

> Hello Ted
>
>     Actually I used Hortonworks sandbox 2.0 Beta. Should I get rid off this
> problem by using ambari to install HDP2?
>
> Kyle
>
>
> 2013/12/9 Ted Yu <yu...@gmail.com>
>
> > Kyle:
> > According to http://hortonworks.com/products/hdp-2/ , PIG 0.12 should be
> > used where there is no such problem.
> >
> > Did you happen to use HDP 2 Beta ?
> >
> > Cheers
> >
> >
> > On Mon, Dec 9, 2013 at 9:49 AM, Kyle Lin <ky...@gmail.com> wrote:
> >
> > > Hello Ted
> > >
> > >     Below is the stack trace. May I say that because HBase having
> > > removed WritableByteArrayComparable, but pig is not upgrade for this
> > > change?
> > >
> > >
> > > Pig Stack Trace
> > > ---------------
> > > ERROR 2998: Unhandled internal error.
> > > org/apache/hadoop/hbase/filter/WritableByteArrayComparable
> > >
> > > java.lang.NoClassDefFoundError:
> > > org/apache/hadoop/hbase/filter/WritableByteArrayComparable
> > >         at java.lang.Class.forName0(Native Method)
> > >         at java.lang.Class.forName(Class.java:247)
> > >         at
> > > org.apache.pig.impl.PigContext.resolveClassName(PigContext.java:510)
> > >         at
> > >
> > >
> >
> org.apache.pig.parser.LogicalPlanBuilder.validateFuncSpec(LogicalPlanBuilder.java:1220)
> > >         at
> > >
> > >
> >
> org.apache.pig.parser.LogicalPlanBuilder.buildFuncSpec(LogicalPlanBuilder.java:1208)
> > >         at
> > >
> > >
> >
> org.apache.pig.parser.LogicalPlanGenerator.func_clause(LogicalPlanGenerator.java:4849)
> > >         at
> > >
> > >
> >
> org.apache.pig.parser.LogicalPlanGenerator.load_clause(LogicalPlanGenerator.java:3206)
> > >         at
> > >
> > >
> >
> org.apache.pig.parser.LogicalPlanGenerator.op_clause(LogicalPlanGenerator.java:1338)
> > >         at
> > >
> > >
> >
> org.apache.pig.parser.LogicalPlanGenerator.general_statement(LogicalPlanGenerator.java:822)
> > >         at
> > >
> > >
> >
> org.apache.pig.parser.LogicalPlanGenerator.statement(LogicalPlanGenerator.java:540)
> > >         at
> > >
> > >
> >
> org.apache.pig.parser.LogicalPlanGenerator.query(LogicalPlanGenerator.java:415)
> > >         at
> > >
> org.apache.pig.parser.QueryParserDriver.parse(QueryParserDriver.java:181)
> > >         at
> > > org.apache.pig.PigServer$Graph.validateQuery(PigServer.java:1633)
> > >         at
> > > org.apache.pig.PigServer$Graph.registerQuery(PigServer.java:1606)
> > >         at org.apache.pig.PigServer.registerQuery(PigServer.java:565)
> > >         at
> > >
> org.apache.pig.tools.grunt.GruntParser.processPig(GruntParser.java:1032)
> > >         at
> > >
> > >
> >
> org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:499)
> > >         at
> > >
> > >
> >
> org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:194)
> > >         at
> > >
> > >
> >
> org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:170)
> > >         at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:69)
> > >         at org.apache.pig.Main.run(Main.java:543)
> > >         at org.apache.pig.Main.main(Main.java:158)
> > >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > >         at
> > >
> > >
> >
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> > >         at
> > >
> > >
> >
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> > >         at java.lang.reflect.Method.invoke(Method.java:597)
> > >         at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
> > > Caused by: java.lang.ClassNotFoundException:
> > > org.apache.hadoop.hbase.filter.WritableByteArrayComparable
> > >         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
> > >         at java.security.AccessController.doPrivileged(Native Method)
> > >         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> > >         at
> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
> > >         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> > >         ... 27 more
> > >
> > >
> > >
> > > Kyle
> > >
> > >
> > > 2013/12/7 Ted Yu <yu...@gmail.com>
> > >
> > > > bq. HDP2(HBase 0.95.2.2.0.5.0-64
> > > >
> > > > HDP2 goes with 0.96.0
> > > >
> > > > bq. java.lang.ClassNotFoundException: org.apache.hadoop.hbase.filter.
> > > > WritableByteArrayComparable.
> > > >
> > > > Can you show us the stack trace ?
> > > > WritableByteArrayComparable doesn't exist in 0.96 and later branches.
> > > >
> > > > Cheers
> > > >
> > > >
> > > > On Sat, Dec 7, 2013 at 4:22 AM, Rohini Palaniswamy
> > > > <ro...@gmail.com>wrote:
> > > >
> > > > > Do a register of your hbase and zookeeper jars in the pig script.
> > > > >
> > > > > -Rohini
> > > > >
> > > > >
> > > > > On Fri, Dec 6, 2013 at 1:56 AM, Kyle Lin <ky...@gmail.com>
> > > wrote:
> > > > >
> > > > > > Hey there
> > > > > >
> > > > > >     First, my Environment: Hortonworks HDP2(HBase
> > 0.95.2.2.0.5.0-64,
> > > > Pig
> > > > > > 0.11.1).
> > > > > >
> > > > > >     I use pig to load data from hbase, then got Exception Message
> > of
> > > > > > java.lang.ClassNotFoundException:
> > > > > > org.apache.hadoop.hbase.filter.WritableByteArrayComparable.
> > > > > >
> > > > > >     My script is like below:
> > > > > > samples = LOAD 'hbase://test' using
> > > > > >   org.apache.pig.backend.hadoop.hbase.HBaseStorage('cf:name
> > cf:phone
> > > > > > cf:city cf:address')
> > > > > >   as (name, phone, city, address);
> > > > > > dump samples;
> > > > > >
> > > > > >     After googling, people said you need to set PIG_CLASSPATH
> > first.
> > > > So I
> > > > > > try to add the target jar in PIG_CLASSPATH, but cannot
> > > > > > find org.apache.hadoop.hbase.filter.WritableByteArrayComparable
> in
> > > any
> > > > > > hbase jars.
> > > > > >
> > > > > >
> > > > > > Kyle
> > > > > >
> > > > >
> > > >
> > >
> >
>

Re: Pig cannot load data using hbasestorage

Posted by Kyle Lin <ky...@gmail.com>.
Hello Ted

    Actually I used the Hortonworks Sandbox 2.0 Beta. Can I get rid of this
problem by using Ambari to install HDP2?

Kyle


2013/12/9 Ted Yu <yu...@gmail.com>

> Kyle:
> According to http://hortonworks.com/products/hdp-2/ , PIG 0.12 should be
> used where there is no such problem.
>
> Did you happen to use HDP 2 Beta ?
>
> Cheers
>
>
> On Mon, Dec 9, 2013 at 9:49 AM, Kyle Lin <ky...@gmail.com> wrote:
>
> > Hello Ted
> >
> >     Below is the stack trace. May I say that because HBase having
> > removed WritableByteArrayComparable, but pig is not upgrade for this
> > change?
> >
> >
> > Pig Stack Trace
> > ---------------
> > ERROR 2998: Unhandled internal error.
> > org/apache/hadoop/hbase/filter/WritableByteArrayComparable
> >
> > java.lang.NoClassDefFoundError:
> > org/apache/hadoop/hbase/filter/WritableByteArrayComparable
> >         at java.lang.Class.forName0(Native Method)
> >         at java.lang.Class.forName(Class.java:247)
> >         at
> > org.apache.pig.impl.PigContext.resolveClassName(PigContext.java:510)
> >         at
> >
> >
> org.apache.pig.parser.LogicalPlanBuilder.validateFuncSpec(LogicalPlanBuilder.java:1220)
> >         at
> >
> >
> org.apache.pig.parser.LogicalPlanBuilder.buildFuncSpec(LogicalPlanBuilder.java:1208)
> >         at
> >
> >
> org.apache.pig.parser.LogicalPlanGenerator.func_clause(LogicalPlanGenerator.java:4849)
> >         at
> >
> >
> org.apache.pig.parser.LogicalPlanGenerator.load_clause(LogicalPlanGenerator.java:3206)
> >         at
> >
> >
> org.apache.pig.parser.LogicalPlanGenerator.op_clause(LogicalPlanGenerator.java:1338)
> >         at
> >
> >
> org.apache.pig.parser.LogicalPlanGenerator.general_statement(LogicalPlanGenerator.java:822)
> >         at
> >
> >
> org.apache.pig.parser.LogicalPlanGenerator.statement(LogicalPlanGenerator.java:540)
> >         at
> >
> >
> org.apache.pig.parser.LogicalPlanGenerator.query(LogicalPlanGenerator.java:415)
> >         at
> > org.apache.pig.parser.QueryParserDriver.parse(QueryParserDriver.java:181)
> >         at
> > org.apache.pig.PigServer$Graph.validateQuery(PigServer.java:1633)
> >         at
> > org.apache.pig.PigServer$Graph.registerQuery(PigServer.java:1606)
> >         at org.apache.pig.PigServer.registerQuery(PigServer.java:565)
> >         at
> > org.apache.pig.tools.grunt.GruntParser.processPig(GruntParser.java:1032)
> >         at
> >
> >
> org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:499)
> >         at
> >
> >
> org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:194)
> >         at
> >
> >
> org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:170)
> >         at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:69)
> >         at org.apache.pig.Main.run(Main.java:543)
> >         at org.apache.pig.Main.main(Main.java:158)
> >         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >         at
> >
> >
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >         at
> >
> >
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >         at java.lang.reflect.Method.invoke(Method.java:597)
> >         at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
> > Caused by: java.lang.ClassNotFoundException:
> > org.apache.hadoop.hbase.filter.WritableByteArrayComparable
> >         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
> >         at java.security.AccessController.doPrivileged(Native Method)
> >         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> >         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> >         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
> >         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> >         ... 27 more
> >
> >
> >
> > Kyle
> >
> >
> > 2013/12/7 Ted Yu <yu...@gmail.com>
> >
> > > bq. HDP2(HBase 0.95.2.2.0.5.0-64
> > >
> > > HDP2 goes with 0.96.0
> > >
> > > bq. java.lang.ClassNotFoundException: org.apache.hadoop.hbase.filter.
> > > WritableByteArrayComparable.
> > >
> > > Can you show us the stack trace ?
> > > WritableByteArrayComparable doesn't exist in 0.96 and later branches.
> > >
> > > Cheers
> > >
> > >
> > > On Sat, Dec 7, 2013 at 4:22 AM, Rohini Palaniswamy
> > > <ro...@gmail.com>wrote:
> > >
> > > > Do a register of your hbase and zookeeper jars in the pig script.
> > > >
> > > > -Rohini
> > > >
> > > >
> > > > On Fri, Dec 6, 2013 at 1:56 AM, Kyle Lin <ky...@gmail.com>
> > wrote:
> > > >
> > > > > Hey there
> > > > >
> > > > >     First, my Environment: Hortonworks HDP2(HBase
> 0.95.2.2.0.5.0-64,
> > > Pig
> > > > > 0.11.1).
> > > > >
> > > > >     I use pig to load data from hbase, then got Exception Message
> of
> > > > > java.lang.ClassNotFoundException:
> > > > > org.apache.hadoop.hbase.filter.WritableByteArrayComparable.
> > > > >
> > > > >     My script is like below:
> > > > > samples = LOAD 'hbase://test' using
> > > > >   org.apache.pig.backend.hadoop.hbase.HBaseStorage('cf:name
> cf:phone
> > > > > cf:city cf:address')
> > > > >   as (name, phone, city, address);
> > > > > dump samples;
> > > > >
> > > > >     After googling, people said you need to set PIG_CLASSPATH
> first.
> > > So I
> > > > > try to add the target jar in PIG_CLASSPATH, but cannot
> > > > > find org.apache.hadoop.hbase.filter.WritableByteArrayComparable in
> > any
> > > > > hbase jars.
> > > > >
> > > > >
> > > > > Kyle
> > > > >
> > > >
> > >
> >
>

Re: Pig cannot load data using hbasestorage

Posted by Ted Yu <yu...@gmail.com>.
Kyle:
According to http://hortonworks.com/products/hdp-2/ , HDP 2 ships Pig 0.12,
which does not have this problem.

Did you happen to use HDP 2 Beta ?

Cheers


On Mon, Dec 9, 2013 at 9:49 AM, Kyle Lin <ky...@gmail.com> wrote:

> Hello Ted
>
>     Below is the stack trace. May I say that because HBase having
> removed WritableByteArrayComparable, but pig is not upgrade for this
> change?
>
>
> Pig Stack Trace
> ---------------
> ERROR 2998: Unhandled internal error.
> org/apache/hadoop/hbase/filter/WritableByteArrayComparable
>
> java.lang.NoClassDefFoundError:
> org/apache/hadoop/hbase/filter/WritableByteArrayComparable
>         at java.lang.Class.forName0(Native Method)
>         at java.lang.Class.forName(Class.java:247)
>         at
> org.apache.pig.impl.PigContext.resolveClassName(PigContext.java:510)
>         at
>
> org.apache.pig.parser.LogicalPlanBuilder.validateFuncSpec(LogicalPlanBuilder.java:1220)
>         at
>
> org.apache.pig.parser.LogicalPlanBuilder.buildFuncSpec(LogicalPlanBuilder.java:1208)
>         at
>
> org.apache.pig.parser.LogicalPlanGenerator.func_clause(LogicalPlanGenerator.java:4849)
>         at
>
> org.apache.pig.parser.LogicalPlanGenerator.load_clause(LogicalPlanGenerator.java:3206)
>         at
>
> org.apache.pig.parser.LogicalPlanGenerator.op_clause(LogicalPlanGenerator.java:1338)
>         at
>
> org.apache.pig.parser.LogicalPlanGenerator.general_statement(LogicalPlanGenerator.java:822)
>         at
>
> org.apache.pig.parser.LogicalPlanGenerator.statement(LogicalPlanGenerator.java:540)
>         at
>
> org.apache.pig.parser.LogicalPlanGenerator.query(LogicalPlanGenerator.java:415)
>         at
> org.apache.pig.parser.QueryParserDriver.parse(QueryParserDriver.java:181)
>         at
> org.apache.pig.PigServer$Graph.validateQuery(PigServer.java:1633)
>         at
> org.apache.pig.PigServer$Graph.registerQuery(PigServer.java:1606)
>         at org.apache.pig.PigServer.registerQuery(PigServer.java:565)
>         at
> org.apache.pig.tools.grunt.GruntParser.processPig(GruntParser.java:1032)
>         at
>
> org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:499)
>         at
>
> org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:194)
>         at
>
> org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:170)
>         at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:69)
>         at org.apache.pig.Main.run(Main.java:543)
>         at org.apache.pig.Main.main(Main.java:158)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at
>
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at
>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
> Caused by: java.lang.ClassNotFoundException:
> org.apache.hadoop.hbase.filter.WritableByteArrayComparable
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>         ... 27 more
>
>
>
> Kyle
>
>
> 2013/12/7 Ted Yu <yu...@gmail.com>
>
> > bq. HDP2(HBase 0.95.2.2.0.5.0-64
> >
> > HDP2 goes with 0.96.0
> >
> > bq. java.lang.ClassNotFoundException: org.apache.hadoop.hbase.filter.
> > WritableByteArrayComparable.
> >
> > Can you show us the stack trace ?
> > WritableByteArrayComparable doesn't exist in 0.96 and later branches.
> >
> > Cheers
> >
> >
> > On Sat, Dec 7, 2013 at 4:22 AM, Rohini Palaniswamy
> > <ro...@gmail.com>wrote:
> >
> > > Do a register of your hbase and zookeeper jars in the pig script.
> > >
> > > -Rohini
> > >
> > >
> > > On Fri, Dec 6, 2013 at 1:56 AM, Kyle Lin <ky...@gmail.com>
> wrote:
> > >
> > > > Hey there
> > > >
> > > >     First, my Environment: Hortonworks HDP2(HBase 0.95.2.2.0.5.0-64,
> > Pig
> > > > 0.11.1).
> > > >
> > > >     I use pig to load data from hbase, then got Exception Message of
> > > > java.lang.ClassNotFoundException:
> > > > org.apache.hadoop.hbase.filter.WritableByteArrayComparable.
> > > >
> > > >     My script is like below:
> > > > samples = LOAD 'hbase://test' using
> > > >   org.apache.pig.backend.hadoop.hbase.HBaseStorage('cf:name cf:phone
> > > > cf:city cf:address')
> > > >   as (name, phone, city, address);
> > > > dump samples;
> > > >
> > > >     After googling, people said you need to set PIG_CLASSPATH first.
> > So I
> > > > try to add the target jar in PIG_CLASSPATH, but cannot
> > > > find org.apache.hadoop.hbase.filter.WritableByteArrayComparable in
> any
> > > > hbase jars.
> > > >
> > > >
> > > > Kyle
> > > >
> > >
> >
>

Re: Pig cannot load data using hbasestorage

Posted by Kyle Lin <ky...@gmail.com>.
Hello Ted

    Below is the stack trace. Could it be that HBase has removed
WritableByteArrayComparable, but Pig has not been upgraded for this change?


Pig Stack Trace
---------------
ERROR 2998: Unhandled internal error.
org/apache/hadoop/hbase/filter/WritableByteArrayComparable

java.lang.NoClassDefFoundError:
org/apache/hadoop/hbase/filter/WritableByteArrayComparable
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:247)
        at
org.apache.pig.impl.PigContext.resolveClassName(PigContext.java:510)
        at
org.apache.pig.parser.LogicalPlanBuilder.validateFuncSpec(LogicalPlanBuilder.java:1220)
        at
org.apache.pig.parser.LogicalPlanBuilder.buildFuncSpec(LogicalPlanBuilder.java:1208)
        at
org.apache.pig.parser.LogicalPlanGenerator.func_clause(LogicalPlanGenerator.java:4849)
        at
org.apache.pig.parser.LogicalPlanGenerator.load_clause(LogicalPlanGenerator.java:3206)
        at
org.apache.pig.parser.LogicalPlanGenerator.op_clause(LogicalPlanGenerator.java:1338)
        at
org.apache.pig.parser.LogicalPlanGenerator.general_statement(LogicalPlanGenerator.java:822)
        at
org.apache.pig.parser.LogicalPlanGenerator.statement(LogicalPlanGenerator.java:540)
        at
org.apache.pig.parser.LogicalPlanGenerator.query(LogicalPlanGenerator.java:415)
        at
org.apache.pig.parser.QueryParserDriver.parse(QueryParserDriver.java:181)
        at org.apache.pig.PigServer$Graph.validateQuery(PigServer.java:1633)
        at org.apache.pig.PigServer$Graph.registerQuery(PigServer.java:1606)
        at org.apache.pig.PigServer.registerQuery(PigServer.java:565)
        at
org.apache.pig.tools.grunt.GruntParser.processPig(GruntParser.java:1032)
        at
org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:499)
        at
org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:194)
        at
org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:170)
        at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:69)
        at org.apache.pig.Main.run(Main.java:543)
        at org.apache.pig.Main.main(Main.java:158)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: java.lang.ClassNotFoundException:
org.apache.hadoop.hbase.filter.WritableByteArrayComparable
        at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
        ... 27 more



Kyle


2013/12/7 Ted Yu <yu...@gmail.com>

> bq. HDP2(HBase 0.95.2.2.0.5.0-64
>
> HDP2 goes with 0.96.0
>
> bq. java.lang.ClassNotFoundException: org.apache.hadoop.hbase.filter.
> WritableByteArrayComparable.
>
> Can you show us the stack trace ?
> WritableByteArrayComparable doesn't exist in 0.96 and later branches.
>
> Cheers
>
>
> On Sat, Dec 7, 2013 at 4:22 AM, Rohini Palaniswamy
> <ro...@gmail.com>wrote:
>
> > Do a register of your hbase and zookeeper jars in the pig script.
> >
> > -Rohini
> >
> >
> > On Fri, Dec 6, 2013 at 1:56 AM, Kyle Lin <ky...@gmail.com> wrote:
> >
> > > Hey there
> > >
> > >     First, my Environment: Hortonworks HDP2(HBase 0.95.2.2.0.5.0-64,
> Pig
> > > 0.11.1).
> > >
> > >     I use pig to load data from hbase, then got Exception Message of
> > > java.lang.ClassNotFoundException:
> > > org.apache.hadoop.hbase.filter.WritableByteArrayComparable.
> > >
> > >     My script is like below:
> > > samples = LOAD 'hbase://test' using
> > >   org.apache.pig.backend.hadoop.hbase.HBaseStorage('cf:name cf:phone
> > > cf:city cf:address')
> > >   as (name, phone, city, address);
> > > dump samples;
> > >
> > >     After googling, people said you need to set PIG_CLASSPATH first.
> So I
> > > try to add the target jar in PIG_CLASSPATH, but cannot
> > > find org.apache.hadoop.hbase.filter.WritableByteArrayComparable in any
> > > hbase jars.
> > >
> > >
> > > Kyle
> > >
> >
>

Re: Pig cannot load data using hbasestorage

Posted by Ted Yu <yu...@gmail.com>.
bq. HDP2(HBase 0.95.2.2.0.5.0-64

HDP2 goes with 0.96.0

bq. java.lang.ClassNotFoundException: org.apache.hadoop.hbase.filter.
WritableByteArrayComparable.

Can you show us the stack trace?
WritableByteArrayComparable doesn't exist in 0.96 and later branches.

Cheers


On Sat, Dec 7, 2013 at 4:22 AM, Rohini Palaniswamy
<ro...@gmail.com>wrote:

> Do a register of your hbase and zookeeper jars in the pig script.
>
> -Rohini
>
>
> On Fri, Dec 6, 2013 at 1:56 AM, Kyle Lin <ky...@gmail.com> wrote:
>
> > Hey there
> >
> >     First, my Environment: Hortonworks HDP2(HBase 0.95.2.2.0.5.0-64, Pig
> > 0.11.1).
> >
> >     I use pig to load data from hbase, then got Exception Message of
> > java.lang.ClassNotFoundException:
> > org.apache.hadoop.hbase.filter.WritableByteArrayComparable.
> >
> >     My script is like below:
> > samples = LOAD 'hbase://test' using
> >   org.apache.pig.backend.hadoop.hbase.HBaseStorage('cf:name cf:phone
> > cf:city cf:address')
> >   as (name, phone, city, address);
> > dump samples;
> >
> >     After googling, people said you need to set PIG_CLASSPATH first. So I
> > try to add the target jar in PIG_CLASSPATH, but cannot
> > find org.apache.hadoop.hbase.filter.WritableByteArrayComparable in any
> > hbase jars.
> >
> >
> > Kyle
> >
>

Re: Pig cannot load data using hbasestorage

Posted by Rohini Palaniswamy <ro...@gmail.com>.
Register your HBase and ZooKeeper jars in the Pig script.

-Rohini
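
For example, a minimal sketch (the jar paths below assume an HDP-style
layout under /usr/lib and will differ per install; the hbase jar name is
illustrative):

```pig
-- Make the HBase and ZooKeeper classes visible to Pig and its MR jobs.
-- Point these REGISTER paths at the jars shipped with your distribution.
REGISTER /usr/lib/hbase/lib/zookeeper.jar;
REGISTER /usr/lib/hbase/lib/hbase-client.jar;

samples = LOAD 'hbase://test'
  USING org.apache.pig.backend.hadoop.hbase.HBaseStorage('cf:name cf:phone cf:city cf:address')
  AS (name:chararray, phone:chararray, city:chararray, address:chararray);
DUMP samples;
```

A glob such as REGISTER /usr/lib/hbase/lib/*.jar; also works if you would
rather pull in every dependency at once.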


On Fri, Dec 6, 2013 at 1:56 AM, Kyle Lin <ky...@gmail.com> wrote:

> Hey there
>
>     First, my Environment: Hortonworks HDP2(HBase 0.95.2.2.0.5.0-64, Pig
> 0.11.1).
>
>     I use pig to load data from hbase, then got Exception Message of
> java.lang.ClassNotFoundException:
> org.apache.hadoop.hbase.filter.WritableByteArrayComparable.
>
>     My script is like below:
> samples = LOAD 'hbase://test' using
>   org.apache.pig.backend.hadoop.hbase.HBaseStorage('cf:name cf:phone
> cf:city cf:address')
>   as (name, phone, city, address);
> dump samples;
>
>     After googling, people said you need to set PIG_CLASSPATH first. So I
> try to add the target jar in PIG_CLASSPATH, but cannot
> find org.apache.hadoop.hbase.filter.WritableByteArrayComparable in any
> hbase jars.
>
>
> Kyle
>