Posted to yarn-dev@hadoop.apache.org by Vikas Parashar <vi...@fosteringlinux.com> on 2013/12/26 13:05:40 UTC

hbase hive integration

Hi,

I am integrating Hive (0.12) with HBase (0.96). Everything is working fine
except that I am stuck on two queries.

When I create a table or run "select * from table", everything works,
but "select count(*) from table" gives me the error below.


2013-12-26 13:25:01,864 ERROR ql.Driver (SessionState.java:printError(419))
- FAILED: Execution Error, return code 2 from
org.apache.hadoop.hive.ql.exec.mr.MapRedTask
2013-12-26 13:25:01,869 WARN  mapreduce.Counters
(AbstractCounters.java:getGroup(234)) - Group FileSystemCounters is
deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
2013-12-26 14:25:44,119 WARN  mapreduce.JobSubmitter
(JobSubmitter.java:copyAndConfigureFiles(149)) - Hadoop command-line option
parsing not performed. Implement the Tool interface and execute your
application with ToolRunner to remedy this.
2013-12-26 14:26:14,677 WARN  mapreduce.Counters
(AbstractCounters.java:getGroup(234)) - Group
org.apache.hadoop.mapred.Task$Counter is deprecated. Use
org.apache.hadoop.mapreduce.TaskCounter instead
2013-12-26 14:26:33,613 WARN  mapreduce.Counters
(AbstractCounters.java:getGroup(234)) - Group
org.apache.hadoop.mapred.Task$Counter is deprecated. Use
org.apache.hadoop.mapreduce.TaskCounter instead
2013-12-26 14:27:30,355 WARN  mapreduce.Counters
(AbstractCounters.java:getGroup(234)) - Group
org.apache.hadoop.mapred.Task$Counter is deprecated. Use
org.apache.hadoop.mapreduce.TaskCounter instead
2013-12-26 14:27:32,479 WARN  mapreduce.Counters
(AbstractCounters.java:getGroup(234)) - Group
org.apache.hadoop.mapred.Task$Counter is deprecated. Use
org.apache.hadoop.mapreduce.TaskCounter instead
2013-12-26 14:27:32,528 ERROR exec.Task (SessionState.java:printError(419))
- Ended Job = job_1388037394132_0013 with errors
2013-12-26 14:27:32,530 ERROR exec.Task (SessionState.java:printError(419))
- Error during job, obtaining debugging information...
2013-12-26 14:27:32,538 ERROR exec.Task (SessionState.java:printError(419))
- Examining task ID: task_1388037394132_0013_m_000000 (and more) from job
job_1388037394132_0013
2013-12-26 14:27:32,539 WARN  shims.HadoopShimsSecure
(Hadoop23Shims.java:getTaskAttemptLogUrl(72)) - Can't fetch tasklog:
TaskLogServlet is not supported in MR2 mode.
2013-12-26 14:27:32,593 WARN  shims.HadoopShimsSecure
(Hadoop23Shims.java:getTaskAttemptLogUrl(72)) - Can't fetch tasklog:
TaskLogServlet is not supported in MR2 mode.
2013-12-26 14:27:32,596 WARN  shims.HadoopShimsSecure
(Hadoop23Shims.java:getTaskAttemptLogUrl(72)) - Can't fetch tasklog:
TaskLogServlet is not supported in MR2 mode.
2013-12-26 14:27:32,599 WARN  shims.HadoopShimsSecure
(Hadoop23Shims.java:getTaskAttemptLogUrl(72)) - Can't fetch tasklog:
TaskLogServlet is not supported in MR2 mode.
2013-12-26 14:27:32,615 ERROR exec.Task (SessionState.java:printError(419))
-
Task with the most failures(4):
-----
Task ID:
  task_1388037394132_0013_m_000000

URL:

http://ambari1.hadoop.com:8088/taskdetails.jsp?jobid=job_1388037394132_0013&tipid=task_1388037394132_0013_m_000000
-----
Diagnostic Messages for this Task:
Error: java.io.IOException: java.io.IOException: java.lang.reflect.InvocationTargetException
  at org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderCreationException(HiveIOExceptionHandlerChain.java:97)
  at org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderCreationException(HiveIOExceptionHandlerUtil.java:57)
  at org.apache.hadoop.hive.ql.io.HiveInputFormat.getRecordReader(HiveInputFormat.java:244)
  at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getRecordReader(CombineHiveInputFormat.java:538)
  at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.<init>(MapTask.java:167)
  at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:408)
  at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
  at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)
  at java.security.AccessController.doPrivileged(Native Method)
  at javax.security.auth.Subject.doAs(Subject.java:396)
  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
  at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)
Caused by: java.io.IOException: java.lang.reflect.InvocationTargetException
  at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:383)
  at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:360)
  at org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:244)
  at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:187)
  at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:164)
  at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.getRecordReader(HiveHBaseTableInputFormat.java:91)
  at org.apache.hadoop.hive.ql.io.HiveInputFormat.getRecordReader(HiveInputFormat.java:241)
  ... 9 more
Caused by: java.lang.reflect.InvocationTargetException
  at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
  at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
  at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
  at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
  at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:381)
  ... 15 more
Caused by: java.lang.NoClassDefFoundError: org/cloudera/htrace/Trace
  at org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.exists(RecoverableZooKeeper.java:196)
  at org.apache.hadoop.hbase.zookeeper.ZKUtil.checkExists(ZKUtil.java:479)
  at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:65)
  at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:83)
  at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.retrieveClusterId(HConnectionManager.java:794)
  at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.<init>(HConnectionManager.java:627)
  ... 20 more
Caused by: java.lang.ClassNotFoundException: org.cloudera.htrace.Trace
  at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
  at java.security.AccessController.doPrivileged(Native Method)
  at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
  at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
  ... 26 more


2013-12-26 14:27:32,870 ERROR ql.Driver (SessionState.java:printError(419))
- FAILED: Execution Error, return code 2 from org.apache.

I think this error is related to the MapReduce job: I get the error
whenever my query launches a map-reduce stage.

Any ideas?
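
The root cause is the `java.lang.ClassNotFoundException: org.cloudera.htrace.Trace` at the bottom of the trace: the map task's JVM simply cannot find the htrace jar on its classpath. As a minimal, hypothetical illustration (the helper class is mine, only the class name comes from the trace), this is the same lookup the JVM performs:

```java
// Probe whether a class is visible on the current classpath -- the same
// lookup that fails inside the map task in the trace above. Any class
// missing from the classpath fails the same way.
public class ClasspathProbe {
    public static boolean isOnClasspath(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Missing from the task classpath in the failing job:
        System.out.println(isOnClasspath("org.cloudera.htrace.Trace"));
        // java.lang is always visible, so this prints true:
        System.out.println(isOnClasspath("java.lang.String"));
    }
}
```

Running this inside the task's environment (or a plain JVM without the HBase jars) shows whether the htrace classes are reachable before submitting the job.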

-- 
Thanks & Regards:-
Vikas Parashar
Sr. Linux administrator Cum Developer
Mobile: +91 958 208 8852
Email: vikas.parashar@fosteringlinux.com

Re: hbase hive integration

Posted by Vikas Parashar <vi...@fosteringlinux.com>.
thank you buddy :)


On Thu, Dec 26, 2013 at 9:36 PM, Ted Yu <yu...@gmail.com> wrote:

> HBase depends on several modules and projects, e.g. zookeeper, netty, and
> protobuf. HBase's own modules, such as hbase-common, are also needed for
> the mapreduce job to function.
>
> Your question would get more help on the hive user list.
>
> Cheers
>
>
> On Thu, Dec 26, 2013 at 7:57 AM, Vikas Parashar <
> vikas.parashar@fosteringlinux.com> wrote:
>
> > Thanks buddy,
> >
> > If I remove the htrace 1.5 jar, I get the same "htrace class not
> > found" error again.
> >
> > I am a newcomer to Java and do not understand it very well, so I did
> > not follow the statement that "addHBaseDependencyJars() adds HBase and
> > its dependencies (only) to the job configuration."
> >
> > Could you please elaborate on this, or suggest a way to work around it?
> >
> >
> > On Thu, Dec 26, 2013 at 9:20 PM, Ted Yu <yu...@gmail.com> wrote:
> >
> > > htrace 1.5 won't solve the problem.
> > > htrace-core-2.01.jar is included in the 0.96.1.1 tar ball, where
> > > TraceScope can be found.
> > >
> > > I searched under hbase-handler/src/java/org/apache/hadoop/hive/hbase
> > > but didn't see the following method being called:
> > >
> > > org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil.addHBaseDependencyJars
> > >
> > > addHBaseDependencyJars() adds HBase and its dependencies (only) to
> > > the job configuration. Maybe that is the reason for the errors you saw.
> > >
> > > Cheers
> > >
> > > On Thu, Dec 26, 2013 at 7:33 AM, Vikas Parashar <
> > > vikas.parashar@fosteringlinux.com> wrote:
> > >
> > > > Hi,
> > > >
> > > > If I copy htrace-1.50.jar into the map-reduce job, I get the error
> > > > below:
> > > >
> > > > java.io.IOException: java.lang.reflect.InvocationTargetException
> > > >   at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:383)
> > > >   at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:360)
> > > >   at org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:244)
> > > >   at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:187)
> > > >   at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:164)
> > > >   at org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.getSplits(HiveHBaseTableInputFormat.java:442)
> > > >   at org.apache.hadoop.hive.ql.io.HiveInputFormat.getSplits(HiveInputFormat.java:294)
> > > >   at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getSplits(CombineHiveInputFormat.java:303)
> > > >   at org.apache.hadoop.mapreduce.JobSubmitter.writeOldSplits(JobSubmitter.java:518)
> > > >   at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:510)
> > > >   at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:392)
> > > >   at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
> > > >   at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
> > > >   at java.security.AccessController.doPrivileged(Native Method)
> > > >   at javax.security.auth.Subject.doAs(Subject.java:396)
> > > >   at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
> > > >   at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
> > > >   at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:562)
> > > >   at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:557)
> > > >   at java.security.AccessController.doPrivileged(Native Method)
> > > >   at javax.security.auth.Subject.doAs(Subject.java:396)
> > > >   at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
> > > >   at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:557)
> > > >   at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:548)
> > > >   at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:425)
> > > >   at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:136)
> > > >   at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:151)
> > > >   at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:65)
> > > >   at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1437)
> > > >   at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1215)
> > > >   at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1043)
> > > >   at org.apache.hadoop.hive.ql.Driver.run(Driver.java:911)
> > > >   at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
> > > >   at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
> > > >   at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
> > > >   at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:781)
> > > >   at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
> > > >   at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
> > > >   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> > > >   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> > > >   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> > > >   at java.lang.reflect.Method.invoke(Method.java:597)
> > > >   at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
> > > > Caused by: java.lang.reflect.InvocationTargetException
> > > >   at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> > > >   at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
> > > >   at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
> > > >   at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
> > > >   at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:381)
> > > >   ... 42 more
> > > > Caused by: java.lang.NoSuchMethodError: org.cloudera.htrace.Trace.startSpan(Ljava/lang/String;)Lorg/cloudera/htrace/TraceScope;
> > > >   at org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.exists(RecoverableZooKeeper.java:196)
> > > >   at org.apache.hadoop.hbase.zookeeper.ZKUtil.checkExists(ZKUtil.java:479)
> > > >   at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:65)
> > > >   at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:83)
> > > >   at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.retrieveClusterId(HConnectionManager.java:794)
> > > >   at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.<init>(HConnectionManager.java:627)
> > > >   ... 47 more
> > > >
> > > >
> > > >
> > > > On Thu, Dec 26, 2013 at 6:26 PM, Ted Yu <yu...@gmail.com> wrote:
> > > >
> > > > > The error was due to the htrace jar missing from the class path
> > > > > of the map-reduce task.
> > > > >
> > > > > Cheers
> > > > >
> > > > > On Dec 26, 2013, at 4:05 AM, Vikas Parashar <
> > > > > vikas.parashar@fosteringlinux.com> wrote:
> > > > >
> > > > > > [quoted text of the original message snipped; it appears in
> > > > > > full at the top of this thread]
> > > > >
> > > >
> > > >
> > > >
> > > > --
> > > > Thanks & Regards:-
> > > > Vikas Parashar
> > > > Sr. Linux administrator Cum Developer
> > > > Mobile: +91 958 208 8852
> > > > Email: vikas.parashar@fosteringlinglinux.com
> > > >
> > >
> >
> >
> >
> > --
> > Thanks & Regards:-
> > Vikas Parashar
> > Sr. Linux administrator Cum Developer
> > Mobile: +91 958 208 8852
> > Email: vikas.parashar@fosteringlinglinux.com
> >
>
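
Ted's pointer to TableMapReduceUtil in the quoted thread above can be sketched as follows. This is a hypothetical driver, not code from Hive's HBase handler: it assumes the HBase 0.96 client jars are on the driver's classpath, and the only name taken from the thread itself is addHBaseDependencyJars().

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.mapreduce.Job;

// Sketch of a MapReduce driver that ships HBase and its transitive
// dependencies (zookeeper, netty, protobuf, htrace-core, ...) to the
// cluster with the job, so map tasks can resolve classes such as
// org.cloudera.htrace.Trace.
public class HBaseCountJobDriver {
    public static Job createJob() throws Exception {
        Configuration conf = HBaseConfiguration.create();
        Job job = Job.getInstance(conf, "count-over-hbase");
        // Adds HBase and its dependencies (only) to the job
        // configuration, as Ted describes above; the jars are then
        // distributed to each task's classpath.
        TableMapReduceUtil.addHBaseDependencyJars(job.getConfiguration());
        return job;
    }
}
```

A job submitted through such a driver would not hit the ClassNotFoundException, because the htrace jar travels with the job instead of being assumed on each node.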



-- 
Thanks & Regards:-
Vikas Parashar
Sr. Linux administrator Cum Developer
Mobile: +91 958 208 8852
Email: vikas.parashar@fosteringlinux.com
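
Since Hive submits the MapReduce job itself, a common workaround for this class of error is to hand the missing jars to Hive explicitly via its auxiliary jar path. A hedged sketch, as a config fragment: the paths and table name below are hypothetical and depend on the installation layout; only htrace-core-2.01.jar is named in the thread itself.

```shell
# Hypothetical paths -- adjust to where HBase 0.96 is installed.
# Make the HBase client jars (including htrace-core-2.01.jar, which
# provides org.cloudera.htrace.Trace) visible to Hive and to the
# map-reduce tasks it launches.
export HIVE_AUX_JARS_PATH=/usr/lib/hbase/lib/hbase-client-0.96.1.1-hadoop2.jar,\
/usr/lib/hbase/lib/hbase-common-0.96.1.1-hadoop2.jar,\
/usr/lib/hbase/lib/hbase-protocol-0.96.1.1-hadoop2.jar,\
/usr/lib/hbase/lib/htrace-core-2.01.jar

hive -e 'select count(*) from my_hbase_table;'
```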

> > > > > org.apache.hadoop.mapreduce.TaskCounter instead
> > > > > 2013-12-26 14:27:32,528 ERROR exec.Task
> > > > (SessionState.java:printError(419))
> > > > > - Ended Job = job_1388037394132_0013 with errors
> > > > > 2013-12-26 14:27:32,530 ERROR exec.Task
> > > > (SessionState.java:printError(419))
> > > > > - Error during job, obtaining debugging information...
> > > > > 2013-12-26 14:27:32,538 ERROR exec.Task
> > > > (SessionState.java:printError(419))
> > > > > - Examining task ID: task_1388037394132_0013_m_000000 (and more)
> from
> > > job
> > > > > job_1388037394132_0013
> > > > > 2013-12-26 14:27:32,539 WARN  shims.HadoopShimsSecure
> > > > > (Hadoop23Shims.java:getTaskAttemptLogUrl(72)) - Can't fetch
> tasklog:
> > > > > TaskLogServlet is not supported in MR2 mode.
> > > > > 2013-12-26 14:27:32,593 WARN  shims.HadoopShimsSecure
> > > > > (Hadoop23Shims.java:getTaskAttemptLogUrl(72)) - Can't fetch
> tasklog:
> > > > > TaskLogServlet is not supported in MR2 mode.
> > > > > 2013-12-26 14:27:32,596 WARN  shims.HadoopShimsSecure
> > > > > (Hadoop23Shims.java:getTaskAttemptLogUrl(72)) - Can't fetch
> tasklog:
> > > > > TaskLogServlet is not supported in MR2 mode.
> > > > > 2013-12-26 14:27:32,599 WARN  shims.HadoopShimsSecure
> > > > > (Hadoop23Shims.java:getTaskAttemptLogUrl(72)) - Can't fetch
> tasklog:
> > > > > TaskLogServlet is not supported in MR2 mode.
> > > > > 2013-12-26 14:27:32,615 ERROR exec.Task
> > > > (SessionState.java:printError(419))
> > > > > -
> > > > > Task with the most failures(4):
> > > > > -----
> > > > > Task ID:
> > > > >  task_1388037394132_0013_m_000000
> > > > >
> > > > > URL:
> > > > >
> > > > >
> > > >
> > >
> >
> http://ambari1.hadoop.com:8088/taskdetails.jsp?jobid=job_1388037394132_0013&tipid=task_1388037394132_0013_m_000000
> > > > > -----
> > > > > Diagnostic Messages for this Task:
> > > > > Error: java.io.IOException: java.io.IOException:
> > > > > java.lang.reflect.InvocationTargetException
> > > > > at
> > > > >
> > > >
> > >
> >
> org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderCreationException(HiveIOExceptionHandlerChain.java:97)
> > > > > at
> > > > >
> > > >
> > >
> >
> org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderCreationException(HiveIOExceptionHandlerUtil.java:57)
> > > > > at
> > > > >
> > > >
> > >
> >
> org.apache.hadoop.hive.ql.io.HiveInputFormat.getRecordReader(HiveInputFormat.java:244)
> > > > > at
> > > > >
> > > >
> > >
> >
> org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getRecordReader(CombineHiveInputFormat.java:538)
> > > > > at
> > > > >
> > > >
> > >
> >
> org.apache.hadoop.mapred.MapTask$TrackedRecordReader.<init>(MapTask.java:167)
> > > > > at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:408)
> > > > > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> > > > > at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)
> > > > > at java.security.AccessController.doPrivileged(Native Method)
> > > > > at javax.security.auth.Subject.doAs(Subject.java:396)
> > > > > at
> > > > >
> > > >
> > >
> >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
> > > > > at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)
> > > > > Caused by: java.io.IOException:
> > > > java.lang.reflect.InvocationTargetException
> > > > > at
> > > > >
> > > >
> > >
> >
> org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:383)
> > > > > at
> > > > >
> > > >
> > >
> >
> org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:360)
> > > > > at
> > > > >
> > > >
> > >
> >
> org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:244)
> > > > > at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:187)
> > > > > at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:164)
> > > > > at
> > > > >
> > > >
> > >
> >
> org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.getRecordReader(HiveHBaseTableInputFormat.java:91)
> > > > > at
> > > > >
> > > >
> > >
> >
> org.apache.hadoop.hive.ql.io.HiveInputFormat.getRecordReader(HiveInputFormat.java:241)
> > > > > ... 9 more
> > > > > Caused by: java.lang.reflect.InvocationTargetException
> > > > > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
> > > Method)
> > > > > at
> > > > >
> > > >
> > >
> >
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
> > > > > at
> > > > >
> > > >
> > >
> >
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
> > > > > at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
> > > > > at
> > > > >
> > > >
> > >
> >
> org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:381)
> > > > > ... 15 more
> > > > > Caused by: java.lang.NoClassDefFoundError:
> org/cloudera/htrace/Trace
> > > > > at
> > > > >
> > > >
> > >
> >
> org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.exists(RecoverableZooKeeper.java:196)
> > > > > at
> > > org.apache.hadoop.hbase.zookeeper.ZKUtil.checkExists(ZKUtil.java:479)
> > > > > at
> > > > >
> > > >
> > >
> >
> org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:65)
> > > > > at
> > > > >
> > > >
> > >
> >
> org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:83)
> > > > > at
> > > > >
> > > >
> > >
> >
> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.retrieveClusterId(HConnectionManager.java:794)
> > > > > at
> > > > >
> > > >
> > >
> >
> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.<init>(HConnectionManager.java:627)
> > > > > ... 20 more
> > > > > Caused by: java.lang.ClassNotFoundException:
> > org.cloudera.htrace.Trace
> > > > > at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
> > > > > at java.security.AccessController.doPrivileged(Native Method)
> > > > > at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> > > > > at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> > > > > at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
> > > > > at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> > > > > ... 26 more
> > > > >
> > > > >
> > > > > 2013-12-26 14:27:32,870 ERROR ql.Driver
> > > > (SessionState.java:printError(419))
> > > > > - FAILED: Execution Error, return code 2 from org.apache.
> > > > >
> > > > > I think this error is related with mapred job. Whenever my query
> use
> > > the
> > > > > map-R then i get error.
> > > > >
> > > > > Any idea!!
> > > > >
> > > > > --
> > > > > Thanks & Regards:-
> > > > > Vikas Parashar
> > > > > Sr. Linux administrator Cum Developer
> > > > > Mobile: +91 958 208 8852
> > > > > Email: vikas.parashar@fosteringlinglinux.com
> > > >
> > >
> > >
> > >
> > > --
> > > Thanks & Regards:-
> > > Vikas Parashar
> > > Sr. Linux administrator Cum Developer
> > > Mobile: +91 958 208 8852
> > > Email: vikas.parashar@fosteringlinglinux.com
> > >
> >
>
>
>
> --
> Thanks & Regards:-
> Vikas Parashar
> Sr. Linux administrator Cum Developer
> Mobile: +91 958 208 8852
> Email: vikas.parashar@fosteringlinglinux.com
>

Re: hbase hive integration

Posted by Vikas Parashar <vi...@fosteringlinux.com>.
Thanks buddy,

If I remove htrace 1.50, I get the same "htrace class not found" error
again.

Since I'm new to Java and don't understand it very well, I didn't follow
what "addHBaseDependencyJars() adds HBase and its dependencies (only) to
the job configuration" means.

Could you please elaborate on this, or suggest a way to work around it?


On Thu, Dec 26, 2013 at 9:20 PM, Ted Yu <yu...@gmail.com> wrote:

> htrace 1.5 won't solve the problem.
> htrace-core-2.01.jar is included in the 0.96.1.1 tar ball where TraceScope
> can be found.
>
> I searched under hbase-handler/src/java/org/apache/hadoop/hive/hbase but
> didn't see the following method being called:
> org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil.addHBaseDependencyJars
>
> addHBaseDependencyJars() adds HBase and its dependencies (only) to the job
> configuration.
>  Maybe this was the reason for these errors you saw.
>
> Cheers
>
> On Thu, Dec 26, 2013 at 7:33 AM, Vikas Parashar <
> vikas.parashar@fosteringlinux.com> wrote:
>
> > Hi,
> >
> > if i copied htrace-1.50.jar in map-reduce job. I got below error..
> >
> > java.io.IOException: java.lang.reflect.InvocationTargetException
> > at
> >
> >
> org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:383)
> >  at
> >
> >
> org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:360)
> > at
> >
> >
> org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:244)
> >  at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:187)
> > at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:164)
> >  at
> >
> >
> org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.getSplits(HiveHBaseTableInputFormat.java:442)
> > at
> >
> >
> org.apache.hadoop.hive.ql.io.HiveInputFormat.getSplits(HiveInputFormat.java:294)
> >  at
> >
> >
> org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getSplits(CombineHiveInputFormat.java:303)
> > at
> >
> >
> org.apache.hadoop.mapreduce.JobSubmitter.writeOldSplits(JobSubmitter.java:518)
> >  at
> >
> org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:510)
> > at
> >
> >
> org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:392)
> >  at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
> > at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
> >  at java.security.AccessController.doPrivileged(Native Method)
> > at javax.security.auth.Subject.doAs(Subject.java:396)
> >  at
> >
> >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
> > at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
> >  at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:562)
> > at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:557)
> >  at java.security.AccessController.doPrivileged(Native Method)
> > at javax.security.auth.Subject.doAs(Subject.java:396)
> >  at
> >
> >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
> > at
> org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:557)
> >  at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:548)
> > at
> > org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:425)
> >  at
> > org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:136)
> > at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:151)
> >  at
> >
> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:65)
> > at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1437)
> >  at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1215)
> > at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1043)
> >  at org.apache.hadoop.hive.ql.Driver.run(Driver.java:911)
> > at
> org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
> >  at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
> > at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
> >  at
> org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:781)
> > at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
> >  at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
> > at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >  at
> >
> >
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> > at
> >
> >
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >  at java.lang.reflect.Method.invoke(Method.java:597)
> > at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
> > Caused by: java.lang.reflect.InvocationTargetException
> > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> > at
> >
> >
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
> >  at
> >
> >
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
> > at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
> >  at
> >
> >
> org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:381)
> > ... 42 more
> > Caused by: java.lang.NoSuchMethodError:
> >
> >
> org.cloudera.htrace.Trace.startSpan(Ljava/lang/String;)Lorg/cloudera/htrace/TraceScope;
> > at
> >
> >
> org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.exists(RecoverableZooKeeper.java:196)
> >  at org.apache.hadoop.hbase.zookeeper.ZKUtil.checkExists(ZKUtil.java:479)
> > at
> >
> >
> org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:65)
> >  at
> >
> >
> org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:83)
> > at
> >
> >
> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.retrieveClusterId(HConnectionManager.java:794)
> >  at
> >
> >
> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.<init>(HConnectionManager.java:627)
> > ... 47 more
> >
> >
> >
> > On Thu, Dec 26, 2013 at 6:26 PM, Ted Yu <yu...@gmail.com> wrote:
> >
> > > The error was due to htrace jar missing in class path of the map reduce
> > > task.
> > >
> > > Cheers
> > >
> > > On Dec 26, 2013, at 4:05 AM, Vikas Parashar <
> > > vikas.parashar@fosteringlinux.com> wrote:
> > >
> > > > Hi,
> > > >
> > > > I am integrating hive(0.12) with hbase(0.96). Everything is working
> > fine
> > > > there but get stuck between two quires.
> > > >
> > > > When i create table or select * from table then it's working fine .
> > > > but in case of select count(*) from table it give me below error.
> > > >
> > > >
> > > > 2013-12-26 13:25:01,864 ERROR ql.Driver
> > > (SessionState.java:printError(419))
> > > > - FAILED: Execution Error, return code 2 from
> > > > org.apache.hadoop.hive.ql.exec.mr.MapRedTask
> > > > 2013-12-26 13:25:01,869 WARN  mapreduce.Counters
> > > > (AbstractCounters.java:getGroup(234)) - Group FileSystemCounters is
> > > > deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
> > > > 2013-12-26 14:25:44,119 WARN  mapreduce.JobSubmitter
> > > > (JobSubmitter.java:copyAndConfigureFiles(149)) - Hadoop command-line
> > > option
> > > > parsing not performed. Implement the Tool interface and execute your
> > > > application with ToolRunner to remedy this.
> > > > 2013-12-26 14:26:14,677 WARN  mapreduce.Counters
> > > > (AbstractCounters.java:getGroup(234)) - Group
> > > > org.apache.hadoop.mapred.Task$Counter is deprecated. Use
> > > > org.apache.hadoop.mapreduce.TaskCounter instead
> > > > 2013-12-26 14:26:33,613 WARN  mapreduce.Counters
> > > > (AbstractCounters.java:getGroup(234)) - Group
> > > > org.apache.hadoop.mapred.Task$Counter is deprecated. Use
> > > > org.apache.hadoop.mapreduce.TaskCounter instead
> > > > 2013-12-26 14:27:30,355 WARN  mapreduce.Counters
> > > > (AbstractCounters.java:getGroup(234)) - Group
> > > > org.apache.hadoop.mapred.Task$Counter is deprecated. Use
> > > > org.apache.hadoop.mapreduce.TaskCounter instead
> > > > 2013-12-26 14:27:32,479 WARN  mapreduce.Counters
> > > > (AbstractCounters.java:getGroup(234)) - Group
> > > > org.apache.hadoop.mapred.Task$Counter is deprecated. Use
> > > > org.apache.hadoop.mapreduce.TaskCounter instead
> > > > 2013-12-26 14:27:32,528 ERROR exec.Task
> > > (SessionState.java:printError(419))
> > > > - Ended Job = job_1388037394132_0013 with errors
> > > > 2013-12-26 14:27:32,530 ERROR exec.Task
> > > (SessionState.java:printError(419))
> > > > - Error during job, obtaining debugging information...
> > > > 2013-12-26 14:27:32,538 ERROR exec.Task
> > > (SessionState.java:printError(419))
> > > > - Examining task ID: task_1388037394132_0013_m_000000 (and more) from
> > job
> > > > job_1388037394132_0013
> > > > 2013-12-26 14:27:32,539 WARN  shims.HadoopShimsSecure
> > > > (Hadoop23Shims.java:getTaskAttemptLogUrl(72)) - Can't fetch tasklog:
> > > > TaskLogServlet is not supported in MR2 mode.
> > > > 2013-12-26 14:27:32,593 WARN  shims.HadoopShimsSecure
> > > > (Hadoop23Shims.java:getTaskAttemptLogUrl(72)) - Can't fetch tasklog:
> > > > TaskLogServlet is not supported in MR2 mode.
> > > > 2013-12-26 14:27:32,596 WARN  shims.HadoopShimsSecure
> > > > (Hadoop23Shims.java:getTaskAttemptLogUrl(72)) - Can't fetch tasklog:
> > > > TaskLogServlet is not supported in MR2 mode.
> > > > 2013-12-26 14:27:32,599 WARN  shims.HadoopShimsSecure
> > > > (Hadoop23Shims.java:getTaskAttemptLogUrl(72)) - Can't fetch tasklog:
> > > > TaskLogServlet is not supported in MR2 mode.
> > > > 2013-12-26 14:27:32,615 ERROR exec.Task
> > > (SessionState.java:printError(419))
> > > > -
> > > > Task with the most failures(4):
> > > > -----
> > > > Task ID:
> > > >  task_1388037394132_0013_m_000000
> > > >
> > > > URL:
> > > >
> > > >
> > >
> >
> http://ambari1.hadoop.com:8088/taskdetails.jsp?jobid=job_1388037394132_0013&tipid=task_1388037394132_0013_m_000000
> > > > -----
> > > > Diagnostic Messages for this Task:
> > > > Error: java.io.IOException: java.io.IOException:
> > > > java.lang.reflect.InvocationTargetException
> > > > at
> > > >
> > >
> >
> org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderCreationException(HiveIOExceptionHandlerChain.java:97)
> > > > at
> > > >
> > >
> >
> org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderCreationException(HiveIOExceptionHandlerUtil.java:57)
> > > > at
> > > >
> > >
> >
> org.apache.hadoop.hive.ql.io.HiveInputFormat.getRecordReader(HiveInputFormat.java:244)
> > > > at
> > > >
> > >
> >
> org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getRecordReader(CombineHiveInputFormat.java:538)
> > > > at
> > > >
> > >
> >
> org.apache.hadoop.mapred.MapTask$TrackedRecordReader.<init>(MapTask.java:167)
> > > > at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:408)
> > > > at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
> > > > at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)
> > > > at java.security.AccessController.doPrivileged(Native Method)
> > > > at javax.security.auth.Subject.doAs(Subject.java:396)
> > > > at
> > > >
> > >
> >
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
> > > > at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)
> > > > Caused by: java.io.IOException:
> > > java.lang.reflect.InvocationTargetException
> > > > at
> > > >
> > >
> >
> org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:383)
> > > > at
> > > >
> > >
> >
> org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:360)
> > > > at
> > > >
> > >
> >
> org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:244)
> > > > at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:187)
> > > > at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:164)
> > > > at
> > > >
> > >
> >
> org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.getRecordReader(HiveHBaseTableInputFormat.java:91)
> > > > at
> > > >
> > >
> >
> org.apache.hadoop.hive.ql.io.HiveInputFormat.getRecordReader(HiveInputFormat.java:241)
> > > > ... 9 more
> > > > Caused by: java.lang.reflect.InvocationTargetException
> > > > at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
> > Method)
> > > > at
> > > >
> > >
> >
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
> > > > at
> > > >
> > >
> >
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
> > > > at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
> > > > at
> > > >
> > >
> >
> org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:381)
> > > > ... 15 more
> > > > Caused by: java.lang.NoClassDefFoundError: org/cloudera/htrace/Trace
> > > > at
> > > >
> > >
> >
> org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.exists(RecoverableZooKeeper.java:196)
> > > > at
> > org.apache.hadoop.hbase.zookeeper.ZKUtil.checkExists(ZKUtil.java:479)
> > > > at
> > > >
> > >
> >
> org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:65)
> > > > at
> > > >
> > >
> >
> org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:83)
> > > > at
> > > >
> > >
> >
> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.retrieveClusterId(HConnectionManager.java:794)
> > > > at
> > > >
> > >
> >
> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.<init>(HConnectionManager.java:627)
> > > > ... 20 more
> > > > Caused by: java.lang.ClassNotFoundException:
> org.cloudera.htrace.Trace
> > > > at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
> > > > at java.security.AccessController.doPrivileged(Native Method)
> > > > at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
> > > > at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
> > > > at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
> > > > at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
> > > > ... 26 more
> > > >
> > > >
> > > > 2013-12-26 14:27:32,870 ERROR ql.Driver
> > > (SessionState.java:printError(419))
> > > > - FAILED: Execution Error, return code 2 from org.apache.
> > > >
> > > > I think this error is related with mapred job. Whenever my query use
> > the
> > > > map-R then i get error.
> > > >
> > > > Any idea!!
> > > >
> > > > --
> > > > Thanks & Regards:-
> > > > Vikas Parashar
> > > > Sr. Linux administrator Cum Developer
> > > > Mobile: +91 958 208 8852
> > > > Email: vikas.parashar@fosteringlinglinux.com
> > >
> >
> >
> >
> > --
> > Thanks & Regards:-
> > Vikas Parashar
> > Sr. Linux administrator Cum Developer
> > Mobile: +91 958 208 8852
> > Email: vikas.parashar@fosteringlinglinux.com
> >
>



-- 
Thanks & Regards:-
Vikas Parashar
Sr. Linux administrator Cum Developer
Mobile: +91 958 208 8852
Email: vikas.parashar@fosteringlinglinux.com

Re: hbase hive integration

Posted by Ted Yu <yu...@gmail.com>.
htrace 1.5 won't solve the problem.
htrace-core-2.01.jar, where TraceScope can be found, is included in the
0.96.1.1 tarball.

I searched under hbase-handler/src/java/org/apache/hadoop/hive/hbase but
didn't see the following method being called:
org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil.addHBaseDependencyJars

addHBaseDependencyJars() adds HBase and its dependencies (only) to the job
configuration. That is likely the reason for the errors you saw.
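Until that wiring exists, one common workaround is to register the HBase client jars (htrace included) with Hive yourself via HIVE_AUX_JARS_PATH (or hive.aux.jars.path), so Hive ships them with each map-reduce job. A minimal sketch, assuming the HBase jars live under /usr/lib/hbase/lib; the exact path and jar names depend on your install:

```shell
# Build a comma-separated list of the HBase client jars and export it so
# Hive adds them to the distributed cache of every MR job it submits.
HBASE_LIB=${HBASE_LIB:-/usr/lib/hbase/lib}
AUX_JARS=$(ls "$HBASE_LIB"/htrace-core-*.jar "$HBASE_LIB"/hbase-client-*.jar \
              "$HBASE_LIB"/hbase-common-*.jar "$HBASE_LIB"/hbase-protocol-*.jar \
              2>/dev/null | paste -sd, - || true)
export HIVE_AUX_JARS_PATH="$AUX_JARS"
echo "$HIVE_AUX_JARS_PATH"
```

After exporting, start the Hive CLI from the same shell and re-run the count(*) query; the task classpath should then contain org.cloudera.htrace.Trace.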

Cheers

On Thu, Dec 26, 2013 at 7:33 AM, Vikas Parashar <
vikas.parashar@fosteringlinux.com> wrote:

> Hi,
>
> if i copied htrace-1.50.jar in map-reduce job. I got below error..
>
> java.io.IOException: java.lang.reflect.InvocationTargetException
> at
>
> org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:383)
>  at
>
> org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:360)
> at
>
> org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:244)
>  at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:187)
> at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:164)
>  at
>
> org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.getSplits(HiveHBaseTableInputFormat.java:442)
> at
>
> org.apache.hadoop.hive.ql.io.HiveInputFormat.getSplits(HiveInputFormat.java:294)
>  at
>
> org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getSplits(CombineHiveInputFormat.java:303)
> at
>
> org.apache.hadoop.mapreduce.JobSubmitter.writeOldSplits(JobSubmitter.java:518)
>  at
> org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:510)
> at
>
> org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:392)
>  at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
> at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
>  at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:396)
>  at
>
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
> at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
>  at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:562)
> at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:557)
>  at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:396)
>  at
>
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
> at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:557)
>  at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:548)
> at
> org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:425)
>  at
> org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:136)
> at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:151)
>  at
> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:65)
> at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1437)
>  at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1215)
> at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1043)
>  at org.apache.hadoop.hive.ql.Driver.run(Driver.java:911)
> at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
>  at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
>  at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:781)
> at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
>  at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>  at
>
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> at
>
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>  at java.lang.reflect.Method.invoke(Method.java:597)
> at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
> Caused by: java.lang.reflect.InvocationTargetException
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at
>
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>  at
>
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>  at
>
> org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:381)
> ... 42 more
> Caused by: java.lang.NoSuchMethodError:
>
> org.cloudera.htrace.Trace.startSpan(Ljava/lang/String;)Lorg/cloudera/htrace/TraceScope;
> at
>
> org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.exists(RecoverableZooKeeper.java:196)
>  at org.apache.hadoop.hbase.zookeeper.ZKUtil.checkExists(ZKUtil.java:479)
> at
>
> org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:65)
>  at
>
> org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:83)
> at
>
> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.retrieveClusterId(HConnectionManager.java:794)
>  at
>
> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.<init>(HConnectionManager.java:627)
> ... 47 more
>
>
>
> On Thu, Dec 26, 2013 at 6:26 PM, Ted Yu <yu...@gmail.com> wrote:
>
> > The error was due to htrace jar missing in class path of the map reduce
> > task.
> >
> > Cheers
> >
> > On Dec 26, 2013, at 4:05 AM, Vikas Parashar <
>
>
>
> --
> Thanks & Regards:-
> Vikas Parashar
> Sr. Linux administrator Cum Developer
> Mobile: +91 958 208 8852
> Email: vikas.parashar@fosteringlinux.com
>

Re: hbase hive integration

Posted by Vikas Parashar <vi...@fosteringlinux.com>.
Hi,

After copying htrace-1.50.jar onto the map-reduce job's classpath, I now get the error below:

java.io.IOException: java.lang.reflect.InvocationTargetException
at
org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:383)
 at
org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:360)
at
org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:244)
 at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:187)
at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:164)
 at
org.apache.hadoop.hive.hbase.HiveHBaseTableInputFormat.getSplits(HiveHBaseTableInputFormat.java:442)
at
org.apache.hadoop.hive.ql.io.HiveInputFormat.getSplits(HiveInputFormat.java:294)
 at
org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getSplits(CombineHiveInputFormat.java:303)
at
org.apache.hadoop.mapreduce.JobSubmitter.writeOldSplits(JobSubmitter.java:518)
 at
org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:510)
at
org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:392)
 at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1268)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1265)
 at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
 at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1265)
 at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:562)
at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:557)
 at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
 at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:557)
 at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:548)
at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:425)
 at
org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:136)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:151)
 at
org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:65)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1437)
 at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1215)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1043)
 at org.apache.hadoop.hive.ql.Driver.run(Driver.java:911)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
 at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
 at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:781)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:675)
 at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:614)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
 at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
 at
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
 at
org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:381)
... 42 more
Caused by: java.lang.NoSuchMethodError:
org.cloudera.htrace.Trace.startSpan(Ljava/lang/String;)Lorg/cloudera/htrace/TraceScope;
at
org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.exists(RecoverableZooKeeper.java:196)
 at org.apache.hadoop.hbase.zookeeper.ZKUtil.checkExists(ZKUtil.java:479)
at
org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:65)
 at
org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:83)
at
org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.retrieveClusterId(HConnectionManager.java:794)
 at
org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.<init>(HConnectionManager.java:627)
... 47 more
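The switch from NoClassDefFoundError to NoSuchMethodError suggests the class is now on the classpath but with an incompatible API: HBase 0.96's client code was compiled against the htrace-core 2.x line (package org.cloudera.htrace, which has Trace.startSpan(String)), while htrace-1.50.jar carries an older API. A minimal sketch of selecting the matching jar by filename; the jar names here are hypothetical examples, not verified paths — check what your HBase install actually ships in its lib directory:

```shell
#!/bin/sh
# Sketch: given candidate htrace jars, select the 2.x one that matches
# what HBase 0.96's client classes expect. Jar names are assumed examples.
candidates="htrace-1.50.jar htrace-core-2.01.jar"

chosen=""
for jar in $candidates; do
  case "$jar" in
    # the 2.x line provides org.cloudera.htrace.Trace.startSpan(String)
    htrace-core-2.*.jar) chosen="$jar" ;;
  esac
done

echo "ship with the MR job: $chosen"
```

In practice, copying the exact jar out of HBase's own lib directory (rather than fetching htrace separately) avoids this kind of version mismatch.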



On Thu, Dec 26, 2013 at 6:26 PM, Ted Yu <yu...@gmail.com> wrote:

> The error was due to htrace jar missing in class path of the map reduce
> task.
>
> Cheers
>



-- 
Thanks & Regards:-
Vikas Parashar
Sr. Linux administrator Cum Developer
Mobile: +91 958 208 8852
Email: vikas.parashar@fosteringlinux.com

Re: hbase hive integration

Posted by Ted Yu <yu...@gmail.com>.
The error was due to the htrace jar missing from the class path of the map-reduce task.
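For reference, one common way to get such jars onto the task classpath is Hive's auxiliary-jars mechanism; a sketch follows, assuming a Hive 0.12 / HBase 0.96 layout — the file paths and jar versions below are illustrative assumptions, so adjust them to your install:

```xml
<!-- hive-site.xml (sketch; paths and versions are assumed, not verified).
     Jars listed in hive.aux.jars.path are added to the client classpath
     and shipped to map-reduce tasks via the distributed cache. -->
<property>
  <name>hive.aux.jars.path</name>
  <value>file:///usr/lib/hive/lib/hive-hbase-handler-0.12.0.jar,file:///usr/lib/hbase/lib/hbase-client-0.96.0.jar,file:///usr/lib/hbase/lib/hbase-common-0.96.0.jar,file:///usr/lib/hbase/lib/htrace-core-2.01.jar</value>
</property>
```

Equivalently, running ADD JAR /path/to/the/jar; at the start of the Hive session ships the jar with each submitted job.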

Cheers
