Posted to user@hbase.apache.org by yeshwanth kumar <ye...@gmail.com> on 2014/09/02 19:25:41 UTC

java.util.concurrent.ExecutionException

Hi, I am running HBase 0.94.20 on Hadoop 2.2.0.

I am working on a MapReduce job that reads input from a table and writes the
processed results back to that table and to another table;
I am using the MultiTableOutputFormat class for that.
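For context, this is roughly how such a job is wired up against the 0.94-era API (a minimal sketch; the table, column family, and qualifier names here are made up for illustration, not taken from the actual job):

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.MultiTableOutputFormat;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.mapreduce.TableMapper;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.mapreduce.Job;

public class MultiTableJobSketch {

  static class ExampleMapper extends TableMapper<ImmutableBytesWritable, Put> {
    // Hypothetical table names: the output key selects the destination table.
    private static final ImmutableBytesWritable SOURCE =
        new ImmutableBytesWritable(Bytes.toBytes("source_table"));
    private static final ImmutableBytesWritable SECONDARY =
        new ImmutableBytesWritable(Bytes.toBytes("secondary_table"));

    @Override
    protected void map(ImmutableBytesWritable row, Result value, Context context)
        throws IOException, InterruptedException {
      // Write the processed result back to the source table...
      Put back = new Put(row.get());
      back.add(Bytes.toBytes("cf"), Bytes.toBytes("processed"), Bytes.toBytes("..."));
      context.write(SOURCE, back);
      // ...and a second Put to the other table.
      Put other = new Put(row.get());
      other.add(Bytes.toBytes("cf"), Bytes.toBytes("copy"), Bytes.toBytes("..."));
      context.write(SECONDARY, other);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = HBaseConfiguration.create();
    Job job = new Job(conf, "multi-table-example");
    job.setJarByClass(MultiTableJobSketch.class);
    // Reads from one table; MultiTableOutputFormat routes each Put by its key.
    TableMapReduceUtil.initTableMapperJob("source_table", new Scan(),
        ExampleMapper.class, ImmutableBytesWritable.class, Put.class, job);
    job.setOutputFormatClass(MultiTableOutputFormat.class);
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```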

While running the MapReduce job I encounter the exception below, and as a
result the region server crashes.

2014-09-02 07:56:47,790 WARN [main] org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation: Failed all from region=crawl_webpage,,1408774462347.a311e4aed343baf54f49ac6519d0bbe8., hostname=localhost, port=60020
java.util.concurrent.ExecutionException: java.io.IOException: Call to localhost/127.0.0.1:60020 failed on local exception: java.io.EOFException
    at java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.util.concurrent.FutureTask.get(FutureTask.java:188)
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.processBatchCallback(HConnectionManager.java:1708)
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.processBatch(HConnectionManager.java:1560)
    at org.apache.hadoop.hbase.client.HTable.flushCommits(HTable.java:994)
    at org.apache.hadoop.hbase.client.HTable.doPut(HTable.java:850)
    at org.apache.hadoop.hbase.client.HTable.put(HTable.java:826)
    at org.apache.hadoop.hbase.mapreduce.MultiTableOutputFormat$MultiTableRecordWriter.write(MultiTableOutputFormat.java:132)
    at org.apache.hadoop.hbase.mapreduce.MultiTableOutputFormat$MultiTableRecordWriter.write(MultiTableOutputFormat.java:68)
    at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:634)
    at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
    at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
    at com.serendio.icvs.analysis.text.EntitySearcherMR$EntitySearcherMapper.map(EntitySearcherMR.java:119)
    at com.serendio.icvs.analysis.text.EntitySearcherMR$EntitySearcherMapper.map(EntitySearcherMR.java:33)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:763)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:339)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)
Caused by: java.io.IOException: Call to localhost/127.0.0.1:60020 failed on local exception: java.io.EOFException
    at org.apache.hadoop.hbase.ipc.HBaseClient.wrapException(HBaseClient.java:1047)
    at org.apache.hadoop.hbase.ipc.HBaseClient.call(HBaseClient.java:1016)
    at org.apache.hadoop.hbase.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:87)
    at com.sun.proxy.$Proxy12.multi(Unknown Source)
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation$3$1.call(HConnectionManager.java:1537)
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation$3$1.call(HConnectionManager.java:1535)
    at org.apache.hadoop.hbase.client.ServerCallable.withoutRetries(ServerCallable.java:229)
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation$3.call(HConnectionManager.java:1544)
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation$3.call(HConnectionManager.java:1532)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.EOFException
    at java.io.DataInputStream.readInt(DataInputStream.java:392)
    at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.receiveResponse(HBaseClient.java:654)
    at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.run(HBaseClient.java:588)

Any suggestions for overcoming this issue?

thanks,
yeshwanth

Re: java.util.concurrent.ExecutionException

Posted by yeshwanth kumar <ye...@gmail.com>.
Hi Ted,

I verified both the region server and HMaster logs.
No exception appears in the region server log,
but when the region server dies I see these messages in the HMaster log:

2014-09-03 03:50:37,496 DEBUG org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation: Looked up root region location, connection=org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@68163524; serverName=localhost,60020,1409740435608
2014-09-03 03:50:37,497 DEBUG org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation: locateRegionInMeta parentTable=-ROOT-, metaLocation={region=-ROOT-,,0.70236052, hostname=localhost, port=60020}, attempt=15 of 140 failed; retrying after sleep of 64605 because: Connection refused
2014-09-03 03:50:37,498 DEBUG org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation: Looked up root region location, connection=org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@68163524; serverName=localhost,60020,1409740435608
2014-09-03 03:50:37,538 DEBUG org.apache.hadoop.hbase.master.SplitLogManager: total tasks = 13 unassigned = 13
2014-09-03 03:50:38,539 DEBUG org.apache.hadoop.hbase.master.SplitLogManager: total tasks = 13 unassigned = 13
2014-09-03 03:50:39,539 DEBUG org.apache.hadoop.hbase.master.SplitLogManager: total tasks = 13 unassigned = 13
2014-09-03 03:50:40,539 DEBUG org.apache.hadoop.hbase.master.SplitLogManager: total tasks = 13 unassigned = 13
[...]

I thought the issue was with hbase.regionserver.lease.period and increased it
to 5 minutes, but the behavior is the same.
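For reference, that lease setting is specified in milliseconds in hbase-site.xml, so 5 minutes would look like the first property below. Note that this lease only covers client scanner/row-lock leases; if the region server process itself is dying (for example, losing its ZooKeeper session during a long GC pause), the second property is usually the relevant knob instead. A sketch (values are illustrative, not recommendations):

```xml
<!-- hbase-site.xml -->
<!-- Client scanner/row-lock lease, in milliseconds (5 minutes). -->
<property>
  <name>hbase.regionserver.lease.period</name>
  <value>300000</value>
</property>
<!-- ZooKeeper session timeout; governs how long a region server can be
     unresponsive before the master declares it dead. -->
<property>
  <name>zookeeper.session.timeout</name>
  <value>180000</value>
</property>
```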

While starting the region server, these messages are logged:

2014-09-03 03:59:14,833 DEBUG org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation: Looked up root region location, connection=org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@1487380e; serverName=localhost,60020,1409741944897
2014-09-03 03:59:14,835 DEBUG org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation: Looked up root region location, connection=org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@1487380e; serverName=localhost,60020,1409741944897
2014-09-03 03:59:14,838 INFO org.apache.hadoop.hbase.catalog.CatalogTracker: Failed verification of .META.,,1 at address=localhost,60020,1409741527395; org.apache.hadoop.hbase.NotServingRegionException: org.apache.hadoop.hbase.NotServingRegionException: Region is not online: .META.,,1
    at org.apache.hadoop.hbase.regionserver.HRegionServer.getRegion(HRegionServer.java:3588)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.getRegionInfo(HRegionServer.java:2190)
    at sun.reflect.GeneratedMethodAccessor25.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.hbase.ipc.WritableRpcEngine$Server.call(WritableRpcEngine.java:323)
    at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1434)

After several attempts the region comes online.
Any idea what's causing this?

-yeshwanth


On Wed, Sep 3, 2014 at 2:45 AM, Ted Yu <yu...@gmail.com> wrote:

> Have you checked the region server (on the same node as the mapper) log to
> see if there was anything special around 07:56 ?
>
> Cheers
>
>
> [earlier quoted messages trimmed]

Re: java.util.concurrent.ExecutionException

Posted by Ted Yu <yu...@gmail.com>.
Have you checked the region server (on the same node as the mapper) log to
see if there was anything special around 07:56 ?

Cheers


On Tue, Sep 2, 2014 at 10:36 AM, yeshwanth kumar <ye...@gmail.com>
wrote:

> Hi Ted,
>
> Configuration is good; I have a couple of MapReduce jobs running on HBase
> without such an issue.
> Going through the job logs, I noticed that this exception shows up after
> some rows are processed.
>
> -yeshwanth
>
> [earlier quoted messages trimmed]

Re: java.util.concurrent.ExecutionException

Posted by yeshwanth kumar <ye...@gmail.com>.
Hi Ted,

Configuration is good; I have a couple of MapReduce jobs running on HBase
without such an issue.
Going through the job logs, I noticed that this exception shows up after
some rows are processed.

-yeshwanth


On Tue, Sep 2, 2014 at 10:59 PM, Ted Yu <yu...@gmail.com> wrote:

> bq. Call to localhost/127.0.0.1:60020 failed
>
> Can you check whether configuration from hbase-site.xml is correctly passed
> to your mapper ?
>
> Cheers
>
> [original message trimmed]

Re: java.util.concurrent.ExecutionException

Posted by Ted Yu <yu...@gmail.com>.
bq. Call to localhost/127.0.0.1:60020 failed

Can you check whether the configuration from hbase-site.xml is correctly
passed to your mapper?

Cheers
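One quick way to check this (a sketch; it assumes the standard TableMapper setup and only uses the Hadoop Configuration API) is to log the relevant keys in the mapper's setup() and compare them with your hbase-site.xml:

```java
// Inside the mapper class; a config sanity check, written against the
// Hadoop/HBase 0.94-era API.
@Override
protected void setup(Context context) {
  org.apache.hadoop.conf.Configuration conf = context.getConfiguration();
  // If hbase-site.xml was not on the job's classpath, these fall back to
  // client defaults (quorum "localhost"), which would match the failing
  // localhost/127.0.0.1:60020 calls in the trace.
  System.err.println("hbase.zookeeper.quorum = "
      + conf.get("hbase.zookeeper.quorum"));
  System.err.println("hbase.zookeeper.property.clientPort = "
      + conf.get("hbase.zookeeper.property.clientPort"));
}
```

The printed values land in the task's stderr log, viewable from the job history UI.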


On Tue, Sep 2, 2014 at 10:25 AM, yeshwanth kumar <ye...@gmail.com>
wrote:

> [original message and stack trace trimmed; see the first message in this
> thread]