Posted to user@accumulo.apache.org by Scott Roberts <sc...@jhu.edu> on 2012/02/17 08:00:45 UTC

Error executing the Bulk Ingest Example

All,

I'm getting a stack trace when I run the Bulk Ingest example and the entries never get added to the test_bulk table.  I'm using the example straight from the documentation, substituting the appropriate values for instance, zookeepers, username, and password:

./bin/tool.sh lib/accumulo-examples-*[^c].jar org.apache.accumulo.examples.mapreduce.bulk.BulkIngestExample <instance> <zookeepers> <username> <PW> test_bulk bulk tmp/bulkWork
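From the trace below, the failure happens at the very end of the job, where the example hands the MapReduce output directory over to the client API for import. Roughly, that last step amounts to the following (just an illustrative sketch, not the example's actual source; it assumes the four-argument importDirectory(table, dir, failuresDir, setTime) form of the call, and the signature differs between releases):

// Illustrative sketch of the example's final import step (not the real code).
// With the command above, workDir is the relative path "tmp/bulkWork".
import org.apache.accumulo.core.client.Connector;
import org.apache.accumulo.core.client.ZooKeeperInstance;

public class ImportStepSketch {
    public static void main(String[] args) throws Exception {
        Connector conn = new ZooKeeperInstance("<instance>", "<zookeepers>")
                .getConnector("<username>", "<PW>".getBytes());
        String workDir = "tmp/bulkWork"; // relative, exactly as passed on the command line
        conn.tableOperations().importDirectory(
                "test_bulk", workDir + "/files", workDir + "/failures", false);
    }
}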

------
Mapreduce log snippet:

12/02/17 01:45:58 INFO input.FileInputFormat: Total input paths to process : 1
12/02/17 01:45:59 INFO mapred.JobClient: Running job: job_201202162354_0003
12/02/17 01:46:00 INFO mapred.JobClient:  map 0% reduce 0%
12/02/17 01:46:13 INFO mapred.JobClient:  map 100% reduce 0%
12/02/17 01:46:25 INFO mapred.JobClient:  map 100% reduce 100%
12/02/17 01:46:30 INFO mapred.JobClient: Job complete: job_201202162354_0003
12/02/17 01:46:30 INFO mapred.JobClient: Counters: 25
12/02/17 01:46:30 INFO mapred.JobClient:   Job Counters
12/02/17 01:46:30 INFO mapred.JobClient:     Launched reduce tasks=3
12/02/17 01:46:30 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=12712
12/02/17 01:46:30 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
12/02/17 01:46:30 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
12/02/17 01:46:30 INFO mapred.JobClient:     Rack-local map tasks=1
12/02/17 01:46:30 INFO mapred.JobClient:     Launched map tasks=1
12/02/17 01:46:30 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=30570
12/02/17 01:46:30 INFO mapred.JobClient:   File Output Format Counters
12/02/17 01:46:30 INFO mapred.JobClient:     Bytes Written=5552
12/02/17 01:46:30 INFO mapred.JobClient:   FileSystemCounters
12/02/17 01:46:30 INFO mapred.JobClient:     FILE_BYTES_READ=30018
12/02/17 01:46:30 INFO mapred.JobClient:     HDFS_BYTES_READ=28111
12/02/17 01:46:30 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=181390
12/02/17 01:46:30 INFO mapred.JobClient:     HDFS_BYTES_WRITTEN=5552
12/02/17 01:46:30 INFO mapred.JobClient:   File Input Format Counters
12/02/17 01:46:30 INFO mapred.JobClient:     Bytes Read=28000
12/02/17 01:46:30 INFO mapred.JobClient:   Map-Reduce Framework
12/02/17 01:46:30 INFO mapred.JobClient:     Reduce input groups=1000
12/02/17 01:46:30 INFO mapred.JobClient:     Map output materialized bytes=30018
12/02/17 01:46:30 INFO mapred.JobClient:     Combine output records=0
12/02/17 01:46:30 INFO mapred.JobClient:     Map input records=1000
12/02/17 01:46:30 INFO mapred.JobClient:     Reduce shuffle bytes=0
12/02/17 01:46:30 INFO mapred.JobClient:     Reduce output records=1000
12/02/17 01:46:30 INFO mapred.JobClient:     Spilled Records=2000
12/02/17 01:46:30 INFO mapred.JobClient:     Map output bytes=28000
12/02/17 01:46:30 INFO mapred.JobClient:     Combine input records=0
12/02/17 01:46:30 INFO mapred.JobClient:     Map output records=1000
12/02/17 01:46:30 INFO mapred.JobClient:     SPLIT_RAW_BYTES=111
12/02/17 01:46:30 INFO mapred.JobClient:     Reduce input records=1000
12/02/17 01:46:30 ERROR util.BulkImportHelper: org.apache.thrift.TApplicationException: prepareBulkImport failed: unknown result
org.apache.thrift.TApplicationException: prepareBulkImport failed: unknown result
        at org.apache.accumulo.core.client.impl.thrift.ClientService$Client.recv_prepareBulkImport(ClientService.java:245)
        at org.apache.accumulo.core.client.impl.thrift.ClientService$Client.prepareBulkImport(ClientService.java:206)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at cloudtrace.instrument.thrift.TraceWrap$2.invoke(TraceWrap.java:83)
        at $Proxy1.prepareBulkImport(Unknown Source)
        at org.apache.accumulo.core.util.BulkImportHelper.importDirectory(BulkImportHelper.java:152)
        at org.apache.accumulo.core.client.admin.TableOperationsImpl.importDirectory(TableOperationsImpl.java:717)
        at org.apache.accumulo.examples.mapreduce.bulk.BulkIngestExample.run(BulkIngestExample.java:143)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.accumulo.examples.mapreduce.bulk.BulkIngestExample.main(BulkIngestExample.java:161)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
12/02/17 01:46:30 ERROR util.BulkImportHelper: prepareBulkImport failed: unknown result
Exception in thread "main" java.lang.RuntimeException: java.lang.RuntimeException: prepareBulkImport failed: unknown result
        at org.apache.accumulo.examples.mapreduce.bulk.BulkIngestExample.run(BulkIngestExample.java:146)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.accumulo.examples.mapreduce.bulk.BulkIngestExample.main(BulkIngestExample.java:161)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: java.lang.RuntimeException: prepareBulkImport failed: unknown result
        at org.apache.accumulo.core.util.BulkImportHelper.importDirectory(BulkImportHelper.java:160)
        at org.apache.accumulo.core.client.admin.TableOperationsImpl.importDirectory(TableOperationsImpl.java:717)
        at org.apache.accumulo.examples.mapreduce.bulk.BulkIngestExample.run(BulkIngestExample.java:143)
        ... 7 more


------
Monitor log snippet:

17 01:46:30,364 [client.ClientServiceHandler] ERROR: tserver:compute-0-0.local Error preparing bulk import directory tmp/bulkWork/files
java.lang.NullPointerException
        at org.apache.accumulo.server.client.ClientServiceHandler.prepareBulkImport(ClientServiceHandler.java:268)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at cloudtrace.instrument.thrift.TraceWrap$1.invoke(TraceWrap.java:58)
        at $Proxy2.prepareBulkImport(Unknown Source)
        at org.apache.accumulo.core.client.impl.thrift.ClientService$Processor$prepareBulkImport.process(ClientService.java:984)
        at org.apache.accumulo.core.tabletserver.thrift.TabletClientService$Processor.process(TabletClientService.java:904)
        at org.apache.accumulo.server.util.TServerUtils$TimedProcessor.process(TServerUtils.java:141)
        at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:253)
        at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
        at java.lang.Thread.run(Thread.java:662)
------

I created the table and test data with no issues.

Re: Error executing the Bulk Ingest Example

Posted by Eric Newton <er...@gmail.com>.
Please use full pathnames for the bulk directories:

./bin/tool.sh lib/accumulo-examples-*[^c].jar org.apache.accumulo.examples.mapreduce.bulk.BulkIngestExample <instance> <zookeepers> <username> <PW> test_bulk /tmp/bulk /tmp/bulkWork

-Eric
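
To make the full-path advice concrete, here is a minimal, hypothetical sketch of the same import done directly against the client API with fully-qualified directories (again assuming the four-argument importDirectory form; the instance, credential, and path values are placeholders to adapt):

// Hypothetical sketch: bulk import using fully-qualified HDFS paths.
import org.apache.accumulo.core.client.Connector;
import org.apache.accumulo.core.client.ZooKeeperInstance;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class FullPathBulkImport {
    public static void main(String[] args) throws Exception {
        // Absolute locations for the MapReduce output and the failures directory.
        String filesDir = "/tmp/bulkWork/files";
        String failuresDir = "/tmp/bulkWork/failures";

        // The failures directory generally needs to exist (and be empty) before the import.
        FileSystem fs = FileSystem.get(new Configuration());
        fs.mkdirs(new Path(failuresDir));

        Connector conn = new ZooKeeperInstance("<instance>", "<zookeepers>")
                .getConnector("<username>", "<PW>".getBytes());
        conn.tableOperations().importDirectory("test_bulk", filesDir, failuresDir, false);
    }
}

Relative HDFS paths such as tmp/bulkWork are resolved against the current user's home directory, so the client and the tablet server process may end up looking at different locations; spelling the directories out absolutely removes that ambiguity.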
