Posted to dev@hbase.apache.org by Ted Yu <yu...@gmail.com> on 2016/09/05 22:43:53 UTC

Re: There are some errors with hbase java example

Xi:
Thanks for spotting these.

Snappy requires some native-library setup - setting the compression type to another algorithm would be easier for first-time users.

Please open a JIRA to modify refguide.

Cheers


On Mon, Sep 5, 2016 at 12:31 PM, Xi Yang <al...@gmail.com> wrote:

> Hi,
>
> There are some errors in the HBase Java examples:
> https://hbase.apache.org/book.html#_examples
>
> 1. if (admin.tableExists(tableName)) {
>
>         System.out.println("Table does not exist.");
>         System.exit(-1);
>       }
>
> If you write the code like this, you will find that it enters this block
> whenever the table exists, printing "Table does not exist." for a table
> that is actually there.
>
> So we should change it into
>
> if ( *!*admin.tableExists(tableName)) {....}
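As a self-contained illustration of the inverted check (plain Java, with a hypothetical `tableExists` helper backed by a `Set` standing in for the cluster's table list, since the real `Admin.tableExists` needs a running HBase):

```java
import java.util.Set;

public class ExistsCheck {
    // Hypothetical stand-in for the cluster: it "has" exactly one table.
    static final Set<String> TABLES = Set.of("my_table");

    // Stand-in for Admin.tableExists(tableName).
    static boolean tableExists(String name) {
        return TABLES.contains(name);
    }

    public static void main(String[] args) {
        // Refguide's buggy guard: the branch fires exactly when the table
        // DOES exist, so it would report "Table does not exist." and bail
        // out for a perfectly healthy table.
        if (tableExists("my_table")) {
            System.out.println("buggy guard fired for an existing table");
        }
        // Corrected guard: bail out only when the table is really missing.
        if (!tableExists("no_such_table")) {
            System.out.println("Table does not exist.");
        }
    }
}
```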
>
>
>
> 2. table.addFamily(new HColumnDescriptor(CF_DEFAULT).setCompressionType(Algorithm.SNAPPY));
>
> Some people may run into this exception, as I did:
>
> Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.DoNotRetryIOException): org.apache.hadoop.hbase.DoNotRetryIOException:
> Compression algorithm 'snappy' previously failed test. Set
> hbase.table.sanity.checks to false at conf or table descriptor if you want
> to bypass sanity checks
> at org.apache.hadoop.hbase.master.HMaster.warnOrThrowExceptionForFailure(HMaster.java:1701)
> at org.apache.hadoop.hbase.master.HMaster.sanityCheckTableDescriptor(HMaster.java:1569)
> at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1491)
> at org.apache.hadoop.hbase.master.MasterRpcServices.createTable(MasterRpcServices.java:462)
> at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:55682)
> at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2178)
> at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:112)
> at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
> at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
> at java.lang.Thread.run(Thread.java:745)
>
> So I think we should change this line into
> table.addFamily(new HColumnDescriptor(CF_DEFAULT).setCompressionType(Algorithm.*NONE*));
>
> 3. // Update existing column family
>
>       HColumnDescriptor existingColumn = new HColumnDescriptor(CF_DEFAULT);
>       existingColumn.setCompactionCompressionType(Algorithm.GZ);
>       existingColumn.setMaxVersions(HConstants.ALL_VERSIONS);
>       table.modifyFamily(existingColumn);
>       admin.modifyTable(tableName, table);
>
> You will get this exception
>
> Exception in thread "main" java.lang.IllegalArgumentException: Column
> family 'DEFAULT_COLUMN_FAMILY' does not exist
> at org.apache.hadoop.hbase.HTableDescriptor.modifyFamily(HTableDescriptor.java:893)
> at org.alex.hbasetest.Example.modifySchema(Example.java:66)
> at org.alex.hbasetest.Example.main(Example.java:88)
>
> This is because we should fetch the existing HTableDescriptor via Admin first. Change
> HTableDescriptor table = new HTableDescriptor(tableName);
> into
> HTableDescriptor table = *admin.getTableDescriptor(tableName)*;
>
> 4. Some other places that are not errors, but that I think need improvement:
>
> try *(*Connection connection = ConnectionFactory.createConnection(config);
>          Admin admin = connection.getAdmin()*)* *{*
>
>       HTableDescriptor table = new HTableDescriptor(TableName.valueOf(TABLE_NAME));
>       table.addFamily(new HColumnDescriptor(CF_DEFAULT).setCompressionType(Algorithm.SNAPPY));
>
>       System.out.print("Creating table. ");
>       createOrOverwrite(admin, table);
>       System.out.println(" Done.");
>     *}*
>
> I was confused by this code and kept wondering why it is written like this.
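For readers who hit the same confusion: that construct is Java 7's try-with-resources statement. Each resource declared inside the parentheses must implement AutoCloseable, and each is closed automatically, in reverse declaration order, when the block exits, whether normally or via an exception. That is why the example never calls connection.close() or admin.close() explicitly. A minimal stdlib sketch of the mechanism (the Tracker class is a hypothetical stand-in for HBase's Connection and Admin):

```java
import java.util.ArrayList;
import java.util.List;

public class TryWithResources {
    static final List<String> closed = new ArrayList<>();

    // Hypothetical stand-in for Connection/Admin; both are AutoCloseable.
    static class Tracker implements AutoCloseable {
        final String name;
        Tracker(String name) { this.name = name; }
        @Override public void close() { closed.add(name); }
    }

    public static void main(String[] args) {
        try (Tracker connection = new Tracker("connection");
             Tracker admin = new Tracker("admin")) {
            // Use connection and admin here; no explicit close() needed.
        }
        // close() ran automatically, in reverse declaration order:
        System.out.println(closed);  // prints [admin, connection]
    }
}
```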
>
>
> Thanks,
> Alex