Posted to issues@hbase.apache.org by "stack (JIRA)" <ji...@apache.org> on 2013/08/18 17:45:52 UTC
[jira] [Comment Edited] (HBASE-8165) Move to Hadoop 2.1.0-beta from 2.0.x-alpha (WAS: Update our protobuf to 2.5 from 2.4.1)
[ https://issues.apache.org/jira/browse/HBASE-8165?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13743222#comment-13743222 ]
stack edited comment on HBASE-8165 at 8/18/13 3:45 PM:
-------------------------------------------------------
Running unit tests locally, a bunch fail. Needs investigation.
{code}
Results :
Failed tests: testCopyTable(org.apache.hadoop.hbase.mapreduce.TestCopyTable): expected:<0> but was:<1>
testStartStopRow(org.apache.hadoop.hbase.mapreduce.TestCopyTable): expected:<0> but was:<1>
testRenameFamily(org.apache.hadoop.hbase.mapreduce.TestCopyTable)
testExcludeMinorCompaction(org.apache.hadoop.hbase.mapreduce.TestHFileOutputFormat)
testMRIncrementalLoad(org.apache.hadoop.hbase.mapreduce.TestHFileOutputFormat)
testMRIncrementalLoadWithSplit(org.apache.hadoop.hbase.mapreduce.TestHFileOutputFormat)
testMultithreadedTableMapper(org.apache.hadoop.hbase.mapreduce.TestMultithreadedTableMapper)
testSimpleCase(org.apache.hadoop.hbase.mapreduce.TestImportExport)
testMetaExport(org.apache.hadoop.hbase.mapreduce.TestImportExport)
testExportScannerBatching(org.apache.hadoop.hbase.mapreduce.TestImportExport)
testWithFilter(org.apache.hadoop.hbase.mapreduce.TestImportExport)
testWithDeletes(org.apache.hadoop.hbase.mapreduce.TestImportExport)
testMROnTable(org.apache.hadoop.hbase.mapreduce.TestImportTsv): expected:<0> but was:<1>
testMROnTableWithCustomMapper(org.apache.hadoop.hbase.mapreduce.TestImportTsv): expected:<0> but was:<1>
testBulkOutputWithTsvImporterTextMapper(org.apache.hadoop.hbase.mapreduce.TestImportTsv): expected:<0> but was:<1>
testBulkOutputWithAnExistingTable(org.apache.hadoop.hbase.mapreduce.TestImportTsv): expected:<0> but was:<1>
testMROnTableWithTimestamp(org.apache.hadoop.hbase.mapreduce.TestImportTsv): expected:<0> but was:<1>
testBulkOutputWithoutAnExistingTable(org.apache.hadoop.hbase.mapreduce.TestImportTsv): expected:<0> but was:<1>
testRowCounterNoColumn(org.apache.hadoop.hbase.mapreduce.TestRowCounter)
testRowCounterHiddenColumn(org.apache.hadoop.hbase.mapreduce.TestRowCounter)
testRowCounterExclusiveColumn(org.apache.hadoop.hbase.mapreduce.TestRowCounter)
testCombiner(org.apache.hadoop.hbase.mapreduce.TestTableMapReduce)
testMultiRegionTable(org.apache.hadoop.hbase.mapreduce.TestTableMapReduce)
testScanEmptyToAPP(org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan1)
testScanEmptyToBBA(org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan1)
testScanEmptyToBBB(org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan1)
testScanEmptyToOPP(org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan1)
testScanEmptyToEmpty(org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan1)
testWALPlayer(org.apache.hadoop.hbase.mapreduce.TestWALPlayer): expected:<0> but was:<1>
testScanYZYToEmpty(org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan2)
testScanOPPToEmpty(org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan2)
testScanYYXToEmpty(org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan2)
testScanOBBToOPP(org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan2)
testScanOBBToQPP(org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan2)
testScanFromConfiguration(org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan2)
testScanYYYToEmpty(org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan2)
Tests in error:
org.apache.hadoop.hbase.client.TestSnapshotFromClient: Waiting for startup of standalone server
testMultiRegionTable(org.apache.hadoop.hbase.mapred.TestTableMapReduce): Job failed!
testCellCounter(org.apache.hadoop.hbase.mapreduce.TestCellCounter): target/test-data/output/part-r-00000 (No such file or directory)
Tests run: 1614, Failures: 36, Errors: 3, Skipped: 20
{code}
If anyone wants to help dig into any of the above, it would be appreciated.
> Move to Hadoop 2.1.0-beta from 2.0.x-alpha (WAS: Update our protobuf to 2.5 from 2.4.1)
> ---------------------------------------------------------------------------------------
>
> Key: HBASE-8165
> URL: https://issues.apache.org/jira/browse/HBASE-8165
> Project: HBase
> Issue Type: Bug
> Reporter: stack
> Assignee: stack
> Fix For: 0.98.0, 0.96.0
>
> Attachments: 8165_trunkv6.txt, 8165.txt, 8165v2.txt, 8165v3.txt, 8165v4.txt, 8165v5.txt, HBASE-8165-rebased.patch
>
>
> Update to new 2.5 pb. Some speedups and a new PARSER idiom that bypasses making a builder.
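The builder-bypassing PARSER idiom mentioned in the description can be sketched roughly as below. This is an illustrative stand-in, not HBase's actual generated classes: in protobuf 2.5-style generated Java, each message exposes a static PARSER that decodes bytes straight into the immutable message, whereas the older path allocated a Builder, merged into it, and then built.

```java
import java.nio.charset.StandardCharsets;

public class ParserIdiomDemo {

    // Stand-in for protobuf's Parser<T> interface.
    interface Parser<T> {
        T parseFrom(byte[] bytes);
    }

    // Pre-2.5 style: a mutable Builder that is merged into, then built.
    static final class Builder {
        private String payload = "";
        Builder mergeFrom(byte[] bytes) {
            this.payload = new String(bytes, StandardCharsets.UTF_8);
            return this;
        }
        Msg build() { return new Msg(payload); }
    }

    // Stand-in for a generated immutable message.
    static final class Msg {
        final String payload;
        private Msg(String payload) { this.payload = payload; }

        // Old idiom: allocate a Builder, merge the bytes, then build.
        static Msg parseViaBuilder(byte[] bytes) {
            return new Builder().mergeFrom(bytes).build();
        }

        // 2.5-style idiom: a shared static PARSER decodes directly,
        // skipping the intermediate Builder allocation entirely.
        static final Parser<Msg> PARSER =
            bytes -> new Msg(new String(bytes, StandardCharsets.UTF_8));
    }

    public static void main(String[] args) {
        byte[] wire = "hello".getBytes(StandardCharsets.UTF_8);
        // Both paths decode the same payload; the PARSER path avoids
        // constructing a throwaway Builder per message.
        System.out.println(
            Msg.parseViaBuilder(wire).payload.equals(Msg.PARSER.parseFrom(wire).payload));
    }
}
```

On hot parse paths (e.g. RPC deserialization), skipping one Builder allocation per message is where the speedups the description mentions would come from.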
--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators.
For more information on JIRA, see: http://www.atlassian.com/software/jira