Posted to commits@hbase.apache.org by nd...@apache.org on 2020/03/04 17:13:02 UTC

[hbase] branch HBASE-23876/jdk11-nightly-master updated (eb08237 -> bd362a4)

This is an automated email from the ASF dual-hosted git repository.

ndimiduk pushed a change to branch HBASE-23876/jdk11-nightly-master
in repository https://gitbox.apache.org/repos/asf/hbase.git.


    omit eb08237  [DO NOT MERGE] include YETUS-943
    omit ec13f95  HBASE-23876 Add JDK11 compilation and unit test support to nightly job
    omit fc21450  HBASE-23767 Add JDK11 compilation and unit test support to Github precommit
     add 9f223c2  HBASE-23781 Removed deprecated createTableDescriptor(String) from HBaseTestingUtility
     add ecbed33  HBASE-23146 Support CheckAndMutate with multiple conditions (#1114)
     add 3c3aae9  HBASE-23740 Invalid StoreFile WARN log message printed for recovered.… (#1198)
     add b24ea32  HBASE-22978 : Online slow response log
     add 00ef6c6  HBASE-23892 SecureTestCluster should allow its subclasses to pass their Class reference on HBaseKerberosUtils.setSSLConfiguration (#1207)
     add 6777e2c  HBASE-23899 [Flakey Test] Stabilizations and Debug
     add a420f04  HBASE-23904 Procedure updating meta and Master shutdown are incompatible: CODE-BUG
     add 48a3ccf  HBASE-23914 : Set hbase.client.retries.number for TestThriftHBaseServiceHandler (#1227)
     add fbdaa21  HBASE-22664 Move protobuf stuff in hbase-rsgroup to hbase-protocol-shaded (#362)
     add 662149e  HBASE-22662 Move RSGroupInfoManager to hbase-server (#368)
     add f0a13bb  HBASE-22676 Move all the code in hbase-rsgroup to hbase-server and remove hbase-rsgroup module (#399)
     add 3709f33  HBASE-22695 Store the rsgroup of a table in table configuration (#426)
     add 44ddb30  HBASE-22809 Allow creating table in group when rs group contains no live servers (#464)
     add 2e0039c  HBASE-22820 Do not need to persist default rs group now (#482)
     add b3399ef  HBASE-22819 Automatically migrate the rs group config for table after HBASE-22695 (#498)
     add 2bf2781  HBASE-22729 Start RSGroupInfoManager as default (#555)
     add 0e29d62  HBASE-22987 Calculate the region servers in default group in foreground (#599)
     add b10b39a  HBASE-22971 Deprecated RSGroupAdminEndpoint and make RSGroup feature always enabled (#595)
     add 0f4a87c  HBASE-23081 Add an option to enable/disable rs group feature (#691)
     add e12064b  HBASE-23232 Remove rsgroup profile from pom.xml of hbase-assembly (#779)
     add e8e9eec  HBASE-23050 Use RSGroupInfoManager to get rsgroups in master UI's rsgroup part (#776)
     add 72cbb12  HBASE-22932 Add rs group management methods in Admin and AsyncAdmin (#657)
     add c8d892c  HBASE-23253 Rewrite rsgroup related UTs with the new methods introduced in HBASE-22932 (#813)
     add 5de0a5e  HBASE-23235 Re-enable TestRSGroupsKillRS.testLowerMetaGroupVersion (#1117)
     add 7386369  HBASE-23276 Add admin methods to get tables within a group (#1118)
     add 37e87ae  HBASE-23807 Make rsgroup related shell command to use the new admin methods (#1148)
     add 7f2d823  HBASE-23818 Cleanup the remaining RSGroupInfo.getTables call in the code base (#1152)
     add 420e380  HBASE-23890 Update the rsgroup section in our ref guide (#1206)
     add 346d087  HBASE-23911 Attach the new rsgroup implementation design doc to our code base (#1224)
     add c21b28f  HBASE-23743 Add HBase 1.4.13 to the downloads page (#1226)
     add c0301e3  HBASE-23868 : Replace usages of HColumnDescriptor(byte [] familyName)… (#1222)
     add 5dcbe68  HBASE-23912 Resolve the TODO of FSTableDescriptor's construct method (#1225)
     add 72adfff  HBASE-23920 Pass --copy-to argument in ExportSnapshot tests (#1234)
     add f411e39  HBASE-22978 : Online slow response log (#1228) Addendum to fix errorprone ERROR
     add 29ea96f  HBASE-23921 Findbugs is OOM on master
     add 6d9802f  HBASE-23861. Reconcile Hadoop version. (#1179)
     new 31f8086  HBASE-23767 Add JDK11 compilation and unit test support to Github precommit
     new bd362a4  HBASE-23876 Add JDK11 compilation and unit test support to nightly job

This update added new revisions after undoing existing revisions.
That is to say, some revisions that were in the old version of the
branch are not in the new version.  This situation occurs when a
user force-pushes (--force) a change, rewriting the branch so that
the repository contains something like this:

 * -- * -- B -- O -- O -- O   (eb08237)
            \
             N -- N -- N   refs/heads/HBASE-23876/jdk11-nightly-master (bd362a4)

You should already have received notification emails for all of the O
revisions, and so the following emails describe only the N revisions
from the common base, B.

Any revisions marked "omit" are not gone; other references still
refer to them.  Any revisions marked "discard" are gone forever.

The 2 revisions listed above as "new" are entirely new to this
repository and will be described in separate emails.  The revisions
listed as "add" were already present in the repository and have only
been added to this reference.
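
For anyone who wants to inspect the rewrite locally, the two sides of
the force push can be compared with plain git. A minimal sketch,
assuming a clone that still has the old tip (eb08237) available in
addition to the new tip (bd362a4); the ref and commit ids are the ones
listed above:

    # refresh the rewritten branch
    git fetch origin HBASE-23876/jdk11-nightly-master
    # revisions reachable only from the old tip (the "omit" set above)
    git log --oneline bd362a4..eb08237
    # revisions reachable only from the new tip (the "new" and "add" sets above)
    git log --oneline eb08237..bd362a4
    # commit-by-commit comparison of the two ranges (requires git 2.19 or newer)
    git range-diff eb08237...bd362a4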


Summary of changes:
 .editorconfig                                      |    1 +
 dev-support/Jenkinsfile                            |   27 +-
 dev-support/Jenkinsfile_GitHub                     |  463 +++++---
 ...roup-feature-and-move-it-into-core-of-HBase.pdf |  Bin 0 -> 27888 bytes
 dev-support/docker/Dockerfile                      |   15 +-
 dev-support/hbase_nightly_yetus.sh                 |    2 +-
 dev-support/jenkins_precommit_github_yetus.sh      |  144 +++
 .../hbase/testclassification/RSGroupTests.java     |   22 +-
 .../exemplars/shaded_client/HelloHBase.java        |   10 +-
 hbase-assembly/pom.xml                             |   17 -
 hbase-assembly/src/main/assembly/components.xml    |    8 -
 .../src/main/assembly/hadoop-two-compat.xml        |    1 -
 .../hadoop/hbase/backup/util/BackupUtils.java      |    3 +-
 .../apache/hadoop/hbase/backup/TestBackupBase.java |   19 +-
 .../hadoop/hbase/backup/TestIncrementalBackup.java |   24 +-
 .../backup/TestIncrementalBackupWithFailures.java  |    7 +-
 .../hadoop/hbase/backup/TestRemoteBackup.java      |   16 +-
 .../org/apache/hadoop/hbase/HColumnDescriptor.java |    2 +-
 .../org/apache/hadoop/hbase/HTableDescriptor.java  |   12 +-
 .../org/apache/hadoop/hbase/MetaTableAccessor.java |   68 +-
 .../apache/hadoop/hbase/RSGroupTableAccessor.java  |   86 --
 .../java/org/apache/hadoop/hbase/client/Admin.java |  127 +++
 .../hadoop/hbase/client/AdminOverAsyncAdmin.java   |   75 ++
 .../org/apache/hadoop/hbase/client/AsyncAdmin.java |  126 ++-
 .../hadoop/hbase/client/AsyncHBaseAdmin.java       |   75 ++
 .../org/apache/hadoop/hbase/client/AsyncTable.java |   55 +
 .../apache/hadoop/hbase/client/AsyncTableImpl.java |   31 +
 .../hadoop/hbase/client/RawAsyncHBaseAdmin.java    |  282 ++++-
 .../hadoop/hbase/client/RawAsyncTableImpl.java     |   87 +-
 .../apache/hadoop/hbase/client/SlowLogParams.java  |   89 ++
 .../hadoop/hbase/client/SlowLogQueryFilter.java    |  122 ++
 .../apache/hadoop/hbase/client/SlowLogRecord.java  |  319 ++++++
 .../java/org/apache/hadoop/hbase/client/Table.java |   48 +
 .../hadoop/hbase/client/TableDescriptor.java       |    8 +
 .../hbase/client/TableDescriptorBuilder.java       |   34 +
 .../hadoop/hbase/client/TableOverAsyncTable.java   |  105 +-
 .../apache/hadoop/hbase/protobuf/ProtobufUtil.java |   29 +-
 .../hadoop/hbase/shaded/protobuf/ProtobufUtil.java |  140 +++
 .../hbase/shaded/protobuf/RequestConverter.java    |  169 ++-
 .../apache/hadoop/hbase/TestHColumnDescriptor.java |   12 +-
 .../apache/hadoop/hbase/TestHTableDescriptor.java  |   64 +-
 .../hbase/client/TestTableDescriptorBuilder.java   |   12 +-
 .../java/org/apache/hadoop/hbase/HConstants.java   |    6 +
 .../apache/hadoop/hbase/rsgroup/RSGroupInfo.java   |   46 +-
 hbase-common/src/main/resources/hbase-default.xml  |   23 +
 .../coprocessor/TestBatchCoprocessorEndpoint.java  |   13 +-
 .../hbase/coprocessor/TestCoprocessorEndpoint.java |   13 +-
 .../coprocessor/TestCoprocessorTableEndpoint.java  |   48 +-
 hbase-it/pom.xml                                   |   10 -
 .../IntegrationTestIngestStripeCompactions.java    |   17 +-
 .../hadoop/hbase/IntegrationTestLazyCfLoading.java |   14 +-
 .../hadoop/hbase/IntegrationTestMobCompaction.java |   26 +-
 .../StripeCompactionsPerformanceEvaluation.java    |   39 +-
 .../hbase/rsgroup/IntegrationTestRSGroup.java      |   30 +-
 .../hbase/test/IntegrationTestBigLinkedList.java   |   44 +-
 ...IntegrationTestBigLinkedListWithVisibility.java |   16 +-
 .../hbase/test/IntegrationTestLoadAndVerify.java   |   46 +-
 ...grationTestWithCellVisibilityLoadAndVerify.java |   34 +-
 .../apache/hadoop/hbase/mapreduce/ImportTsv.java   |   14 +-
 .../apache/hadoop/hbase/PerformanceEvaluation.java |   24 +-
 .../hadoop/hbase/TestPerformanceEvaluation.java    |    8 +-
 .../mapreduce/TestTableInputFormatScanBase.java    |    7 -
 .../hbase/mapreduce/TestTimeRangeMapRed.java       |   20 +-
 .../hadoop/hbase/snapshot/TestExportSnapshot.java  |   61 +-
 .../snapshot/TestExportSnapshotV1NoCluster.java    |   44 +-
 .../snapshot/TestExportSnapshotV2NoCluster.java    |   15 +-
 .../TestExportSnapshotWithTemporaryDirectory.java  |   23 +-
 .../store/wal/TestWALProcedureStore.java           |    3 +
 .../src/main/protobuf/Admin.proto                  |   27 +
 .../src/main/protobuf/Client.proto                 |    9 +-
 .../src/main/protobuf/Master.proto                 |   34 +
 .../src/main/protobuf/RSGroup.proto                |   23 +-
 .../src/main/protobuf/RSGroupAdmin.proto           |   19 +-
 .../src/main/protobuf/TooSlowLog.proto             |   45 +
 hbase-protocol/src/main/protobuf/Client.proto      |    9 +-
 .../src/main/protobuf/RSGroupAdmin.proto           |    0
 .../hadoop/hbase/rest/client/RemoteAdmin.java      |    4 +-
 .../hadoop/hbase/rest/client/RemoteHTable.java     |    7 +
 .../hadoop/hbase/rest/model/TableSchemaModel.java  |   11 +-
 .../hadoop/hbase/rest/PerformanceEvaluation.java   |   24 +-
 .../hadoop/hbase/rest/TestScannersWithFilters.java |   15 +-
 .../hadoop/hbase/rest/client/TestRemoteTable.java  |   22 +-
 hbase-rsgroup/README.txt                           |   13 -
 hbase-rsgroup/pom.xml                              |  284 -----
 .../apache/hadoop/hbase/rsgroup/RSGroupAdmin.java  |  101 --
 .../hadoop/hbase/rsgroup/RSGroupAdminEndpoint.java |  561 ----------
 .../hadoop/hbase/rsgroup/RSGroupAdminServer.java   |  647 -----------
 .../hbase/rsgroup/RSGroupInfoManagerImpl.java      |  801 -------------
 .../hadoop/hbase/rsgroup/RSGroupProtobufUtil.java  |   63 --
 .../hbase/rsgroup/VerifyingRSGroupAdminClient.java |  159 ---
 hbase-rsgroup/src/test/resources/log4j.properties  |   68 --
 hbase-server/pom.xml                               |    7 -
 .../hbase/tmpl/master/MasterStatusTmpl.jamon       |    4 +-
 .../hadoop/hbase/tmpl/master/RSGroupListTmpl.jamon |    6 +-
 .../hadoop/hbase/constraint/Constraints.java       |  100 +-
 .../hadoop/hbase/coprocessor/MasterObserver.java   |   34 +
 .../hadoop/hbase/coprocessor/RegionObserver.java   |  137 ++-
 .../hbase/favored/FavoredNodeLoadBalancer.java     |    1 +
 .../hadoop/hbase/favored/FavoredNodesPromoter.java |    2 +
 .../org/apache/hadoop/hbase/ipc/RpcServer.java     |   77 +-
 .../hadoop/hbase/ipc/RpcServerInterface.java       |   17 +-
 .../org/apache/hadoop/hbase/master/HMaster.java    |   76 +-
 .../apache/hadoop/hbase/master/LoadBalancer.java   |   58 +-
 .../hadoop/hbase/master/MasterCoprocessorHost.java |   48 +-
 .../hadoop/hbase/master/MasterFileSystem.java      |    3 +-
 .../hadoop/hbase/master/MasterRpcServices.java     |  341 +++++-
 .../apache/hadoop/hbase/master/MasterServices.java |   17 +-
 .../hbase/master/assignment/AssignmentManager.java |   21 +-
 .../master/balancer/FavoredStochasticBalancer.java |    1 +
 .../hbase/master/balancer/LoadBalancerFactory.java |   18 +-
 .../AbstractStateMachineNamespaceProcedure.java    |   11 +
 .../master/procedure/CreateNamespaceProcedure.java |    1 +
 .../master/procedure/CreateTableProcedure.java     |   27 +-
 .../master/procedure/MasterProcedureUtil.java      |   50 +-
 .../master/procedure/ModifyNamespaceProcedure.java |   19 +-
 .../master/procedure/ModifyTableProcedure.java     |   13 +-
 .../org/apache/hadoop/hbase/quotas/QuotaUtil.java  |   18 +-
 .../apache/hadoop/hbase/regionserver/HRegion.java  |   81 +-
 .../hbase/regionserver/HRegionFileSystem.java      |   12 +-
 .../hadoop/hbase/regionserver/HRegionServer.java   |   38 +-
 .../hadoop/hbase/regionserver/RSRpcServices.java   |  152 ++-
 .../apache/hadoop/hbase/regionserver/Region.java   |   53 +
 .../hbase/regionserver/RegionCoprocessorHost.java  |  151 ++-
 .../slowlog/DisruptorExceptionHandler.java         |   34 +-
 .../regionserver/slowlog/RingBufferEnvelope.java   |   41 +-
 .../hbase/regionserver/slowlog/RpcLogDetails.java  |   71 ++
 .../regionserver/slowlog/SlowLogEventHandler.java  |  208 ++++
 .../regionserver/slowlog/SlowLogRecorder.java      |  153 +++
 .../replication/regionserver/Replication.java      |    1 +
 .../hbase/rsgroup/DisabledRSGroupInfoManager.java  |  118 ++
 .../hadoop/hbase/rsgroup/RSGroupAdminClient.java   |  145 ++-
 .../hadoop/hbase/rsgroup/RSGroupAdminEndpoint.java |   61 +
 .../hbase/rsgroup/RSGroupAdminServiceImpl.java     |  398 +++++++
 .../hbase/rsgroup/RSGroupBasedLoadBalancer.java    |  140 +--
 .../hadoop/hbase/rsgroup/RSGroupInfoManager.java   |   76 +-
 .../hbase/rsgroup/RSGroupInfoManagerImpl.java      | 1184 ++++++++++++++++++++
 .../hbase/rsgroup/RSGroupMajorCompactionTTL.java   |   53 +-
 .../apache/hadoop/hbase/rsgroup/RSGroupUtil.java   |  127 +++
 .../hbase/security/access/AccessController.java    |   95 ++
 .../hbase/security/token/AuthenticationKey.java    |   14 +-
 .../token/AuthenticationTokenSecretManager.java    |   18 +-
 .../hbase/security/token/ZKSecretWatcher.java      |    1 +
 .../security/visibility/VisibilityController.java  |   22 +-
 .../hadoop/hbase/util/FSTableDescriptors.java      |   83 +-
 .../resources/hbase-webapps/master/rsgroup.jsp     |   10 +-
 .../org/apache/hadoop/hbase/HBaseTestCase.java     |   36 +-
 .../apache/hadoop/hbase/HBaseTestingUtility.java   |  219 ++--
 .../hbase/TestFSTableDescriptorForceCreation.java  |    6 +-
 .../TestHColumnDescriptorDefaultVersions.java      |   47 +-
 .../org/apache/hadoop/hbase/TestMultiVersions.java |   49 +-
 .../org/apache/hadoop/hbase/TestNamespace.java     |    3 +-
 .../apache/hadoop/hbase/TestRegionRebalancing.java |   15 +-
 .../org/apache/hadoop/hbase/TestSerialization.java |   14 +-
 .../org/apache/hadoop/hbase/TestZooKeeper.java     |    1 +
 .../hadoop/hbase/client/DummyAsyncTable.java       |    6 +
 .../org/apache/hadoop/hbase/client/TestAdmin2.java |  101 +-
 .../apache/hadoop/hbase/client/TestAdminBase.java  |    1 +
 .../hbase/client/TestAlwaysSetScannerId.java       |    4 +-
 .../apache/hadoop/hbase/client/TestAsyncTable.java |  203 ++++
 .../hadoop/hbase/client/TestCheckAndMutate.java    |  190 ++++
 .../hadoop/hbase/client/TestEnableTable.java       |   21 +-
 .../hadoop/hbase/client/TestFromClientSide.java    |   32 +-
 .../hadoop/hbase/client/TestFromClientSide3.java   |   42 +-
 .../hbase/client/TestIllegalTableDescriptor.java   |  157 +--
 .../hbase/client/TestIntraRowPagination.java       |   14 +-
 .../hbase/client/TestMalformedCellFromClient.java  |    5 +-
 .../hbase/client/TestReplicaWithCluster.java       |   54 +-
 .../hadoop/hbase/client/TestReplicasClient.java    |    6 +-
 .../hbase/client/TestScannersFromClientSide.java   |    4 +-
 .../hbase/client/TestSeparateClientZKCluster.java  |    5 +
 .../hadoop/hbase/client/TestSizeFailures.java      |   15 +-
 .../client/TestSnapshotCloneIndependence.java      |    6 +-
 .../hadoop/hbase/client/TestSnapshotMetadata.java  |   40 +-
 .../hbase/client/TestSplitOrMergeStatus.java       |   15 +-
 .../hadoop/hbase/constraint/TestConstraint.java    |   77 +-
 .../hbase/coprocessor/SimpleRegionObserver.java    |   97 +-
 .../coprocessor/TestCoprocessorInterface.java      |   16 +-
 .../hbase/coprocessor/TestCoprocessorMetrics.java  |   72 +-
 .../coprocessor/TestCoreMasterCoprocessor.java     |    1 -
 .../TestMasterCoprocessorExceptionWithAbort.java   |   16 +-
 .../TestMasterCoprocessorExceptionWithRemove.java  |   23 +-
 .../hbase/coprocessor/TestMasterObserver.java      |   44 +-
 .../coprocessor/TestOpenTableInCoprocessor.java    |   26 +-
 ...ObserverForAddingMutationsFromCoprocessors.java |   15 +-
 .../coprocessor/TestRegionObserverInterface.java   |   66 +-
 .../TestRegionObserverScannerOpenHook.java         |   49 +-
 .../coprocessor/TestRegionObserverStacking.java    |   20 +-
 .../hadoop/hbase/filter/FilterTestingCluster.java  |   15 +-
 .../hbase/filter/TestDependentColumnFilter.java    |   29 +-
 .../org/apache/hadoop/hbase/filter/TestFilter.java |   59 +-
 .../hbase/filter/TestFilterFromRegionSide.java     |   21 +-
 .../hadoop/hbase/filter/TestFilterWrapper.java     |   17 +-
 .../hbase/filter/TestInvocationRecordFilter.java   |   14 +-
 .../io/encoding/TestSeekBeforeWithReverseScan.java |   17 +-
 .../hfile/TestScannerSelectionUsingKeyRange.java   |   18 +-
 .../io/hfile/bucket/TestBucketCacheRefCnt.java     |    5 +-
 .../apache/hadoop/hbase/ipc/AbstractTestIPC.java   |    6 +-
 .../hbase/master/MockNoopMasterServices.java       |   14 +-
 .../hadoop/hbase/master/MockRegionServer.java      |   16 +
 .../hadoop/hbase/master/TestClusterRestart.java    |    6 +-
 .../org/apache/hadoop/hbase/master/TestMaster.java |   11 +-
 .../hbase/master/TestMasterMetricsWrapper.java     |   16 +-
 .../hadoop/hbase/master/TestRegionPlacement.java   |   16 +-
 .../hadoop/hbase/master/TestRegionPlacement2.java  |    6 +-
 .../TestMasterAbortWhileMergingTable.java          |    5 +-
 .../hbase/master/balancer/BalancerTestBase.java    |    8 +-
 .../balancer/RSGroupableBalancerTestBase.java      |   84 +-
 .../balancer/TestFavoredNodeTableImport.java       |   18 +-
 .../TestFavoredStochasticBalancerPickers.java      |    2 +-
 .../TestFavoredStochasticLoadBalancer.java         |  100 +-
 .../balancer/TestRSGroupBasedLoadBalancer.java     |   42 +-
 ...lancerWithStochasticLoadBalancerAsInternal.java |    4 +-
 .../master/cleaner/TestSnapshotFromMaster.java     |   12 +-
 .../procedure/TestRestoreSnapshotProcedure.java    |   36 +-
 .../hadoop/hbase/master/procedure/TestSCPBase.java |   13 +-
 .../procedure/TestSCPWithReplicasWithRSGroup.java  |    0
 .../master/procedure/TestSplitWALProcedure.java    |    6 +-
 .../TestTableDescriptorModificationFromClient.java |   97 +-
 .../hadoop/hbase/mob/MobStressToolRunner.java      |   39 +-
 .../hbase/mob/TestDefaultMobStoreFlusher.java      |   63 +-
 .../hadoop/hbase/mob/TestMobCompactionBase.java    |   38 +-
 .../hadoop/hbase/mob/TestMobCompactionOptMode.java |   10 +-
 .../mob/TestMobCompactionOptRegionBatchMode.java   |   18 +-
 .../hbase/mob/TestMobCompactionRegularMode.java    |   18 +-
 .../TestMobCompactionRegularRegionBatchMode.java   |   18 +-
 .../hadoop/hbase/mob/TestMobDataBlockEncoding.java |   31 +-
 .../apache/hadoop/hbase/mob/TestMobFileCache.java  |    6 +-
 .../hadoop/hbase/mob/TestMobFileCleanerChore.java  |   33 +-
 .../hadoop/hbase/mob/TestMobStoreCompaction.java   |   41 +-
 .../hadoop/hbase/mob/TestMobStoreScanner.java      |   28 +-
 .../hbase/namespace/TestNamespaceAuditor.java      |   23 +-
 .../hadoop/hbase/regionserver/RegionAsTable.java   |    6 +
 .../hbase/regionserver/TestAtomicOperation.java    |   19 +-
 .../hbase/regionserver/TestBlocksScanned.java      |   24 +-
 .../hadoop/hbase/regionserver/TestBulkLoad.java    |   15 +-
 .../hbase/regionserver/TestColumnSeeking.java      |   19 +-
 .../hbase/regionserver/TestCompactSplitThread.java |   13 +-
 .../hbase/regionserver/TestCompactingMemStore.java |   20 +-
 .../hadoop/hbase/regionserver/TestCompaction.java  |   44 +-
 .../hbase/regionserver/TestCompactionPolicy.java   |   25 +-
 .../hbase/regionserver/TestDefaultMemStore.java    |    1 +
 .../hbase/regionserver/TestDeleteMobTable.java     |   80 +-
 .../hbase/regionserver/TestFSErrorsExposed.java    |   12 +-
 .../regionserver/TestGetClosestAtOrBefore.java     |    1 +
 .../hadoop/hbase/regionserver/TestHRegion.java     |  261 ++++-
 .../hadoop/hbase/regionserver/TestHRegionInfo.java |    1 +
 .../hbase/regionserver/TestHRegionOnCluster.java   |   12 +-
 .../hbase/regionserver/TestJoinedScanners.java     |   14 +-
 .../hbase/regionserver/TestMajorCompaction.java    |    7 +-
 .../hbase/regionserver/TestMinorCompaction.java    |    6 +-
 .../hbase/regionserver/TestMutateRowsRecovery.java |   12 +-
 .../TestNewVersionBehaviorFromClientSide.java      |   18 +-
 .../hadoop/hbase/regionserver/TestParallelPut.java |   14 +-
 .../regionserver/TestPerColumnFamilyFlush.java     |   50 +-
 .../hbase/regionserver/TestRegionInfoBuilder.java  |    1 +
 .../hadoop/hbase/regionserver/TestRegionOpen.java  |   22 +-
 .../regionserver/TestRegionReplicaFailover.java    |    9 +-
 .../regionserver/TestRegionServerAbortTimeout.java |   11 +-
 .../hbase/regionserver/TestResettingCounters.java  |   16 +-
 .../hbase/regionserver/TestReversibleScanners.java |   13 +-
 .../hadoop/hbase/regionserver/TestRowTooBig.java   |   43 +-
 .../regionserver/TestSCVFWithMiniCluster.java      |   18 +-
 .../hadoop/hbase/regionserver/TestScanner.java     |   23 +-
 .../regionserver/TestScannerWithCorruptHFile.java  |   14 +-
 .../TestSplitTransactionOnCluster.java             |    5 +-
 .../regionserver/TestStoreScannerClosure.java      |   13 +-
 .../apache/hadoop/hbase/regionserver/TestTags.java |   56 +-
 .../TestWalAndCompactingMemStoreFlush.java         |   19 +-
 .../hadoop/hbase/regionserver/TestWideScanner.java |   11 +-
 .../compactions/TestCompactedHFilesDischarger.java |   13 +-
 .../compactions/TestDateTieredCompactor.java       |   10 +-
 .../compactions/TestStripeCompactionPolicy.java    |    7 +-
 .../compactions/TestStripeCompactor.java           |   10 +-
 .../regionserver/slowlog/TestSlowLogRecorder.java  |  594 ++++++++++
 .../regionserver/wal/AbstractTestWALReplay.java    |   75 +-
 .../regionserver/wal/TestLogRollingNoCluster.java  |    5 +-
 .../replication/TestMultiSlaveReplication.java     |   20 +-
 ...TestNamespaceReplicationWithBulkLoadedData.java |    7 +-
 .../replication/TestPerTableCFReplication.java     |   92 +-
 .../hbase/replication/TestReplicationBase.java     |    3 +
 .../hbase/replication/TestReplicationStatus.java   |   69 +-
 .../hbase/replication/TestReplicationWithTags.java |   20 +-
 .../TestGlobalReplicationThrottler.java            |   18 +-
 .../TestRegionReplicaReplicationEndpoint.java      |   23 +-
 ...stRegionReplicaReplicationEndpointNoMaster.java |    5 +-
 .../hbase/rsgroup/EnableRSGroupsTestBase.java      |   28 +-
 .../hadoop/hbase/rsgroup/TestEnableRSGroups.java   |   29 +-
 .../rsgroup/TestEnableRSGroupsCompatibility.java   |   37 +-
 .../hbase/rsgroup/TestMigrateRSGroupInfo.java      |  190 ++++
 .../rsgroup/TestRSGroupMajorCompactionTTL.java     |   16 +-
 .../hadoop/hbase/rsgroup/TestRSGroupsAdmin1.java   |  259 ++---
 .../hadoop/hbase/rsgroup/TestRSGroupsAdmin2.java   |  383 +++----
 .../hadoop/hbase/rsgroup/TestRSGroupsBalance.java  |   57 +-
 .../hadoop/hbase/rsgroup/TestRSGroupsBase.java     |  296 ++---
 .../hadoop/hbase/rsgroup/TestRSGroupsBasics.java   |  132 +--
 .../hbase/rsgroup/TestRSGroupsCPHookCalled.java    |   91 ++
 .../hadoop/hbase/rsgroup/TestRSGroupsKillRS.java   |  104 +-
 .../hbase/rsgroup/TestRSGroupsOfflineMode.java     |   48 +-
 .../hadoop/hbase/rsgroup/TestRSGroupsWithACL.java  |  113 +-
 .../hbase/rsgroup/VerifyingRSGroupAdmin.java       |  897 +++++++++++++++
 .../security/access/TestAccessController.java      |   99 +-
 .../security/access/TestAccessController2.java     |   28 +-
 .../security/access/TestAccessController3.java     |   28 +-
 .../access/TestCellACLWithMultipleVersions.java    |   27 +-
 .../hadoop/hbase/security/access/TestCellACLs.java |   18 +-
 .../TestCoprocessorWhitelistMasterObserver.java    |   36 +-
 .../security/access/TestNamespaceCommands.java     |   13 +-
 .../security/access/TestScanEarlyTermination.java  |   27 +-
 .../security/access/TestTablePermissions.java      |    1 +
 .../access/TestWithDisabledAuthorization.java      |   38 +-
 .../hbase/security/token/SecureTestCluster.java    |   35 +-
 .../token/TestDelegationTokenWithEncryption.java   |   10 +-
 .../hbase/security/token/TestZKSecretWatcher.java  |   23 +-
 ...tVisibilityLabelReplicationWithExpAsString.java |   18 +-
 .../security/visibility/TestVisibilityLabels.java  |   48 +-
 .../TestVisibilityLabelsReplication.java           |   18 +-
 .../visibility/TestVisibilityWithCheckAuths.java   |   16 +-
 .../hadoop/hbase/util/TestFSTableDescriptors.java  |   46 +-
 .../hbase/util/TestMiniClusterLoadSequential.java  |   21 +-
 hbase-shell/pom.xml                                |   35 -
 hbase-shell/src/main/ruby/hbase/admin.rb           |   83 ++
 hbase-shell/src/main/ruby/hbase/rsgroup_admin.rb   |   32 +-
 hbase-shell/src/main/ruby/shell.rb                 |    2 +
 .../ruby/shell/commands/clear_slowlog_responses.rb |   47 +
 .../src/main/ruby/shell/commands/get_rsgroup.rb    |    3 +-
 .../ruby/shell/commands/get_slowlog_responses.rb   |   78 ++
 .../src/main/ruby/shell/commands/list_rsgroups.rb  |    4 +-
 .../hadoop/hbase/client/TestRSGroupShell.java      |   11 +-
 hbase-shell/src/test/ruby/hbase/admin_test.rb      |   14 +
 .../src/test/ruby/shell/rsgroup_shell_test.rb      |   30 +-
 .../hbase/thrift/ThriftHBaseServiceHandler.java    |   12 +-
 .../hadoop/hbase/thrift/ThriftUtilities.java       |   12 +-
 .../hadoop/hbase/thrift2/client/ThriftAdmin.java   |   77 ++
 .../hadoop/hbase/thrift2/client/ThriftTable.java   |    6 +
 .../hbase/thrift2/TestThrift2ServerCmdLine.java    |    2 +-
 .../thrift2/TestThriftHBaseServiceHandler.java     |   48 +-
 .../TestThriftHBaseServiceHandlerWithLabels.java   |   21 +-
 .../TestThriftHBaseServiceHandlerWithReadOnly.java |   22 +-
 .../hbase/zookeeper/MiniZooKeeperCluster.java      |  127 ++-
 pom.xml                                            |   63 +-
 src/main/asciidoc/_chapters/hbase-default.adoc     |   38 +
 src/main/asciidoc/_chapters/ops_mgt.adoc           |  266 ++++-
 src/main/asciidoc/_chapters/upgrading.adoc         |    4 +
 src/site/xdoc/downloads.xml                        |   14 +-
 344 files changed, 13377 insertions(+), 6928 deletions(-)
 create mode 100644 dev-support/design-docs/HBASE-22514-Reimplement-rsgroup-feature-and-move-it-into-core-of-HBase.pdf
 create mode 100755 dev-support/jenkins_precommit_github_yetus.sh
 rename hbase-rsgroup/src/test/resources/hbase-site.xml => hbase-annotations/src/test/java/org/apache/hadoop/hbase/testclassification/RSGroupTests.java (70%)
 delete mode 100644 hbase-client/src/main/java/org/apache/hadoop/hbase/RSGroupTableAccessor.java
 create mode 100644 hbase-client/src/main/java/org/apache/hadoop/hbase/client/SlowLogParams.java
 create mode 100644 hbase-client/src/main/java/org/apache/hadoop/hbase/client/SlowLogQueryFilter.java
 create mode 100644 hbase-client/src/main/java/org/apache/hadoop/hbase/client/SlowLogRecord.java
 copy hbase-rsgroup/src/main/java/org/apache/hadoop/hbase/rsgroup/RSGroupableBalancer.java => hbase-protocol-shaded/src/main/protobuf/RSGroup.proto (62%)
 copy {hbase-rsgroup => hbase-protocol-shaded}/src/main/protobuf/RSGroupAdmin.proto (89%)
 create mode 100644 hbase-protocol-shaded/src/main/protobuf/TooSlowLog.proto
 rename {hbase-rsgroup => hbase-protocol}/src/main/protobuf/RSGroupAdmin.proto (100%)
 delete mode 100644 hbase-rsgroup/README.txt
 delete mode 100644 hbase-rsgroup/pom.xml
 delete mode 100644 hbase-rsgroup/src/main/java/org/apache/hadoop/hbase/rsgroup/RSGroupAdmin.java
 delete mode 100644 hbase-rsgroup/src/main/java/org/apache/hadoop/hbase/rsgroup/RSGroupAdminEndpoint.java
 delete mode 100644 hbase-rsgroup/src/main/java/org/apache/hadoop/hbase/rsgroup/RSGroupAdminServer.java
 delete mode 100644 hbase-rsgroup/src/main/java/org/apache/hadoop/hbase/rsgroup/RSGroupInfoManagerImpl.java
 delete mode 100644 hbase-rsgroup/src/main/java/org/apache/hadoop/hbase/rsgroup/RSGroupProtobufUtil.java
 delete mode 100644 hbase-rsgroup/src/test/java/org/apache/hadoop/hbase/rsgroup/VerifyingRSGroupAdminClient.java
 delete mode 100644 hbase-rsgroup/src/test/resources/log4j.properties
 copy hbase-rsgroup/src/main/java/org/apache/hadoop/hbase/rsgroup/RSGroupableBalancer.java => hbase-server/src/main/java/org/apache/hadoop/hbase/regionserver/slowlog/DisruptorExceptionHandler.java (52%)
 copy hbase-rsgroup/src/main/java/org/apache/hadoop/hbase/rsgroup/RSGroupableBalancer.java => hbase-server/src/main/java/org/apache/hadoop/hbase/regionserver/slowlog/RingBufferEnvelope.java (50%)
 create mode 100644 hbase-server/src/main/java/org/apache/hadoop/hbase/regionserver/slowlog/RpcLogDetails.java
 create mode 100644 hbase-server/src/main/java/org/apache/hadoop/hbase/regionserver/slowlog/SlowLogEventHandler.java
 create mode 100644 hbase-server/src/main/java/org/apache/hadoop/hbase/regionserver/slowlog/SlowLogRecorder.java
 create mode 100644 hbase-server/src/main/java/org/apache/hadoop/hbase/rsgroup/DisabledRSGroupInfoManager.java
 rename {hbase-rsgroup => hbase-server}/src/main/java/org/apache/hadoop/hbase/rsgroup/RSGroupAdminClient.java (66%)
 create mode 100644 hbase-server/src/main/java/org/apache/hadoop/hbase/rsgroup/RSGroupAdminEndpoint.java
 create mode 100644 hbase-server/src/main/java/org/apache/hadoop/hbase/rsgroup/RSGroupAdminServiceImpl.java
 rename {hbase-rsgroup => hbase-server}/src/main/java/org/apache/hadoop/hbase/rsgroup/RSGroupBasedLoadBalancer.java (80%)
 rename {hbase-rsgroup => hbase-server}/src/main/java/org/apache/hadoop/hbase/rsgroup/RSGroupInfoManager.java (55%)
 create mode 100644 hbase-server/src/main/java/org/apache/hadoop/hbase/rsgroup/RSGroupInfoManagerImpl.java
 rename {hbase-rsgroup => hbase-server}/src/main/java/org/apache/hadoop/hbase/rsgroup/RSGroupMajorCompactionTTL.java (76%)
 create mode 100644 hbase-server/src/main/java/org/apache/hadoop/hbase/rsgroup/RSGroupUtil.java
 rename {hbase-rsgroup => hbase-server}/src/test/java/org/apache/hadoop/hbase/master/balancer/RSGroupableBalancerTestBase.java (85%)
 rename {hbase-rsgroup => hbase-server}/src/test/java/org/apache/hadoop/hbase/master/balancer/TestRSGroupBasedLoadBalancer.java (86%)
 rename {hbase-rsgroup => hbase-server}/src/test/java/org/apache/hadoop/hbase/master/balancer/TestRSGroupBasedLoadBalancerWithStochasticLoadBalancerAsInternal.java (98%)
 copy {hbase-rsgroup => hbase-server}/src/test/java/org/apache/hadoop/hbase/master/procedure/TestSCPWithReplicasWithRSGroup.java (100%)
 create mode 100644 hbase-server/src/test/java/org/apache/hadoop/hbase/regionserver/slowlog/TestSlowLogRecorder.java
 rename hbase-rsgroup/src/test/java/org/apache/hadoop/hbase/rsgroup/TestEnableRSGroups.java => hbase-server/src/test/java/org/apache/hadoop/hbase/rsgroup/EnableRSGroupsTestBase.java (73%)
 rename hbase-rsgroup/src/main/java/org/apache/hadoop/hbase/rsgroup/RSGroupableBalancer.java => hbase-server/src/test/java/org/apache/hadoop/hbase/rsgroup/TestEnableRSGroups.java (55%)
 rename hbase-rsgroup/src/test/java/org/apache/hadoop/hbase/master/procedure/TestSCPWithReplicasWithRSGroup.java => hbase-server/src/test/java/org/apache/hadoop/hbase/rsgroup/TestEnableRSGroupsCompatibility.java (60%)
 create mode 100644 hbase-server/src/test/java/org/apache/hadoop/hbase/rsgroup/TestMigrateRSGroupInfo.java
 rename {hbase-rsgroup => hbase-server}/src/test/java/org/apache/hadoop/hbase/rsgroup/TestRSGroupMajorCompactionTTL.java (91%)
 rename {hbase-rsgroup => hbase-server}/src/test/java/org/apache/hadoop/hbase/rsgroup/TestRSGroupsAdmin1.java (58%)
 rename {hbase-rsgroup => hbase-server}/src/test/java/org/apache/hadoop/hbase/rsgroup/TestRSGroupsAdmin2.java (60%)
 rename {hbase-rsgroup => hbase-server}/src/test/java/org/apache/hadoop/hbase/rsgroup/TestRSGroupsBalance.java (75%)
 rename {hbase-rsgroup => hbase-server}/src/test/java/org/apache/hadoop/hbase/rsgroup/TestRSGroupsBase.java (57%)
 rename {hbase-rsgroup => hbase-server}/src/test/java/org/apache/hadoop/hbase/rsgroup/TestRSGroupsBasics.java (64%)
 create mode 100644 hbase-server/src/test/java/org/apache/hadoop/hbase/rsgroup/TestRSGroupsCPHookCalled.java
 rename {hbase-rsgroup => hbase-server}/src/test/java/org/apache/hadoop/hbase/rsgroup/TestRSGroupsKillRS.java (73%)
 rename {hbase-rsgroup => hbase-server}/src/test/java/org/apache/hadoop/hbase/rsgroup/TestRSGroupsOfflineMode.java (80%)
 rename {hbase-rsgroup => hbase-server}/src/test/java/org/apache/hadoop/hbase/rsgroup/TestRSGroupsWithACL.java (76%)
 create mode 100644 hbase-server/src/test/java/org/apache/hadoop/hbase/rsgroup/VerifyingRSGroupAdmin.java
 create mode 100644 hbase-shell/src/main/ruby/shell/commands/clear_slowlog_responses.rb
 create mode 100644 hbase-shell/src/main/ruby/shell/commands/get_slowlog_responses.rb


[hbase] 02/02: HBASE-23876 Add JDK11 compilation and unit test support to nightly job

Posted by nd...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ndimiduk pushed a commit to branch HBASE-23876/jdk11-nightly-master
in repository https://gitbox.apache.org/repos/asf/hbase.git

commit bd362a4caf3942ba4e64a9116418de065e594a8e
Author: Nick Dimiduk <nd...@apache.org>
AuthorDate: Thu Feb 20 16:14:23 2020 -0800

    HBASE-23876 Add JDK11 compilation and unit test support to nightly job
    
    Builds on the Dockerfile changes provided by HBASE-23767.
---
 dev-support/Jenkinsfile            | 147 +++++++++++++++++++++++++++++--------
 dev-support/hbase_nightly_yetus.sh |   4 +
 2 files changed, 122 insertions(+), 29 deletions(-)

diff --git a/dev-support/Jenkinsfile b/dev-support/Jenkinsfile
index 2eb0947..5271d35 100644
--- a/dev-support/Jenkinsfile
+++ b/dev-support/Jenkinsfile
@@ -35,8 +35,9 @@ pipeline {
     // where we'll write everything from different steps. Need a copy here so the final step can check for success/failure.
     OUTPUT_DIR_RELATIVE_GENERAL = 'output-general'
     OUTPUT_DIR_RELATIVE_JDK7 = 'output-jdk7'
-    OUTPUT_DIR_RELATIVE_HADOOP2 = 'output-jdk8-hadoop2'
-    OUTPUT_DIR_RELATIVE_HADOOP3 = 'output-jdk8-hadoop3'
+    OUTPUT_DIR_RELATIVE_JDK8_HADOOP2 = 'output-jdk8-hadoop2'
+    OUTPUT_DIR_RELATIVE_JDK8_HADOOP3 = 'output-jdk8-hadoop3'
+    OUTPUT_DIR_RELATIVE_JDK11_HADOOP3 = 'output-jdk11-hadoop3'
 
     PROJECT = 'hbase'
     PROJECT_PERSONALITY = 'https://raw.githubusercontent.com/apache/hbase/master/dev-support/hbase-personality.sh'
@@ -49,6 +50,8 @@ pipeline {
     // These tests currently have known failures. Once they burn down to 0, remove from here so that new problems will cause a failure.
     TESTS_FILTER = 'cc,checkstyle,javac,javadoc,pylint,shellcheck,whitespace,perlcritic,ruby-lint,rubocop,mvnsite'
     EXCLUDE_TESTS_URL = "${JENKINS_URL}/job/HBase-Find-Flaky-Tests/job/${BRANCH_NAME}/lastSuccessfulBuild/artifact/excludes"
+    SHALLOW_CHECKS = 'all,-unit,-findbugs' // run by the 'yetus general check'
+    DEEP_CHECKS = 'maven,mvninstall,compile,javac,unit,htmlout' // run by 'yetus jdkX HadoopY checks'
   }
   parameters {
     booleanParam(name: 'USE_YETUS_PRERELEASE', defaultValue: false, description: '''Check to use the current HEAD of apache/yetus rather than our configured release.
@@ -108,7 +111,7 @@ pipeline {
             dir ("tools") {
               sh """#!/usr/bin/env bash
                 set -e
-                echo "Downloading Project personality."
+                echo "Downloading Project personality from ${env.PROJECT_PERSONALITY}"
                 curl -L  -o personality.sh "${env.PROJECT_PERSONALITY}"
               """
             }
@@ -177,8 +180,9 @@ pipeline {
         // we skip some due to e.g. branch-specific JDK or Hadoop support
         stash name: 'general-result', allowEmpty: true, includes: "${OUTPUT_DIR_RELATIVE_GENERAL}/doesn't-match"
         stash name: 'jdk7-result', allowEmpty: true, includes: "${OUTPUT_DIR_RELATIVE_JDK7}/doesn't-match"
-        stash name: 'hadoop2-result', allowEmpty: true, includes: "${OUTPUT_DIR_RELATIVE_HADOOP2}/doesn't-match"
-        stash name: 'hadoop3-result', allowEmpty: true, includes: "${OUTPUT_DIR_RELATIVE_HADOOP3}/doesn't-match"
+        stash name: 'jdk8-hadoop2-result', allowEmpty: true, includes: "${OUTPUT_DIR_RELATIVE_JDK8_HADOOP2}/doesn't-match"
+        stash name: 'jdk8-hadoop3-result', allowEmpty: true, includes: "${OUTPUT_DIR_RELATIVE_JDK8_HADOOP3}/doesn't-match"
+        stash name: 'jdk11-hadoop3-result', allowEmpty: true, includes: "${OUTPUT_DIR_RELATIVE_JDK11_HADOOP3}/doesn't-match"
         stash name: 'srctarball-result', allowEmpty: true, includes: "output-srctarball/doesn't-match"
       }
     }
@@ -194,10 +198,9 @@ pipeline {
             BASEDIR = "${env.WORKSPACE}/component"
             // TODO does hadoopcheck need to be jdk specific?
             // Should be things that work with multijdk
-            TESTS = 'all,-unit,-findbugs'
-            // on branches that don't support jdk7, this will already be JAVA_HOME, so we'll end up not
-            // doing multijdk there.
-            MULTIJDK = '/usr/lib/jvm/java-8-openjdk-amd64'
+            TESTS = "${env.SHALLOW_CHECKS}"
+            SET_JAVA_HOME = '/usr/lib/jvm/java-8'
+            MULTIJDK = '/usr/lib/jvm/java-8,/usr/lib/jvm/java-11'
             OUTPUT_DIR_RELATIVE = "${env.OUTPUT_DIR_RELATIVE_GENERAL}"
             OUTPUT_DIR = "${env.WORKSPACE}/${env.OUTPUT_DIR_RELATIVE_GENERAL}"
           }
@@ -264,10 +267,10 @@ pipeline {
           }
           environment {
             BASEDIR = "${env.WORKSPACE}/component"
-            TESTS = 'maven,mvninstall,compile,javac,unit,htmlout'
+            TESTS = "${env.DEEP_CHECKS}"
             OUTPUT_DIR_RELATIVE = "${env.OUTPUT_DIR_RELATIVE_JDK7}"
             OUTPUT_DIR = "${env.WORKSPACE}/${env.OUTPUT_DIR_RELATIVE_JDK7}"
-            // On branches where we do jdk7 checks, jdk7 will be JAVA_HOME already.
+            SET_JAVA_HOME = "/usr/lib/jvm/java-7"
           }
           steps {
             // Must do prior to anything else, since if one of them timesout we'll stash the commentfile
@@ -342,12 +345,10 @@ pipeline {
           }
           environment {
             BASEDIR = "${env.WORKSPACE}/component"
-            TESTS = 'maven,mvninstall,compile,javac,unit,findbugs,htmlout'
-            OUTPUT_DIR_RELATIVE = "${env.OUTPUT_DIR_RELATIVE_HADOOP2}"
-            OUTPUT_DIR = "${env.WORKSPACE}/${env.OUTPUT_DIR_RELATIVE_HADOOP2}"
-            // This isn't strictly needed on branches that only support jdk8, but doesn't hurt
-            // and is needed on branches that do both jdk7 and jdk8
-            SET_JAVA_HOME = '/usr/lib/jvm/java-8-openjdk-amd64'
+            TESTS = "${env.DEEP_CHECKS}"
+            OUTPUT_DIR_RELATIVE = "${env.OUTPUT_DIR_RELATIVE_JDK8_HADOOP2}"
+            OUTPUT_DIR = "${env.WORKSPACE}/${env.OUTPUT_DIR_RELATIVE_JDK8_HADOOP2}"
+            SET_JAVA_HOME = '/usr/lib/jvm/java-8'
           }
           steps {
             // Must do prior to anything else, since if one of them timesout we'll stash the commentfile
@@ -383,7 +384,7 @@ pipeline {
           }
           post {
             always {
-              stash name: 'hadoop2-result', includes: "${OUTPUT_DIR_RELATIVE}/commentfile"
+              stash name: 'jdk8-hadoop2-result', includes: "${OUTPUT_DIR_RELATIVE}/commentfile"
               junit testResults: "${env.OUTPUT_DIR_RELATIVE}/**/target/**/TEST-*.xml", allowEmptyResults: true
               // zip surefire reports.
               sh '''#!/bin/bash -e
@@ -427,12 +428,10 @@ pipeline {
           }
           environment {
             BASEDIR = "${env.WORKSPACE}/component"
-            TESTS = 'maven,mvninstall,compile,javac,unit,htmlout'
-            OUTPUT_DIR_RELATIVE = "${env.OUTPUT_DIR_RELATIVE_HADOOP3}"
-            OUTPUT_DIR = "${env.WORKSPACE}/${env.OUTPUT_DIR_RELATIVE_HADOOP3}"
-            // This isn't strictly needed on branches that only support jdk8, but doesn't hurt
-            // and is needed on branches that do both jdk7 and jdk8
-            SET_JAVA_HOME = '/usr/lib/jvm/java-8-openjdk-amd64'
+            TESTS = "${env.DEEP_CHECKS}"
+            OUTPUT_DIR_RELATIVE = "${env.OUTPUT_DIR_RELATIVE_JDK8_HADOOP3}"
+            OUTPUT_DIR = "${env.WORKSPACE}/${env.OUTPUT_DIR_RELATIVE_JDK8_HADOOP3}"
+            SET_JAVA_HOME = '/usr/lib/jvm/java-8'
             // Activates hadoop 3.0 profile in maven runs.
             HADOOP_PROFILE = '3.0'
           }
@@ -470,7 +469,7 @@ pipeline {
           }
           post {
             always {
-              stash name: 'hadoop3-result', includes: "${OUTPUT_DIR_RELATIVE}/commentfile"
+              stash name: 'jdk8-hadoop3-result', includes: "${OUTPUT_DIR_RELATIVE}/commentfile"
               junit testResults: "${env.OUTPUT_DIR_RELATIVE}/**/target/**/TEST-*.xml", allowEmptyResults: true
               // zip surefire reports.
               sh '''#!/bin/bash -e
@@ -501,8 +500,96 @@ pipeline {
             }
           }
         }
+        stage ('yetus jdk11 hadoop3 checks') {
+          agent {
+            node {
+              label 'Hadoop'
+            }
+          }
+          when {
+            not {
+              branch 'branch-1*'
+            }
+          }
+          environment {
+            BASEDIR = "${env.WORKSPACE}/component"
+            TESTS = "${env.DEEP_CHECKS}"
+            OUTPUT_DIR_RELATIVE = "${env.OUTPUT_DIR_RELATIVE_JDK11_HADOOP3}"
+            OUTPUT_DIR = "${env.WORKSPACE}/${env.OUTPUT_DIR_RELATIVE_JDK11_HADOOP3}"
+            SET_JAVA_HOME = "/usr/lib/jvm/java-11"
+            // Activates hadoop 3.0 profile in maven runs.
+            HADOOP_PROFILE = '3.0'
+            // ErrorProne is broken on JDK11, see HBASE-23894
+            SKIP_ERROR_PRONE = 'true'
+          }
+          steps {
+            // Must do prior to anything else, since if one of them timesout we'll stash the commentfile
+            sh '''#!/usr/bin/env bash
+              set -e
+              rm -rf "${OUTPUT_DIR}" && mkdir "${OUTPUT_DIR}"
+              echo '(x) {color:red}-1 jdk11 hadoop3 checks{color}' >"${OUTPUT_DIR}/commentfile"
+              echo "-- Something went wrong running this stage, please [check relevant console output|${BUILD_URL}/console]." >> "${OUTPUT_DIR}/commentfile"
+'''
+            unstash 'yetus'
+            dir('component') {
+              checkout scm
+            }
+            sh '''#!/usr/bin/env bash
+              set -e
+              rm -rf "${OUTPUT_DIR}/machine" && mkdir "${OUTPUT_DIR}/machine"
+              "${BASEDIR}/dev-support/gather_machine_environment.sh" "${OUTPUT_DIR_RELATIVE}/machine"
+              echo "got the following saved stats in '${OUTPUT_DIR_RELATIVE}/machine'"
+              ls -lh "${OUTPUT_DIR_RELATIVE}/machine"
+'''
+            sh '''#!/usr/bin/env bash
+              set -e
+              declare -i status=0
+              if "${BASEDIR}/dev-support/hbase_nightly_yetus.sh" ; then
+                echo '(/) {color:green}+1 jdk11 hadoop3 checks{color}' > "${OUTPUT_DIR}/commentfile"
+              else
+                echo '(x) {color:red}-1 jdk11 hadoop3 checks{color}' > "${OUTPUT_DIR}/commentfile"
+                status=1
+              fi
+              echo "-- For more information [see jdk11 report|${BUILD_URL}/JDK11_Nightly_Build_Report/]" >> "${OUTPUT_DIR}/commentfile"
+              exit "${status}"
+            '''
+          }
+          post {
+            always {
+              stash name: 'jdk11-hadoop3-result', includes: "${OUTPUT_DIR_RELATIVE}/commentfile"
+              junit testResults: "${env.OUTPUT_DIR_RELATIVE}/**/target/**/TEST-*.xml", allowEmptyResults: true
+              // zip surefire reports.
+              sh '''#!/bin/bash -e
+                if [ -d "${OUTPUT_DIR}/archiver" ]; then
+                  count=$(find "${OUTPUT_DIR}/archiver" -type f | wc -l)
+                  if [[ 0 -ne ${count} ]]; then
+                    echo "zipping ${count} archived files"
+                    zip -q -m -r "${OUTPUT_DIR}/test_logs.zip" "${OUTPUT_DIR}/archiver"
+                  else
+                    echo "No archived files, skipping compressing."
+                  fi
+                else
+                  echo "No archiver directory, skipping compressing."
+                fi
+'''
+              // Has to be relative to WORKSPACE.
+              archiveArtifacts artifacts: "${env.OUTPUT_DIR_RELATIVE}/*"
+              archiveArtifacts artifacts: "${env.OUTPUT_DIR_RELATIVE}/**/*"
+              publishHTML target: [
+                allowMissing         : true,
+                keepAll              : true,
+                alwaysLinkToLastBuild: true,
+                // Has to be relative to WORKSPACE.
+                reportDir            : "${env.OUTPUT_DIR_RELATIVE}",
+                reportFiles          : 'console-report.html',
+                reportName           : 'JDK11 Nightly Build Report'
+              ]
+            }
+          }
+        }
         // This is meant to mimic what a release manager will do to create RCs.
         // See http://hbase.apache.org/book.html#maven.release
+        // TODO (HBASE-23870): replace this with invocation of the release tool
         stage ('packaging and integration') {
           tools {
             maven 'Maven (latest)'
@@ -636,14 +723,16 @@ pipeline {
          try {
            unstash 'general-result'
            unstash 'jdk7-result'
-           unstash 'hadoop2-result'
-           unstash 'hadoop3-result'
+           unstash 'jdk8-hadoop2-result'
+           unstash 'jdk8-hadoop3-result'
+           unstash 'jdk11-hadoop3-result'
            unstash 'srctarball-result'
            sh "printenv"
            def results = ["${env.OUTPUT_DIR_RELATIVE_GENERAL}/commentfile",
                           "${env.OUTPUT_DIR_RELATIVE_JDK7}/commentfile",
-                          "${env.OUTPUT_DIR_RELATIVE_HADOOP2}/commentfile",
-                          "${env.OUTPUT_DIR_RELATIVE_HADOOP3}/commentfile",
+                          "${env.OUTPUT_DIR_RELATIVE_JDK8_HADOOP2}/commentfile",
+                          "${env.OUTPUT_DIR_RELATIVE_JDK8_HADOOP3}/commentfile",
+                          "${env.OUTPUT_DIR_RELATIVE_JDK11_HADOOP3}/commentfile",
                           'output-srctarball/commentfile',
                           'output-integration/commentfile']
            echo env.BRANCH_NAME
diff --git a/dev-support/hbase_nightly_yetus.sh b/dev-support/hbase_nightly_yetus.sh
index 9159057..81a5b3f 100755
--- a/dev-support/hbase_nightly_yetus.sh
+++ b/dev-support/hbase_nightly_yetus.sh
@@ -85,6 +85,10 @@ if [[ -n "${HADOOP_PROFILE}" ]]; then
   YETUS_ARGS=("--hadoop-profile=${HADOOP_PROFILE}" "${YETUS_ARGS[@]}")
 fi
 
+if [[ -n "${SKIP_ERROR_PRONE}" ]]; then
+  YETUS_ARGS=("--skip-errorprone" "${YETUS_ARGS[@]}")
+fi
+
 if [[ true == "${DEBUG}" ]]; then
   YETUS_ARGS=("--debug" "${YETUS_ARGS[@]}")
 fi
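
With the Jenkinsfile changes above, the new jdk11 stage drives this
script entirely through environment variables. A minimal sketch of an
equivalent manual run, assuming a checkout of this branch and that the
script's remaining inputs (output directories, test list, and so on)
are set up the way the stage's environment block does it:

    # illustrative only; values mirror the 'yetus jdk11 hadoop3 checks' stage
    export HADOOP_PROFILE=3.0      # translated to --hadoop-profile=3.0
    export SKIP_ERROR_PRONE=true   # translated to --skip-errorprone (HBASE-23894)
    export DEBUG=false             # keep Yetus debug output off
    ./dev-support/hbase_nightly_yetus.sh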


[hbase] 01/02: HBASE-23767 Add JDK11 compilation and unit test support to Github precommit

Posted by nd...@apache.org.
This is an automated email from the ASF dual-hosted git repository.

ndimiduk pushed a commit to branch HBASE-23876/jdk11-nightly-master
in repository https://gitbox.apache.org/repos/asf/hbase.git

commit 31f8086aa7d8b98d3db500bb9b3c8fc0df61a488
Author: Nick Dimiduk <nd...@apache.org>
AuthorDate: Wed Feb 19 16:10:57 2020 -0800

    HBASE-23767 Add JDK11 compilation and unit test support to Github precommit
    
    Rebuild our Dockerfile with support for multiple JDK versions. Use
    multiple stages in the Jenkinsfile instead of yetus's multijdk because
    of YETUS-953. Run those multiple stages in parallel to speed up
    results.
    
    Note that multiple stages mean multiple Yetus invocations, which in
    turn mean multiple comments on the PreCommit. This should become
    clearer to users once we can make use of the GitHub Checks API,
    HBASE-23902.
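
The switch away from multijdk boils down to running test-patch once
per JDK instead of once with --multijdkdirs. A stripped-down,
illustrative sketch of the difference; the real invocations live in
dev-support/jenkins_precommit_github_yetus.sh and carry many more
arguments, TESTPATCHBIN and JDK_SPECIFIC_PLUGINS are the names used in
the scripts below, and the JAVA_HOME assignments stand in for however
the driver plumbs the per-stage SET_JAVA_HOME value through:

    # before: one Yetus run, second JDK handled via multijdk (blocked by YETUS-953)
    "${TESTPATCHBIN}" --plugins=all \
      --multijdkdirs=/usr/lib/jvm/java-8-openjdk-amd64 \
      --multijdktests=compile

    # after: one single-JDK Yetus run per parallel Jenkins stage
    JAVA_HOME=/usr/lib/jvm/java-8  "${TESTPATCHBIN}" --plugins="${JDK_SPECIFIC_PLUGINS}"
    JAVA_HOME=/usr/lib/jvm/java-11 "${TESTPATCHBIN}" --plugins="${JDK_SPECIFIC_PLUGINS}"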
---
 .editorconfig                                 |   1 +
 dev-support/Jenkinsfile_GitHub                | 463 +++++++++++++++++---------
 dev-support/docker/Dockerfile                 | 194 ++++++++---
 dev-support/hbase-personality.sh              |   3 -
 dev-support/jenkins_precommit_github_yetus.sh | 144 ++++++++
 pom.xml                                       |   1 +
 6 files changed, 610 insertions(+), 196 deletions(-)

diff --git a/.editorconfig b/.editorconfig
index aa6adaa..8c65c96 100644
--- a/.editorconfig
+++ b/.editorconfig
@@ -460,6 +460,7 @@ ij_javascript_while_on_new_line = false
 ij_javascript_wrap_comments = false
 
 [{*.gradle,*.groovy,*.gant,*.gdsl,*.gy,*.gson,Jenkinsfile*}]
+indent_size = 4
 ij_groovy_align_group_field_declarations = false
 ij_groovy_align_multiline_array_initializer_expression = false
 ij_groovy_align_multiline_assignment = false
diff --git a/dev-support/Jenkinsfile_GitHub b/dev-support/Jenkinsfile_GitHub
index df1c8e9..fdd6c60 100644
--- a/dev-support/Jenkinsfile_GitHub
+++ b/dev-support/Jenkinsfile_GitHub
@@ -27,17 +27,31 @@ pipeline {
         buildDiscarder(logRotator(numToKeepStr: '15'))
         timeout (time: 7, unit: 'HOURS')
         timestamps()
-        checkoutToSubdirectory('src')
+        skipDefaultCheckout()
     }
 
     environment {
-        SOURCEDIR = 'src'
-        // will also need to change notification section below
-        PATCHDIR = 'out'
-        DOCKERFILE = "${SOURCEDIR}/dev-support/docker/Dockerfile"
-        YETUS='yetus'
+        SRC_REL = 'src'
+        PATCH_REL = 'output'
+        YETUS_REL = 'yetus'
+        DOCKERFILE_REL = "${SRC_REL}/dev-support/docker/Dockerfile"
+        YETUS_DRIVER_REL = "${SRC_REL}/dev-support/jenkins_precommit_github_yetus.sh"
         // Branch or tag name.  Yetus release tags are 'rel/X.Y.Z'
-        YETUS_VERSION='rel/0.11.1'
+        YETUS_VERSION = 'rel/0.11.1'
+        GENERAL_CHECK_PLUGINS = 'all,-compile,-javac,-javadoc,-jira,-shadedjars,-unit'
+        JDK_SPECIFIC_PLUGINS = 'compile,github,htmlout,javac,javadoc,maven,mvninstall,shadedjars,unit'
+        // output from surefire; sadly the archive function in yetus only works on file names.
+        ARCHIVE_PATTERN_LIST = 'TEST-*.xml,org.apache.h*.txt,*.dumpstream,*.dump'
+        // These tests currently have known failures. Once they burn down to 0, remove from here so that new problems will cause a failure.
+        TESTS_FILTER = 'cc,checkstyle,javac,javadoc,pylint,shellcheck,whitespace,perlcritic,ruby-lint,rubocop,mvnsite'
+        EXCLUDE_TESTS_URL = "${JENKINS_URL}/job/HBase-Find-Flaky-Tests/job/${CHANGE_TARGET}/lastSuccessfulBuild/artifact/excludes"
+
+        // a global view of paths. parallel stages can land on the same host concurrently, so each
+        // stage works in its own subdirectory. there is an "output" under each of these
+        // directories, which we retrieve after the build is complete.
+        WORKDIR_REL_GENERAL_CHECK = 'yetus-general-check'
+        WORKDIR_REL_JDK8_HADOOP2_CHECK = 'yetus-jdk8-hadoop2-check'
+        WORKDIR_REL_JDK11_HADOOP3_CHECK = 'yetus-jdk11-hadoop3-check'
     }
 
     parameters {
@@ -47,163 +61,304 @@ pipeline {
     }
 
     stages {
-        stage ('install yetus') {
-            steps {
-                dir("${WORKSPACE}/${YETUS}") {
-                    checkout([
-                        $class: 'GitSCM',
-                        branches: [[name: "${env.YETUS_VERSION}"]],
-                        userRemoteConfigs: [[ url: 'https://github.com/apache/yetus.git']]]
-                    )
-                }
-            }
-        }
-
-        stage ('precommit-run') {
-            steps {
-                withCredentials(
-                    [usernamePassword(credentialsId: 'apache-hbase-at-github.com',
+        stage ('precommit checks') {
+            parallel {
+                stage ('yetus general check') {
+                    agent {
+                        node {
+                            label 'Hadoop'
+                        }
+                    }
+                    environment {
+                        // customized per parallel stage
+                        PLUGINS = "${GENERAL_CHECK_PLUGINS}"
+                        SET_JAVA_HOME = '/usr/lib/jvm/java-8'
+                        WORKDIR_REL = "${WORKDIR_REL_GENERAL_CHECK}"
+                        // identical for all parallel stages
+                        WORKDIR = "${WORKSPACE}/${WORKDIR_REL}"
+                        YETUSDIR = "${WORKDIR}/${YETUS_REL}"
+                        SOURCEDIR = "${WORKDIR}/${SRC_REL}"
+                        PATCHDIR = "${WORKDIR}/${PATCH_REL}"
+                        BUILD_URL_ARTIFACTS = "artifact/${WORKDIR_REL}/${PATCH_REL}"
+                        DOCKERFILE = "${WORKDIR}/${DOCKERFILE_REL}"
+                        YETUS_DRIVER = "${WORKDIR}/${YETUS_DRIVER_REL}"
+                    }
+                    steps {
+                        dir("${SOURCEDIR}") {
+                            checkout scm
+                        }
+                        dir("${YETUSDIR}") {
+                            checkout([
+                              $class           : 'GitSCM',
+                              branches         : [[name: "${YETUS_VERSION}"]],
+                              userRemoteConfigs: [[url: 'https://github.com/apache/yetus.git']]]
+                            )
+                        }
+                        dir("${WORKDIR}") {
+                            withCredentials([
+                                usernamePassword(
+                                  credentialsId: 'apache-hbase-at-github.com',
                                   passwordVariable: 'GITHUB_PASSWORD',
-                                  usernameVariable: 'GITHUB_USER'),
-                    usernamePassword(credentialsId: 'hbaseqa-at-asf-jira',
-                                        passwordVariable: 'JIRA_PASSWORD',
-                                        usernameVariable: 'JIRA_USER')]) {
-                        sh '''#!/usr/bin/env bash
-                        set -e
-                        TESTPATCHBIN="${WORKSPACE}/${YETUS}/precommit/src/main/shell/test-patch.sh"
-                        # this must be clean for every run
-                        if [[ -d "${WORKSPACE}/${PATCHDIR}" ]]; then
-                          rm -rf "${WORKSPACE}/${PATCHDIR}"
-                        fi
-                        mkdir -p "${WORKSPACE}/${PATCHDIR}"
-
-                        ## Checking on H* machine nonsense
-                        echo "JAVA_HOME: ${JAVA_HOME}"
-                        ls -l "${JAVA_HOME}" || true
-                        echo "MAVEN_HOME: ${MAVEN_HOME}"
-                        echo "maven version:"
-                        mvn --offline --version  || true
-                        mkdir "${PATCHDIR}/machine"
-                       "${SOURCEDIR}/dev-support/gather_machine_environment.sh" "${PATCHDIR}/machine"
-                        ## /H*
-
-                        # If CHANGE_URL is set (e.g., Github Branch Source plugin), process it.
-                        # Otherwise exit, because we don't want HBase to do a
-                        # full build.  We wouldn't normally do this check for smaller
-                        # projects. :)
-                        if [[ -z "${CHANGE_URL}" ]]; then
-                            echo "Full build skipped" > "${WORKSPACE}/${PATCHDIR}/report.html"
-                            exit 0
-                        fi
-                        # enable debug output for yetus
-                        if [[ "true" = "${DEBUG}" ]]; then
-                            YETUS_ARGS+=("--debug")
-                        fi
-                        # If we're doing docker, make sure we don't accidentally pollute the image with a host java path
-                        if [ -n "${JAVA_HOME}" ]; then
-                          unset JAVA_HOME
-                        fi
-                        YETUS_ARGS+=("--patch-dir=${WORKSPACE}/${PATCHDIR}")
-                        # where the source is located
-                        YETUS_ARGS+=("--basedir=${WORKSPACE}/${SOURCEDIR}")
-                        # our project defaults come from a personality file
-                        # which will get loaded automatically by setting the project name
-                        YETUS_ARGS+=("--project=hbase")
-                        # lots of different output formats
-                        YETUS_ARGS+=("--brief-report-file=${WORKSPACE}/${PATCHDIR}/brief.txt")
-                        YETUS_ARGS+=("--console-report-file=${WORKSPACE}/${PATCHDIR}/console.txt")
-                        YETUS_ARGS+=("--html-report-file=${WORKSPACE}/${PATCHDIR}/report.html")
-                        # enable writing back to Github
-                        YETUS_ARGS+=(--github-password="${GITHUB_PASSWORD}")
-                        YETUS_ARGS+=(--github-user=${GITHUB_USER})
-                        # enable writing back to ASF JIRA
-                        YETUS_ARGS+=(--jira-password="${JIRA_PASSWORD}")
-                        YETUS_ARGS+=(--jira-user="${JIRA_USER}")
-                        # auto-kill any surefire stragglers during unit test runs
-                        YETUS_ARGS+=("--reapermode=kill")
-                        YETUS_ARGS+=("--multijdktests=compile")
-                        # set relatively high limits for ASF machines
-                        # changing these to higher values may cause problems
-                        # with other jobs on systemd-enabled machines
-                        YETUS_ARGS+=("--proclimit=10000")
-                        YETUS_ARGS+=("--dockermemlimit=20g")
-                        # -1 findbugs issues that show up prior to the patch being applied
-                        YETUS_ARGS+=("--findbugs-strict-precheck")
-                        # rsync these files back into the archive dir
-                        YETUS_ARGS+=("--archive-list=rat.txt")
-                        # URL for user-side presentation in reports and such to our artifacts
-                        # (needs to match the archive bits below)
-                        YETUS_ARGS+=("--build-url-artifacts=artifact/out")
-                        # plugins to enable
-                        YETUS_ARGS+=("--plugins=all")
-                        # don't let these tests cause -1s because we aren't really paying that
-                        # much attention to them
-                        YETUS_ARGS+=("--tests-filter=ruby-lint,test4tests")
-                        # run in docker mode and specifically point to our
-                        # Dockerfile since we don't want to use the auto-pulled version.
-                        YETUS_ARGS+=("--docker")
-                        YETUS_ARGS+=("--dockerfile=${DOCKERFILE}")
-                        YETUS_ARGS+=("--mvn-custom-repos")
-                        YETUS_ARGS+=("--multijdkdirs=/usr/lib/jvm/java-8-openjdk-amd64")
-                        YETUS_ARGS+=("--findbugs-home=/usr")
-                        YETUS_ARGS+=("--whitespace-eol-ignore-list=.*/generated/.*")
-                        YETUS_ARGS+=("--whitespace-tabs-ignore-list=.*/generated/.*")
-                        YETUS_ARGS+=("--personality=${SOURCEDIR}/dev-support/hbase-personality.sh")
-                        YETUS_ARGS+=("--quick-hadoopcheck")
-                        YETUS_ARGS+=("--skip-errorprone")
-                        # effectively treat dev-support as a custom maven module
-                        YETUS_ARGS+=("--skip-dirs=dev-support")
-                        # help keep the ASF boxes clean
-                        YETUS_ARGS+=("--sentinel")
-                        # use emoji vote so it is easier to find the broken line
-                        YETUS_ARGS+=("--github-use-emoji-vote")
-                        "${TESTPATCHBIN}" "${YETUS_ARGS[@]}"
-                        '''
+                                  usernameVariable: 'GITHUB_USER'
+                                )]) {
+                                sh label: 'test-patch', script: '''
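+                                    # log host, working directory, and environment details for debugging this run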
+                                    hostname -a ; pwd ; ls -la
+                                    printenv 2>&1 | sort
+                                    echo "[INFO] Launching Yetus via ${YETUS_DRIVER}"
+                                    "${YETUS_DRIVER}"
+                                '''
+                            }
+                        }
+                    }
+                    post {
+                        always {
+                            // Has to be relative to WORKSPACE.
+                            archiveArtifacts artifacts: "${WORKDIR_REL}/${PATCH_REL}/*", excludes: "${WORKDIR_REL}/${PATCH_REL}/precommit"
+                            archiveArtifacts artifacts: "${WORKDIR_REL}/${PATCH_REL}/**/*", excludes: "${WORKDIR_REL}/${PATCH_REL}/precommit/**/*"
+                            publishHTML target: [
+                              allowMissing: true,
+                              keepAll: true,
+                              alwaysLinkToLastBuild: true,
+                              // Has to be relative to WORKSPACE
+                              reportDir: "${WORKDIR_REL}/${PATCH_REL}",
+                              reportFiles: 'report.html',
+                              reportName: 'PR General Check Report'
+                            ]
+                        }
+                        // Jenkins pipeline jobs fill slaves on PRs without this :(
+                        cleanup() {
+                            script {
+                                sh label: 'Cleanup workspace', script: '''
+                                    # See YETUS-764
+                                    if [ -f "${PATCHDIR}/pidfile.txt" ]; then
+                                      echo "test-patch process appears to still be running: killing"
+                                      kill `cat "${PATCHDIR}/pidfile.txt"` || true
+                                      sleep 10
+                                    fi
+                                    if [ -f "${PATCHDIR}/cidfile.txt" ]; then
+                                      echo "test-patch container appears to still be running: killing"
+                                      docker kill `cat "${PATCHDIR}/cidfile.txt"` || true
+                                    fi
+                                    # See HADOOP-13951
+                                    chmod -R u+rxw "${WORKSPACE}"
+                                '''
+                                dir ("${WORKDIR}") {
+                                    deleteDir()
+                                }
+                            }
+                        }
+                    }
+                }
+                stage ('yetus jdk8 hadoop2 checks') {
+                    agent {
+                        node {
+                            label 'Hadoop'
+                        }
+                    }
+                    environment {
+                        // customized per parallel stage
+                        PLUGINS = "${JDK_SPECIFIC_PLUGINS}"
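+                        // java-8 is a stable symlink to the JDK provisioned in dev-support/docker/Dockerfile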
+                        SET_JAVA_HOME = '/usr/lib/jvm/java-8'
+                        WORKDIR_REL = "${WORKDIR_REL_JDK8_HADOOP2_CHECK}"
+                        // identical for all parallel stages
+                        WORKDIR = "${WORKSPACE}/${WORKDIR_REL}"
+                        YETUSDIR = "${WORKDIR}/${YETUS_REL}"
+                        SOURCEDIR = "${WORKDIR}/${SRC_REL}"
+                        PATCHDIR = "${WORKDIR}/${PATCH_REL}"
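+                        // corresponds to the Jenkins artifact URL under which ${PATCHDIR} is archived in the post section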
+                        BUILD_URL_ARTIFACTS = "artifact/${WORKDIR_REL}/${PATCH_REL}"
+                        DOCKERFILE = "${WORKDIR}/${DOCKERFILE_REL}"
+                        YETUS_DRIVER = "${WORKDIR}/${YETUS_DRIVER_REL}"
+                    }
+                    steps {
+                        dir("${SOURCEDIR}") {
+                            checkout scm
+                        }
+                        dir("${YETUSDIR}") {
+                            checkout([
+                              $class           : 'GitSCM',
+                              branches         : [[name: "${YETUS_VERSION}"]],
+                              userRemoteConfigs: [[url: 'https://github.com/apache/yetus.git']]]
+                            )
+                        }
+                        dir("${WORKDIR}") {
+                            withCredentials([
+                              usernamePassword(
+                                credentialsId: 'apache-hbase-at-github.com',
+                                passwordVariable: 'GITHUB_PASSWORD',
+                                usernameVariable: 'GITHUB_USER'
+                              )]) {
+                                sh label: 'test-patch', script: '''
+                                    hostname -a ; pwd ; ls -la
+                                    printenv 2>&1 | sort
+                                    echo "[INFO] Launching Yetus via ${YETUS_DRIVER}"
+                                    "${YETUS_DRIVER}"
+                                '''
+                            }
+                        }
+                    }
+                    post {
+                        always {
+                            junit testResults: "${WORKDIR_REL}/${SRC_REL}/**/target/**/TEST-*.xml", allowEmptyResults: true
+                            sh label: 'zip surefire reports', script: '''
+                                if [ -d "${PATCHDIR}/archiver" ]; then
+                                  count=$(find "${PATCHDIR}/archiver" -type f | wc -l)
+                                  if [[ 0 -ne ${count} ]]; then
+                                    echo "zipping ${count} archived files"
+                                    zip -q -m -r "${PATCHDIR}/test_logs.zip" "${PATCHDIR}/archiver"
+                                  else
+                                    echo "No archived files, skipping compression."
+                                  fi
+                                else
+                                  echo "No archiver directory, skipping compression."
+                                fi
+                            '''
+                            // Has to be relative to WORKSPACE.
+                            archiveArtifacts artifacts: "${WORKDIR_REL}/${PATCH_REL}/*", excludes: "${WORKDIR_REL}/${PATCH_REL}/precommit"
+                            archiveArtifacts artifacts: "${WORKDIR_REL}/${PATCH_REL}/**/*", excludes: "${WORKDIR_REL}/${PATCH_REL}/precommit/**/*"
+                            publishHTML target: [
+                              allowMissing: true,
+                              keepAll: true,
+                              alwaysLinkToLastBuild: true,
+                              // Has to be relative to WORKSPACE
+                              reportDir: "${WORKDIR_REL}/${PATCH_REL}",
+                              reportFiles: 'report.html',
+                              reportName: 'PR JDK8 Hadoop2 Check Report'
+                            ]
+                        }
+                        // Jenkins pipeline jobs fill slaves on PRs without this :(
+                        cleanup() {
+                            script {
+                                sh label: 'Cleanup workspace', script: '''
+                                    # See YETUS-764
+                                    if [ -f "${PATCHDIR}/pidfile.txt" ]; then
+                                      echo "test-patch process appears to still be running: killing"
+                                      kill `cat "${PATCHDIR}/pidfile.txt"` || true
+                                      sleep 10
+                                    fi
+                                    if [ -f "${PATCHDIR}/cidfile.txt" ]; then
+                                      echo "test-patch container appears to still be running: killing"
+                                      docker kill `cat "${PATCHDIR}/cidfile.txt"` || true
+                                    fi
+                                    # See HADOOP-13951
+                                    chmod -R u+rxw "${WORKSPACE}"
+                                '''
+                                dir ("${WORKDIR}") {
+                                    deleteDir()
+                                }
+                            }
+                        }
+                    }
+                }
+                stage ('yetus jdk11 hadoop3 checks') {
+                    agent {
+                        node {
+                            label 'Hadoop'
+                        }
+                    }
+                    environment {
+                        // customized per parallel stage
+                        PLUGINS = "${JDK_SPECIFIC_PLUGINS}"
+                        SET_JAVA_HOME = '/usr/lib/jvm/java-11'
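+                        // passed through to test-patch as --hadoop-profile by the yetus driver script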
+                        HADOOP_PROFILE = '3.0'
+                        WORKDIR_REL = "${WORKDIR_REL_JDK11_HADOOP3_CHECK}"
+                        // identical for all parallel stages
+                        WORKDIR = "${WORKSPACE}/${WORKDIR_REL}"
+                        YETUSDIR = "${WORKDIR}/${YETUS_REL}"
+                        SOURCEDIR = "${WORKDIR}/${SRC_REL}"
+                        PATCHDIR = "${WORKDIR}/${PATCH_REL}"
+                        BUILD_URL_ARTIFACTS = "artifact/${WORKDIR_REL}/${PATCH_REL}"
+                        DOCKERFILE = "${WORKDIR}/${DOCKERFILE_REL}"
+                        YETUS_DRIVER = "${WORKDIR}/${YETUS_DRIVER_REL}"
+                    }
+                    steps {
+                        dir("${SOURCEDIR}") {
+                            checkout scm
+                        }
+                        dir("${YETUSDIR}") {
+                            checkout([
+                              $class           : 'GitSCM',
+                              branches         : [[name: "${YETUS_VERSION}"]],
+                              userRemoteConfigs: [[url: 'https://github.com/apache/yetus.git']]]
+                            )
+                        }
+                        dir("${WORKDIR}") {
+                            withCredentials([
+                              usernamePassword(
+                                credentialsId: 'apache-hbase-at-github.com',
+                                passwordVariable: 'GITHUB_PASSWORD',
+                                usernameVariable: 'GITHUB_USER'
+                              )]) {
+                                sh label: 'test-patch', script: '''
+                                    hostname -a ; pwd ; ls -la
+                                    printenv 2>&1 | sort
+                                    echo "[INFO] Launching Yetus via ${YETUS_DRIVER}"
+                                    "${YETUS_DRIVER}"
+                                '''
+                            }
+                        }
+                    }
+                    post {
+                        always {
+                            junit testResults: "${WORKDIR_REL}/${SRC_REL}/**/target/**/TEST-*.xml", allowEmptyResults: true
+                            sh label: 'zip surefire reports', script: '''
+                                if [ -d "${PATCHDIR}/archiver" ]; then
+                                  count=$(find "${PATCHDIR}/archiver" -type f | wc -l)
+                                  if [[ 0 -ne ${count} ]]; then
+                                    echo "zipping ${count} archived files"
+                                    zip -q -m -r "${PATCHDIR}/test_logs.zip" "${PATCHDIR}/archiver"
+                                  else
+                                    echo "No archived files, skipping compression."
+                                  fi
+                                else
+                                  echo "No archiver directory, skipping compression."
+                                fi
+                            '''
+                            // Has to be relative to WORKSPACE.
+                            archiveArtifacts artifacts: "${WORKDIR_REL}/${PATCH_REL}/*", excludes: "${WORKDIR_REL}/${PATCH_REL}/precommit"
+                            archiveArtifacts artifacts: "${WORKDIR_REL}/${PATCH_REL}/**/*", excludes: "${WORKDIR_REL}/${PATCH_REL}/precommit/**/*"
+                            publishHTML target: [
+                              allowMissing: true,
+                              keepAll: true,
+                              alwaysLinkToLastBuild: true,
+                              // Has to be relative to WORKSPACE
+                              reportDir: "${WORKDIR_REL}/${PATCH_REL}",
+                              reportFiles: 'report.html',
+                              reportName: 'PR JDK11 Hadoop3 Check Report'
+                            ]
+                        }
+                        // Jenkins pipeline jobs fill slaves on PRs without this :(
+                        cleanup() {
+                            script {
+                                sh label: 'Cleanup workspace', script: '''
+                                    # See YETUS-764
+                                    if [ -f "${PATCHDIR}/pidfile.txt" ]; then
+                                      echo "test-patch process appears to still be running: killing"
+                                      kill `cat "${PATCHDIR}/pidfile.txt"` || true
+                                      sleep 10
+                                    fi
+                                    if [ -f "${PATCHDIR}/cidfile.txt" ]; then
+                                      echo "test-patch container appears to still be running: killing"
+                                      docker kill `cat "${PATCHDIR}/cidfile.txt"` || true
+                                    fi
+                                    # See HADOOP-13951
+                                    chmod -R u+rxw "${WORKSPACE}"
+                                '''
+                                dir ("${WORKDIR}") {
+                                    deleteDir()
+                                }
+                            }
+                        }
+                    }
                 }
             }
         }
-
     }
 
     post {
-        always {
-          script {
-            // Yetus output
-            archiveArtifacts "${env.PATCHDIR}/**"
-            // Publish the HTML report so that it can be looked at
-            // Has to be relative to WORKSPACE.
-            publishHTML (target: [
-                          allowMissing: true,
-                          keepAll: true,
-                          alwaysLinkToLastBuild: true,
-                          // Has to be relative to WORKSPACE
-                          reportDir: "${env.PATCHDIR}",
-                          reportFiles: 'report.html',
-                          reportName: 'Yetus Report'
-            ])
-            // Publish JUnit results
-            try {
-                junit "${env.SOURCEDIR}/**/target/surefire-reports/*.xml"
-            } catch(e) {
-                echo 'junit processing: ' + e.toString()
-            }
-          }
-        }
-
         // Jenkins pipeline jobs fill slaves on PRs without this :(
         cleanup() {
             script {
-                sh '''
-                    # See YETUS-764
-                    if [ -f "${WORKSPACE}/${PATCHDIR}/pidfile.txt" ]; then
-                      echo "test-patch process appears to still be running: killing"
-                      kill `cat "${WORKSPACE}/${PATCHDIR}/pidfile.txt"` || true
-                      sleep 10
-                    fi
-                    if [ -f "${WORKSPACE}/${PATCHDIR}/cidfile.txt" ]; then
-                      echo "test-patch container appears to still be running: killing"
-                      docker kill `cat "${WORKSPACE}/${PATCHDIR}/cidfile.txt"` || true
-                    fi
+                sh label: 'Cleanup workspace', script: '''
                     # See HADOOP-13951
                     chmod -R u+rxw "${WORKSPACE}"
                     '''
diff --git a/dev-support/docker/Dockerfile b/dev-support/docker/Dockerfile
index 6a3f377..6b136bc 100644
--- a/dev-support/docker/Dockerfile
+++ b/dev-support/docker/Dockerfile
@@ -14,47 +14,163 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-# Dockerfile for installing the necessary dependencies for building Hadoop.
-# See BUILDING.txt.
-
-FROM maven:3.5-jdk-8
-
-RUN apt-get -q update && apt-get -q install --no-install-recommends -y \
-       git \
-       bats \
-       findbugs \
-       libperl-critic-perl \
-       pylint \
-       python-dateutil \
-       rsync \
-       make \
-       gcc \
-       libc6-dev \
-       ruby \
-       ruby-dev \
-       wget \
-       && \
-    gem install --no-document rake rubocop ruby-lint
-
-ENV FINDBUGS_HOME /usr
-
-####
-# Install shellcheck
-###
-RUN mkdir -p /opt/shellcheck && \
-    curl -L -s -S \
-        https://storage.googleapis.com/shellcheck/shellcheck-stable.linux.x86_64.tar.xz \
-        -o /opt/shellcheck.tar.xz && \
-    tar xJf /opt/shellcheck.tar.xz --strip-components 1 -C /opt/shellcheck && \
-    ln -s /opt/shellcheck/shellcheck /usr/bin/shellcheck && \
-    rm -f /opt/shellcheck.tar.xz
+#
+# Dockerfile used as the build and test environment, amenable to Yetus.
+#
+# Built in multiple stages so as to avoid re-downloading large binaries when
+# tweaking unrelated aspects of the image.
 
-###
-# Avoid out of memory errors in builds
-###
-ENV MAVEN_OPTS -Xmx3g
+# start with a minimal image into which we can download remote tarballs
+FROM ubuntu:18.04 AS BASE_IMAGE
+SHELL ["/bin/bash", "-o", "pipefail", "-c"]
+
+# hadolint ignore=DL3009
+RUN DEBIAN_FRONTEND=noninteractive apt-get -qq update && \
+  DEBIAN_FRONTEND=noninteractive apt-get -qq install --no-install-recommends -y \
+    ca-certificates=20180409 \
+    curl=7.58.0-2ubuntu3.8 \
+    locales=2.27-3ubuntu1
+
+RUN locale-gen en_US.UTF-8
+ENV LANG=en_US.UTF-8 LANGUAGE=en_US:en LC_ALL=en_US.UTF-8
+
+##
+# download sundry dependencies
+#
+
+FROM BASE_IMAGE AS FINDBUGS_DOWNLOAD_IMAGE
+# TODO: replace with Spotbugs HBASE-23077, HBASE-22383
+ENV FINDBUGS_VERSION '3.0.1'
+ENV FINDBUGS_URL "https://downloads.sourceforge.net/project/findbugs/findbugs/${FINDBUGS_VERSION}/findbugs-${FINDBUGS_VERSION}.tar.gz"
+ENV FINDBUGS_SHA256 'e80e0da0c213a27504ef3188ef25f107651700ffc66433eac6a7454bbe336419'
+SHELL ["/bin/bash", "-o", "pipefail", "-c"]
+RUN curl --location --fail --silent --show-error --output /tmp/findbugs.tar.gz "${FINDBUGS_URL}" && \
+  echo "${FINDBUGS_SHA256} */tmp/findbugs.tar.gz" | sha256sum -c -
+
+FROM BASE_IMAGE AS HADOLINT_DOWNLOAD_IMAGE
+ENV HADOLINT_VERSION '1.17.5'
+ENV HADOLINT_URL "https://github.com/hadolint/hadolint/releases/download/v${HADOLINT_VERSION}/hadolint-Linux-x86_64"
+ENV HADOLINT_SHA256 '20dd38bc0602040f19268adc14c3d1aae11af27b463af43f3122076baf827a35'
+SHELL ["/bin/bash", "-o", "pipefail", "-c"]
+RUN curl --location --fail --silent --show-error --output /tmp/hadolint "${HADOLINT_URL}" && \
+  echo "${HADOLINT_SHA256} */tmp/hadolint" | sha256sum -c -
+
+FROM BASE_IMAGE AS MAVEN_DOWNLOAD_IMAGE
+ENV MAVEN_VERSION='3.5.4'
+ENV MAVEN_URL "https://archive.apache.org/dist/maven/maven-3/${MAVEN_VERSION}/binaries/apache-maven-${MAVEN_VERSION}-bin.tar.gz"
+ENV MAVEN_SHA256 'ce50b1c91364cb77efe3776f756a6d92b76d9038b0a0782f7d53acf1e997a14d'
+SHELL ["/bin/bash", "-o", "pipefail", "-c"]
+RUN curl --location --fail --silent --show-error --output /tmp/maven.tar.gz "${MAVEN_URL}" && \
+  echo "${MAVEN_SHA256} */tmp/maven.tar.gz" | sha256sum -c -
+
+FROM BASE_IMAGE AS OPENJDK8_DOWNLOAD_IMAGE
+ENV OPENJDK8_URL 'https://github.com/AdoptOpenJDK/openjdk8-binaries/releases/download/jdk8u232-b09/OpenJDK8U-jdk_x64_linux_hotspot_8u232b09.tar.gz'
+ENV OPENJDK8_SHA256 '7b7884f2eb2ba2d47f4c0bf3bb1a2a95b73a3a7734bd47ebf9798483a7bcc423'
+SHELL ["/bin/bash", "-o", "pipefail", "-c"]
+RUN curl --location --fail --silent --show-error --output /tmp/adoptopenjdk8.tar.gz "${OPENJDK8_URL}" && \
+  echo "${OPENJDK8_SHA256} */tmp/adoptopenjdk8.tar.gz" | sha256sum -c -
+
+FROM BASE_IMAGE AS OPENJDK11_DOWNLOAD_IMAGE
+ENV OPENJDK11_URL 'https://github.com/AdoptOpenJDK/openjdk11-binaries/releases/download/jdk-11.0.6%2B10/OpenJDK11U-jdk_x64_linux_hotspot_11.0.6_10.tar.gz'
+ENV OPENJDK11_SHA256 '330d19a2eaa07ed02757d7a785a77bab49f5ee710ea03b4ee2fa220ddd0feffc'
+SHELL ["/bin/bash", "-o", "pipefail", "-c"]
+RUN curl --location --fail --silent --show-error --output /tmp/adoptopenjdk11.tar.gz "${OPENJDK11_URL}" && \
+  echo "${OPENJDK11_SHA256} */tmp/adoptopenjdk11.tar.gz" | sha256sum -c -
+
+##
+# build the final image
+#
+
+FROM BASE_IMAGE
+SHELL ["/bin/bash", "-o", "pipefail", "-c"]
+
+##
+# install dependencies from system packages.
+# be careful not to install any system packages (e.g., findbugs) that will
+# pull in the default-jre.
+#
+
+# bring the base image into conformance with what Yetus and our personality
+# file expect a build environment to look like.
+RUN DEBIAN_FRONTEND=noninteractive apt-get -qq install --no-install-recommends -y \
+  bash=4.4.18-2ubuntu1.2 \
+  build-essential=12.4ubuntu1 \
+  curl=7.58.0-2ubuntu3.8 \
+  diffutils=1:3.6-1 \
+  git=1:2.17.1-1ubuntu0.5 \
+  rsync=3.1.2-2.1ubuntu1 \
+  tar=1.29b-2ubuntu0.1 \
+  wget=1.19.4-1ubuntu2.2
+
+# install the dependencies required in order to enable the sundry precommit
+# checks/features provided by Yetus plugins.
+RUN DEBIAN_FRONTEND=noninteractive apt-get -qq install --no-install-recommends -y \
+  bats=0.4.0-1.1 \
+  libperl-critic-perl=1.130-1 \
+  python3=3.6.7-1~18.04 \
+  python3-pip=9.0.1-2.3~ubuntu1.18.04.1 \
+  python3-setuptools=39.0.1-2 \
+  ruby=1:2.5.1 \
+  ruby-dev=1:2.5.1 \
+  shellcheck=0.4.6-1 \
+  && \
+  apt-get clean && \
+  rm -rf /var/lib/apt/lists/*
+
+RUN python3 -mpip install --upgrade pip && \
+  python3 -mpip install pylint==2.4.4
+
+RUN gem install --no-document \
+  rake:13.0.1 \
+  rubocop:0.80.0 \
+  ruby-lint:2.3.1
+
+# hadolint ignore=DL3010
+COPY --from=FINDBUGS_DOWNLOAD_IMAGE /tmp/findbugs.tar.gz /tmp/findbugs.tar.gz
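+# the tarball's top-level directory is version-specific; symlink it to a stable
+# path (the maven and JDK installs below follow the same approach)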
+RUN tar xzf /tmp/findbugs.tar.gz -C /opt && \
+  ln -s "/opt/$(dirname "$(tar -tf /tmp/findbugs.tar.gz | head -n1)")" /opt/findbugs && \
+  rm /tmp/findbugs.tar.gz
+
+COPY --from=HADOLINT_DOWNLOAD_IMAGE /tmp/hadolint /tmp/hadolint
+RUN mv /tmp/hadolint /usr/local/bin && \
+  chmod a+x /usr/local/bin/hadolint
+
+# hadolint ignore=DL3010
+COPY --from=MAVEN_DOWNLOAD_IMAGE /tmp/maven.tar.gz /tmp/maven.tar.gz
+RUN tar xzf /tmp/maven.tar.gz -C /opt && \
+  ln -s "/opt/$(dirname "$(tar -tf /tmp/maven.tar.gz | head -n1)")" /opt/maven && \
+  rm /tmp/maven.tar.gz
+
+##
+# ensure JVMs are available under `/usr/lib/jvm` and prefix each installation
+# as `java-` so as to conform with Yetus's assumptions.
+#
+
+# hadolint ignore=DL3010
+COPY --from=OPENJDK8_DOWNLOAD_IMAGE /tmp/adoptopenjdk8.tar.gz /tmp/adoptopenjdk8.tar.gz
+RUN mkdir -p /usr/lib/jvm && \
+  tar xzf /tmp/adoptopenjdk8.tar.gz -C /usr/lib/jvm && \
+  ln -s "/usr/lib/jvm/$(basename "$(tar -tf /tmp/adoptopenjdk8.tar.gz | head -n1)")" /usr/lib/jvm/java-8-adoptopenjdk && \
+  ln -s /usr/lib/jvm/java-8-adoptopenjdk /usr/lib/jvm/java-8 && \
+  rm /tmp/adoptopenjdk8.tar.gz
+
+# hadolint ignore=DL3010
+COPY --from=OPENJDK11_DOWNLOAD_IMAGE /tmp/adoptopenjdk11.tar.gz /tmp/adoptopenjdk11.tar.gz
+RUN mkdir -p /usr/lib/jvm && \
+  tar xzf /tmp/adoptopenjdk11.tar.gz -C /usr/lib/jvm && \
+  ln -s "/usr/lib/jvm/$(basename "$(tar -tf /tmp/adoptopenjdk11.tar.gz | head -n1)")" /usr/lib/jvm/java-11-adoptopenjdk && \
+  ln -s /usr/lib/jvm/java-11-adoptopenjdk /usr/lib/jvm/java-11 && \
+  rm /tmp/adoptopenjdk11.tar.gz
+
+# configure default environment for Yetus. Yetus in dockermode seems to require
+# these values to be specified here; the various --foo-path flags do not
+# propagate as expected, while these are honored.
+# TODO (nd): is this really true? investigate and file a ticket.
+ENV FINDBUGS_HOME '/opt/findbugs'
+ENV MAVEN_HOME '/opt/maven'
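+# MAVEN_OPTS previously lived in dev-support/hbase-personality.sh; it is now set here.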
+ENV MAVEN_OPTS '-Xms6G -Xmx6G'
 
-CMD /bin/bash
+CMD ["/bin/bash"]
 
 ###
 # Everything past this point is either not needed for testing or breaks Yetus.
diff --git a/dev-support/hbase-personality.sh b/dev-support/hbase-personality.sh
index 100b847..0e1ae99 100755
--- a/dev-support/hbase-personality.sh
+++ b/dev-support/hbase-personality.sh
@@ -79,9 +79,6 @@ function personality_globals
 
   # TODO use PATCH_BRANCH to select jdk versions to use.
 
-  # Override the maven options
-  MAVEN_OPTS="${MAVEN_OPTS:-"-Xms6G -Xmx6G"}"
-
   # Yetus 0.7.0 enforces limits. Default proclimit is 1000.
   # Up it. See HBASE-19902 for how we arrived at this number.
   #shellcheck disable=SC2034
diff --git a/dev-support/jenkins_precommit_github_yetus.sh b/dev-support/jenkins_precommit_github_yetus.sh
new file mode 100755
index 0000000..f88df0a
--- /dev/null
+++ b/dev-support/jenkins_precommit_github_yetus.sh
@@ -0,0 +1,144 @@
+#!/usr/bin/env bash
+# Licensed to the Apache Software Foundation (ASF) under one
+# or more contributor license agreements.  See the NOTICE file
+# distributed with this work for additional information
+# regarding copyright ownership.  The ASF licenses this file
+# to you under the Apache License, Version 2.0 (the
+# "License"); you may not use this file except in compliance
+# with the License.  You may obtain a copy of the License at
+#
+#   http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing,
+# software distributed under the License is distributed on an
+# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+# KIND, either express or implied.  See the License for the
+# specific language governing permissions and limitations
+# under the License.
+
+set -e
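+# exit immediately if any command below fails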
+
+# place ourselves in the directory containing the hbase and yetus checkouts
+cd "$(dirname "$0")/../.."
+echo "executing from $(pwd)"
+
+if [[ "true" = "${DEBUG}" ]]; then
+  set -x
+  printenv 2>&1 | sort
+fi
+
+declare -i missing_env=0
+declare -a required_envs=(
+  # these ENV variables define the required API with Jenkinsfile_GitHub
+  "ARCHIVE_PATTERN_LIST"
+  "BUILD_URL_ARTIFACTS"
+  "DOCKERFILE"
+  "GITHUB_PASSWORD"
+  "GITHUB_USER"
+  "PATCHDIR"
+  "PLUGINS"
+  "SET_JAVA_HOME"
+  "SOURCEDIR"
+  "TESTS_FILTER"
+  "YETUSDIR"
+)
+# Validate params
+for required_env in "${required_envs[@]}"; do
+  if [ -z "${!required_env}" ]; then
+    echo "[ERROR] Required environment variable '${required_env}' is not set."
+    missing_env=$((missing_env + 1))
+  fi
+done
+
+if [ ${missing_env} -gt 0 ]; then
+  echo "[ERROR] Please set the required environment variables before invoking. If this error is " \
+       "on Jenkins, then please file a JIRA about the error."
+  exit 1
+fi
+
+# TODO (HBASE-23900): cannot assume test-patch runs directly from sources
+TESTPATCHBIN="${YETUSDIR}/precommit/src/main/shell/test-patch.sh"
+
+# this must be clean for every run
+rm -rf "${PATCHDIR}"
+mkdir -p "${PATCHDIR}"
+
+# Checking on H* machine nonsense
+mkdir "${PATCHDIR}/machine"
+"${SOURCEDIR}/dev-support/gather_machine_environment.sh" "${PATCHDIR}/machine"
+
+# If CHANGE_URL is set (e.g., Github Branch Source plugin), process it.
+# Otherwise exit, because we don't want HBase to do a
+# full build.  We wouldn't normally do this check for smaller
+# projects. :)
+if [[ -z "${CHANGE_URL}" ]]; then
+  echo "Full build skipped" > "${PATCHDIR}/report.html"
+  exit 0
+fi
+# enable debug output for yetus
+if [[ "true" = "${DEBUG}" ]]; then
+  YETUS_ARGS+=("--debug")
+fi
+# If we're doing docker, make sure we don't accidentally pollute the image with a host java path
+if [ -n "${JAVA_HOME}" ]; then
+  unset JAVA_HOME
+fi
+YETUS_ARGS+=("--ignore-unknown-options=true")
+YETUS_ARGS+=("--patch-dir=${PATCHDIR}")
+# where the source is located
+YETUS_ARGS+=("--basedir=${SOURCEDIR}")
+# our project defaults come from a personality file
+# which will get loaded automatically by setting the project name
+YETUS_ARGS+=("--project=hbase")
+# lots of different output formats
+YETUS_ARGS+=("--brief-report-file=${PATCHDIR}/brief.txt")
+YETUS_ARGS+=("--console-report-file=${PATCHDIR}/console.txt")
+YETUS_ARGS+=("--html-report-file=${PATCHDIR}/report.html")
+# enable writing back to Github
+YETUS_ARGS+=("--github-password=${GITHUB_PASSWORD}")
+YETUS_ARGS+=("--github-user=${GITHUB_USER}")
+# auto-kill any surefire stragglers during unit test runs
+YETUS_ARGS+=("--reapermode=kill")
+# set relatively high limits for ASF machines
+# changing these to higher values may cause problems
+# with other jobs on systemd-enabled machines
+YETUS_ARGS+=("--proclimit=10000")
+YETUS_ARGS+=("--dockermemlimit=20g")
+# -1 findbugs issues that show up prior to the patch being applied
+YETUS_ARGS+=("--findbugs-strict-precheck")
+# rsync these files back into the archive dir
+YETUS_ARGS+=("--archive-list=${ARCHIVE_PATTERN_LIST}")
+# URL for user-side presentation in reports and such to our artifacts
+YETUS_ARGS+=("--build-url-artifacts=${BUILD_URL_ARTIFACTS}")
+# plugins to enable
+YETUS_ARGS+=("--plugins=${PLUGINS}")
+# run in docker mode and specifically point to our
+# Dockerfile since we don't want to use the auto-pulled version.
+YETUS_ARGS+=("--docker")
+YETUS_ARGS+=("--dockerfile=${DOCKERFILE}")
+YETUS_ARGS+=("--mvn-custom-repos")
+YETUS_ARGS+=("--java-home=${SET_JAVA_HOME}")
+YETUS_ARGS+=("--whitespace-eol-ignore-list=.*/generated/.*")
+YETUS_ARGS+=("--whitespace-tabs-ignore-list=.*/generated/.*")
+YETUS_ARGS+=("--tests-filter=${TESTS_FILTER}")
+YETUS_ARGS+=("--personality=${SOURCEDIR}/dev-support/hbase-personality.sh")
+YETUS_ARGS+=("--quick-hadoopcheck")
+YETUS_ARGS+=("--skip-errorprone")
+# effectively treat dev-support as a custom maven module
+YETUS_ARGS+=("--skip-dirs=dev-support")
+# For testing with specific hadoop version. Activates corresponding profile in maven runs.
+if [[ -n "${HADOOP_PROFILE}" ]]; then
+  YETUS_ARGS+=("--hadoop-profile=${HADOOP_PROFILE}")
+fi
+if [[ -n "${EXCLUDE_TESTS_URL}" ]]; then
+  YETUS_ARGS+=("--exclude-tests-url=${EXCLUDE_TESTS_URL}")
+fi
+# help keep the ASF boxes clean
+YETUS_ARGS+=("--sentinel")
+# use emoji vote so it is easier to find the broken line
+YETUS_ARGS+=("--github-use-emoji-vote")
+
+echo "Launching yetus with command line:"
+echo "${TESTPATCHBIN} ${YETUS_ARGS[*]}"
+
+/usr/bin/env bash "${TESTPATCHBIN}" "${YETUS_ARGS[@]}"
diff --git a/pom.xml b/pom.xml
index fe07edf..0943da1 100755
--- a/pom.xml
+++ b/pom.xml
@@ -4158,4 +4158,5 @@
       <url>file:///tmp</url>
     </site>
   </distributionManagement>
+  <!-- have precommit rebuild the whole project -->
 </project>