Posted to builds@lucene.apache.org by Policeman Jenkins Server <je...@thetaphi.de> on 2020/04/29 15:44:16 UTC

[JENKINS] Lucene-Solr-8.x-Linux (64bit/jdk1.8.0_201) - Build # 2910 - Still Failing!

Build: https://jenkins.thetaphi.de/job/Lucene-Solr-8.x-Linux/2910/
Java: 64bit/jdk1.8.0_201 -XX:-UseCompressedOops -XX:+UseSerialGC

All tests passed

Build Log:
[...truncated 54528 lines...]
-ecj-javadoc-lint-src:
    [mkdir] Created dir: /tmp/ecj1392904238
 [ecj-lint] Compiling 931 source files to /tmp/ecj1392904238
 [ecj-lint] ----------
 [ecj-lint] 1. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/codecs/CodecUtil.java (at line 523)
 [ecj-lint] 	throw new CorruptIndexException("misplaced codec footer (file truncated?): length=" + in.length() + " but footerLength==" + footerLength(), input);
 [ecj-lint] 	^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 [ecj-lint] Resource leak: 'in' is not closed at this location
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 2. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingStoredFieldsReader.java (at line 166)
 [ecj-lint] 	FieldsIndexReader fieldsIndexReader = new FieldsIndexReader(d, si.name, segmentSuffix, INDEX_EXTENSION_PREFIX, INDEX_CODEC_NAME, si.getId());
 [ecj-lint] 	                  ^^^^^^^^^^^^^^^^^
 [ecj-lint] Resource leak: 'fieldsIndexReader' is never closed
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 3. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingTermVectorsReader.java (at line 148)
 [ecj-lint] 	FieldsIndexReader fieldsIndexReader = new FieldsIndexReader(d, si.name, segmentSuffix, VECTORS_INDEX_EXTENSION_PREFIX, VECTORS_INDEX_CODEC_NAME, si.getId());
 [ecj-lint] 	                  ^^^^^^^^^^^^^^^^^
 [ecj-lint] Resource leak: 'fieldsIndexReader' is never closed
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 4. ERROR in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/search/IndexSearcher.java (at line 50)
 [ecj-lint] 	import org.apache.lucene.util.automaton.ByteRunAutomaton;
 [ecj-lint] 	       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 [ecj-lint] The import org.apache.lucene.util.automaton.ByteRunAutomaton is never used
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 5. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/util/automaton/Operations.java (at line 742)
 [ecj-lint] 	Integer q = newstate.get(statesSet);
 [ecj-lint] 	                         ^^^^^^^^^
 [ecj-lint] Unlikely argument type SortedIntSet for get(Object) on a Map<SortedIntSet.FrozenIntSet,Integer>
 [ecj-lint] ----------
 [ecj-lint] 5 problems (1 error, 4 warnings)
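The recurring lint findings above are generic Java patterns rather than anything Lucene-specific. Below is a minimal stand-alone sketch (hypothetical class and method names, not Lucene code) of two of them: a stream that try-with-resources keeps closed on every exit path, which is the usual fix for ecj's "Resource leak" warnings, and a `Map.get(Object)` call whose argument type can never match the map's key type, the shape behind the "Unlikely argument type" warning — it compiles but always returns null.

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.UncheckedIOException;
import java.util.HashMap;
import java.util.Map;

public class Main {
    // Resource-leak fix: the try-with-resources block closes 'in' on every
    // exit path, including when an exception is thrown mid-block, which is
    // exactly the situation ecj flags in CodecUtil.checkFooter-style code.
    static int readFirstByte(byte[] data) {
        try (InputStream in = new ByteArrayInputStream(data)) {
            return in.read();
        } catch (IOException e) {
            // ByteArrayInputStream never actually throws, but
            // InputStream.read() declares IOException.
            throw new UncheckedIOException(e);
        }
    }

    // "Unlikely argument type": Map.get takes Object, so passing a key of an
    // unrelated class compiles fine but can never match an entry.
    static Integer unlikelyLookup() {
        Map<String, Integer> byName = new HashMap<>();
        byName.put("one", 1);
        StringBuilder wrongType = new StringBuilder("one"); // not a String
        return byName.get(wrongType); // always null; ecj warns here
    }

    public static void main(String[] args) {
        System.out.println(readFirstByte(new byte[] {42}));
        System.out.println(unlikelyLookup());
    }
}
```

The Operations.java warning above follows the second pattern: a `SortedIntSet` is passed to a map keyed by `SortedIntSet.FrozenIntSet`, which the compiler accepts but flags as unlikely to match.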

BUILD FAILED
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/build.xml:634: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/build.xml:101: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build.xml:201: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/common-build.xml:2127: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/common-build.xml:2166: Compile failed; see the compiler error output for details.

Total time: 35 minutes 24 seconds
Build step 'Invoke Ant' marked build as failure
Archiving artifacts
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
[WARNINGS] Skipping publisher since build result is FAILURE
Recording test results
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2

[JENKINS] Lucene-Solr-8.x-Linux (64bit/jdk-12.0.2) - Build # 2922 - Still Failing!

Posted by Policeman Jenkins Server <je...@thetaphi.de>.
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-8.x-Linux/2922/
Java: 64bit/jdk-12.0.2 -XX:-UseCompressedOops -XX:+UseG1GC

1 tests failed.
FAILED:  org.apache.solr.index.hdfs.CheckHdfsIndexTest.doTest

Error Message:
Error from server at http://127.0.0.1:43925/xyl/collection1: java.lang.NullPointerException

Stack Trace:
org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at http://127.0.0.1:43925/xyl/collection1: java.lang.NullPointerException
	at org.apache.solr.handler.admin.SystemInfoHandler.getSecurityInfo(SystemInfoHandler.java:326)
	at org.apache.solr.handler.admin.SystemInfoHandler.handleRequestBody(SystemInfoHandler.java:146)
	at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:211)
	at org.apache.solr.core.SolrCore.execute(SolrCore.java:2600)
	at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:803)
	at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:582)
	at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:432)
	at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:362)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)
	at org.apache.solr.client.solrj.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:166)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:545)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1610)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1300)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1580)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1215)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:322)
	at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:767)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:500)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:547)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:273)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
	at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938)
	at java.base/java.lang.Thread.run(Thread.java:835)

	at __randomizedtesting.SeedInfo.seed([60B56001348939CA:C7F1D8A559322A73]:0)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:665)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:265)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:248)
	at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:211)
	at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:1003)
	at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:1018)
	at org.apache.solr.index.hdfs.CheckHdfsIndexTest.doTest(CheckHdfsIndexTest.java:120)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:567)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988)
	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1081)
	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:1053)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
	at java.base/java.lang.Thread.run(Thread.java:835)
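The root cause here is a server-side NullPointerException inside SystemInfoHandler.getSecurityInfo, which the client surfaces as a RemoteSolrException. The defensive shape for this kind of failure is a null guard before dereferencing request state — a generic sketch with hypothetical names, not the actual Solr handler code:

```java
public class Main {
    // Hypothetical stand-in for a handler helper that may receive a null
    // security context, mirroring the NPE seen in getSecurityInfo above.
    // Guarding the nullable input turns a 500/NPE into a sane default.
    static String describeSecurity(String authScheme) {
        if (authScheme == null) {
            return "security disabled";
        }
        return "auth=" + authScheme;
    }

    public static void main(String[] args) {
        System.out.println(describeSecurity(null));
        System.out.println(describeSecurity("BasicAuth"));
    }
}
```

With a guard like this, an unconfigured security plugin would yield a normal response instead of the NullPointerException that fails CheckHdfsIndexTest.doTest.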




Build Log:
[...truncated 16289 lines...]
   [junit4] Suite: org.apache.solr.index.hdfs.CheckHdfsIndexTest
   [junit4]   2> 1013152 INFO  (SUITE-CheckHdfsIndexTest-seed#[60B56001348939CA]-worker) [     ] o.a.s.SolrTestCase Setting 'solr.default.confdir' system property to test-framework derived value of '/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/server/solr/configsets/_default/conf'
   [junit4]   2> 1013152 INFO  (SUITE-CheckHdfsIndexTest-seed#[60B56001348939CA]-worker) [     ] o.a.s.SolrTestCaseJ4 Created dataDir: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J5/temp/solr.index.hdfs.CheckHdfsIndexTest_60B56001348939CA-001/data-dir-150-001
   [junit4]   2> 1013152 WARN  (SUITE-CheckHdfsIndexTest-seed#[60B56001348939CA]-worker) [     ] o.a.s.SolrTestCaseJ4 startTrackingSearchers: numOpens=2 numCloses=2
   [junit4]   2> 1013152 INFO  (SUITE-CheckHdfsIndexTest-seed#[60B56001348939CA]-worker) [     ] o.a.s.SolrTestCaseJ4 Using PointFields (NUMERIC_POINTS_SYSPROP=true) w/NUMERIC_DOCVALUES_SYSPROP=false
   [junit4]   2> 1013153 INFO  (SUITE-CheckHdfsIndexTest-seed#[60B56001348939CA]-worker) [     ] o.a.s.SolrTestCaseJ4 Randomized ssl (false) and clientAuth (false) via: @org.apache.solr.util.RandomizeSSL(reason="", ssl=0.0/0.0, value=0.0/0.0, clientAuth=0.0/0.0)
   [junit4]   2> 1013153 INFO  (SUITE-CheckHdfsIndexTest-seed#[60B56001348939CA]-worker) [     ] o.a.s.SolrTestCaseJ4 SecureRandom sanity checks: test.solr.allowed.securerandom=null & java.security.egd=file:/dev/./urandom
   [junit4]   2> 1013153 INFO  (SUITE-CheckHdfsIndexTest-seed#[60B56001348939CA]-worker) [     ] o.a.s.BaseDistributedSearchTestCase Setting hostContext system property: /xyl/
   [junit4]   1> Formatting using clusterid: testClusterID
   [junit4]   2> 1013210 WARN  (SUITE-CheckHdfsIndexTest-seed#[60B56001348939CA]-worker) [     ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
   [junit4]   2> 1013213 INFO  (SUITE-CheckHdfsIndexTest-seed#[60B56001348939CA]-worker) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 12.0.2+10
   [junit4]   2> 1013226 INFO  (SUITE-CheckHdfsIndexTest-seed#[60B56001348939CA]-worker) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 1013226 INFO  (SUITE-CheckHdfsIndexTest-seed#[60B56001348939CA]-worker) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 1013226 INFO  (SUITE-CheckHdfsIndexTest-seed#[60B56001348939CA]-worker) [     ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 1013227 INFO  (SUITE-CheckHdfsIndexTest-seed#[60B56001348939CA]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@2045f7a8{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,AVAILABLE}
   [junit4]   2> 1013338 INFO  (SUITE-CheckHdfsIndexTest-seed#[60B56001348939CA]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.w.WebAppContext@6f5ec2bb{hdfs,/,file:///home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J5/temp/jetty-localhost_localdomain-35267-hadoop-hdfs-3_2_0-tests_jar-_-any-13738145209157207300.dir/webapp/,AVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/hdfs}
   [junit4]   2> 1013339 INFO  (SUITE-CheckHdfsIndexTest-seed#[60B56001348939CA]-worker) [     ] o.e.j.s.AbstractConnector Started ServerConnector@7266150f{HTTP/1.1, (http/1.1)}{localhost.localdomain:35267}
   [junit4]   2> 1013339 INFO  (SUITE-CheckHdfsIndexTest-seed#[60B56001348939CA]-worker) [     ] o.e.j.s.Server Started @1013368ms
   [junit4]   2> 1013454 WARN  (SUITE-CheckHdfsIndexTest-seed#[60B56001348939CA]-worker) [     ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
   [junit4]   2> 1013455 INFO  (SUITE-CheckHdfsIndexTest-seed#[60B56001348939CA]-worker) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 12.0.2+10
   [junit4]   2> 1013468 INFO  (SUITE-CheckHdfsIndexTest-seed#[60B56001348939CA]-worker) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 1013468 INFO  (SUITE-CheckHdfsIndexTest-seed#[60B56001348939CA]-worker) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 1013468 INFO  (SUITE-CheckHdfsIndexTest-seed#[60B56001348939CA]-worker) [     ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 1013472 INFO  (SUITE-CheckHdfsIndexTest-seed#[60B56001348939CA]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@52386fa9{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,AVAILABLE}
   [junit4]   2> 1013580 INFO  (SUITE-CheckHdfsIndexTest-seed#[60B56001348939CA]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.w.WebAppContext@1790f46b{datanode,/,file:///home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J5/temp/jetty-localhost-33473-hadoop-hdfs-3_2_0-tests_jar-_-any-10084854464950356495.dir/webapp/,AVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/datanode}
   [junit4]   2> 1013581 INFO  (SUITE-CheckHdfsIndexTest-seed#[60B56001348939CA]-worker) [     ] o.e.j.s.AbstractConnector Started ServerConnector@2b5ccf6a{HTTP/1.1, (http/1.1)}{localhost:33473}
   [junit4]   2> 1013581 INFO  (SUITE-CheckHdfsIndexTest-seed#[60B56001348939CA]-worker) [     ] o.e.j.s.Server Started @1013610ms
   [junit4]   2> 1013823 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0xebdd5c36ef6765a0: Processing first storage report for DS-356da474-83a5-4343-959c-bfd7007c4f82 from datanode c848fd4e-120c-4cd1-b6f4-eaf449ec1326
   [junit4]   2> 1013823 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0xebdd5c36ef6765a0: from storage DS-356da474-83a5-4343-959c-bfd7007c4f82 node DatanodeRegistration(127.0.0.1:43453, datanodeUuid=c848fd4e-120c-4cd1-b6f4-eaf449ec1326, infoPort=38001, infoSecurePort=0, ipcPort=46877, storageInfo=lv=-57;cid=testClusterID;nsid=759510520;c=1588261675183), blocks: 0, hasStaleStorage: true, processing time: 0 msecs, invalidatedBlocks: 0
   [junit4]   2> 1013823 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0xebdd5c36ef6765a0: Processing first storage report for DS-f2673fea-2979-4375-9892-568e97a9bc5f from datanode c848fd4e-120c-4cd1-b6f4-eaf449ec1326
   [junit4]   2> 1013823 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0xebdd5c36ef6765a0: from storage DS-f2673fea-2979-4375-9892-568e97a9bc5f node DatanodeRegistration(127.0.0.1:43453, datanodeUuid=c848fd4e-120c-4cd1-b6f4-eaf449ec1326, infoPort=38001, infoSecurePort=0, ipcPort=46877, storageInfo=lv=-57;cid=testClusterID;nsid=759510520;c=1588261675183), blocks: 0, hasStaleStorage: false, processing time: 0 msecs, invalidatedBlocks: 0
   [junit4]   2> 1013879 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.ZkTestServer STARTING ZK TEST SERVER
   [junit4]   2> 1013880 INFO  (ZkTestServer Run Thread) [     ] o.a.s.c.ZkTestServer client port:0.0.0.0/0.0.0.0:0
   [junit4]   2> 1013880 INFO  (ZkTestServer Run Thread) [     ] o.a.s.c.ZkTestServer Starting server
   [junit4]   2> 1013980 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.ZkTestServer start zk server on port:39119
   [junit4]   2> 1013980 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.ZkTestServer waitForServerUp: 127.0.0.1:39119
   [junit4]   2> 1013980 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.ZkTestServer parse host and port list: 127.0.0.1:39119
   [junit4]   2> 1013980 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.ZkTestServer connecting to 127.0.0.1 39119
   [junit4]   2> 1013981 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1013983 INFO  (zkConnectionManagerCallback-13811-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1013983 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1013989 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1013990 INFO  (zkConnectionManagerCallback-13813-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1013990 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1013991 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/solrconfig-tlog.xml to /configs/conf1/solrconfig.xml
   [junit4]   2> 1013996 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/schema.xml to /configs/conf1/schema.xml
   [junit4]   2> 1014004 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/solrconfig.snippet.randomindexconfig.xml to /configs/conf1/solrconfig.snippet.randomindexconfig.xml
   [junit4]   2> 1014005 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/stopwords.txt to /configs/conf1/stopwords.txt
   [junit4]   2> 1014006 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/protwords.txt to /configs/conf1/protwords.txt
   [junit4]   2> 1014007 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/currency.xml to /configs/conf1/currency.xml
   [junit4]   2> 1014007 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/enumsConfig.xml to /configs/conf1/enumsConfig.xml
   [junit4]   2> 1014008 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/open-exchange-rates.json to /configs/conf1/open-exchange-rates.json
   [junit4]   2> 1014008 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/mapping-ISOLatin1Accent.txt to /configs/conf1/mapping-ISOLatin1Accent.txt
   [junit4]   2> 1014009 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/old_synonyms.txt to /configs/conf1/old_synonyms.txt
   [junit4]   2> 1014009 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/synonyms.txt to /configs/conf1/synonyms.txt
   [junit4]   2> 1014010 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Will use NRT replicas unless explicitly asked otherwise
   [junit4]   2> 1014093 WARN  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 1014093 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 1014093 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ...
   [junit4]   2> 1014093 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 12.0.2+10
   [junit4]   2> 1014120 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 1014120 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 1014120 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 1014132 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@23da853{/xyl,null,AVAILABLE}
   [junit4]   2> 1014136 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.e.j.s.AbstractConnector Started ServerConnector@7e15770f{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:35629}
   [junit4]   2> 1014136 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.e.j.s.Server Started @1014165ms
   [junit4]   2> 1014136 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/xyl, solr.data.dir=hdfs://localhost.localdomain:45421/hdfs__localhost.localdomain_45421__home_jenkins_workspace_Lucene-Solr-8.x-Linux_solr_build_solr-core_test_J5_temp_solr.index.hdfs.CheckHdfsIndexTest_60B56001348939CA-001_tempDir-002_control_data, hostPort=35629, coreRootDirectory=/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J5/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J5/temp/solr.index.hdfs.CheckHdfsIndexTest_60B56001348939CA-001/control-001/cores}
   [junit4]   2> 1014136 ERROR (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 1014136 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 1014136 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 8.6.0
   [junit4]   2> 1014136 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 1014136 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr
   [junit4]   2> 1014136 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-04-30T15:47:56.128707Z
   [junit4]   2> 1014140 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1014152 INFO  (zkConnectionManagerCallback-13815-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1014152 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1014253 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 1014253 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.SolrXmlConfig Loading container configuration from /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J5/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J5/temp/solr.index.hdfs.CheckHdfsIndexTest_60B56001348939CA-001/control-001/solr.xml
   [junit4]   2> 1014256 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 1014256 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 1014257 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@65767615, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 1014291 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=false]
   [junit4]   2> 1014291 WARN  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@6075d8fd[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1014291 WARN  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@6075d8fd[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1014293 WARN  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@7981f274[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1014293 WARN  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@7981f274[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1014294 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:39119/solr
   [junit4]   2> 1014295 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1014296 INFO  (zkConnectionManagerCallback-13826-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1014296 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1014413 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [n:127.0.0.1:35629_xyl     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1014428 INFO  (zkConnectionManagerCallback-13828-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1014428 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [n:127.0.0.1:35629_xyl     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1014514 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [n:127.0.0.1:35629_xyl     ] o.a.s.c.OverseerElectionContext I am going to be the leader 127.0.0.1:35629_xyl
   [junit4]   2> 1014515 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [n:127.0.0.1:35629_xyl     ] o.a.s.c.Overseer Overseer (id=72188436999503876-127.0.0.1:35629_xyl-n_0000000000) starting
   [junit4]   2> 1014519 INFO  (OverseerStateUpdate-72188436999503876-127.0.0.1:35629_xyl-n_0000000000) [n:127.0.0.1:35629_xyl     ] o.a.s.c.Overseer Starting to work on the main queue : 127.0.0.1:35629_xyl
   [junit4]   2> 1014524 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [n:127.0.0.1:35629_xyl     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:35629_xyl
   [junit4]   2> 1014525 INFO  (zkCallback-13827-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 1014527 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [n:127.0.0.1:35629_xyl     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 1014528 WARN  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [n:127.0.0.1:35629_xyl     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info
   [junit4]   2> 1014563 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [n:127.0.0.1:35629_xyl     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 1014581 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [n:127.0.0.1:35629_xyl     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@65767615
   [junit4]   2> 1014589 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [n:127.0.0.1:35629_xyl     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@65767615
   [junit4]   2> 1014589 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [n:127.0.0.1:35629_xyl     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@65767615
   [junit4]   2> 1014590 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [n:127.0.0.1:35629_xyl     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J5/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J5/temp/solr.index.hdfs.CheckHdfsIndexTest_60B56001348939CA-001/control-001/cores
   [junit4]   2> 1014614 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1014615 INFO  (zkConnectionManagerCallback-13845-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1014615 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1014616 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 1014616 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:39119/solr ready
   [junit4]   2> 1014632 INFO  (qtp590178597-23669) [n:127.0.0.1:35629_xyl     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params collection.configName=conf1&name=control_collection&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:35629_xyl&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 1014640 INFO  (OverseerThreadFactory-13835-thread-1-processing-n:127.0.0.1:35629_xyl) [n:127.0.0.1:35629_xyl     ] o.a.s.c.a.c.CreateCollectionCmd Create collection control_collection
   [junit4]   2> 1014746 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 1014760 INFO  (qtp590178597-23673) [n:127.0.0.1:35629_xyl     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 1014768 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl    x:control_collection_shard1_replica_n1 ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&collection=control_collection&shard=shard1&wt=javabin&version=2&replicaType=NRT
   [junit4]   2> 1014768 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl    x:control_collection_shard1_replica_n1 ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 1015781 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 8.6.0
   [junit4]   2> 1015800 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.IndexSchema Schema name=test
   [junit4]   2> 1015942 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
   [junit4]   2> 1016125 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.CoreContainer Creating SolrCore 'control_collection_shard1_replica_n1' using configuration from configset conf1, trusted=true
   [junit4]   2> 1016156 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.control_collection.shard1.replica_n1' (registry 'solr.core.control_collection.shard1.replica_n1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@65767615
   [junit4]   2> 1016156 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory solr.hdfs.home=hdfs://localhost.localdomain:45421/solr_hdfs_home
   [junit4]   2> 1016156 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Solr Kerberos Authentication disabled
   [junit4]   2> 1016156 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrCore [[control_collection_shard1_replica_n1] ] Opening new SolrCore at [/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J5/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J5/temp/solr.index.hdfs.CheckHdfsIndexTest_60B56001348939CA-001/control-001/cores/control_collection_shard1_replica_n1], dataDir=[hdfs://localhost.localdomain:45421/solr_hdfs_home/control_collection/core_node2/data/]
   [junit4]   2> 1016173 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:45421/solr_hdfs_home/control_collection/core_node2/data/snapshot_metadata
   [junit4]   2> 1016187 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 1016187 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [4194304] will allocate [1] slabs and use ~[4194304] bytes
   [junit4]   2> 1016187 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Creating new single instance HDFS BlockCache
   [junit4]   2> 1016196 WARN  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.h.HdfsDirectory The NameNode is in SafeMode - Solr will wait 5 seconds and try again.
   [junit4]   2> 1021200 WARN  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.h.HdfsDirectory The NameNode is in SafeMode - Solr will wait 5 seconds and try again.
   [junit4]   2> 1026219 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 1026224 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:45421/solr_hdfs_home/control_collection/core_node2/data
   [junit4]   2> 1026245 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:45421/solr_hdfs_home/control_collection/core_node2/data/index
   [junit4]   2> 1026253 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 1026253 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [4194304] will allocate [1] slabs and use ~[4194304] bytes
   [junit4]   2> 1026253 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Creating new single instance HDFS BlockCache
   [junit4]   2> 1026258 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 1026259 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=28, maxMergeAtOnceExplicit=15, maxMergedSegmentMB=55.5341796875, floorSegmentMB=1.0830078125, forceMergeDeletesPctAllowed=28.44677827812134, segmentsPerTier=14.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=1.0, deletesPctAllowed=45.913815406180746]
   [junit4]   2> 1026722 WARN  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 1026789 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.HdfsUpdateLog
   [junit4]   2> 1026789 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 1026789 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.HdfsUpdateLog Initializing HdfsUpdateLog: tlogDfsReplication=2
   [junit4]   2> 1026801 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 1026801 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 1026802 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=30, maxMergeAtOnceExplicit=11, maxMergedSegmentMB=38.6669921875, floorSegmentMB=0.42578125, forceMergeDeletesPctAllowed=12.4735312021117, segmentsPerTier=38.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=1.0, deletesPctAllowed=46.76465411586561]
   [junit4]   2> 1026813 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.SolrIndexSearcher Opening [Searcher@57da68ee[control_collection_shard1_replica_n1] main]
   [junit4]   2> 1026816 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 1026816 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 1026817 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms
   [junit4]   2> 1026817 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1665413088604585984
   [junit4]   2> 1026835 INFO  (searcherExecutor-13847-thread-1-processing-n:127.0.0.1:35629_xyl x:control_collection_shard1_replica_n1 c:control_collection s:shard1) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrCore [control_collection_shard1_replica_n1] Registered new searcher Searcher@57da68ee[control_collection_shard1_replica_n1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 1026837 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ZkShardTerms Successful update of terms at /collections/control_collection/terms/shard1 to Terms{values={core_node2=0}, version=0}
   [junit4]   2> 1026837 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/control_collection/leaders/shard1
   [junit4]   2> 1026838 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 1026838 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 1026838 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:35629/xyl/control_collection_shard1_replica_n1/
   [junit4]   2> 1026839 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 1026839 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy http://127.0.0.1:35629/xyl/control_collection_shard1_replica_n1/ has no replicas
   [junit4]   2> 1026839 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/control_collection/leaders/shard1/leader after winning as /collections/control_collection/leader_elect/shard1/election/72188436999503876-core_node2-n_0000000000
   [junit4]   2> 1026840 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:35629/xyl/control_collection_shard1_replica_n1/ shard1
   [junit4]   2> 1026841 INFO  (zkCallback-13827-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 1026844 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 1026847 INFO  (zkCallback-13827-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 1026848 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf1&newCollection=true&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&collection=control_collection&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=12079
   [junit4]   2> 1026852 INFO  (qtp590178597-23669) [n:127.0.0.1:35629_xyl     ] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas
   [junit4]   2> 1026949 INFO  (zkCallback-13827-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 1026949 INFO  (zkCallback-13827-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 1026950 INFO  (zkCallback-13827-thread-3) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 1026950 INFO  (qtp590178597-23669) [n:127.0.0.1:35629_xyl     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=control_collection&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:35629_xyl&wt=javabin&version=2} status=0 QTime=12318
   [junit4]   2> 1026950 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Waiting to see 1 active replicas in collection: control_collection
   [junit4]   2> 1027063 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1027064 INFO  (zkConnectionManagerCallback-13856-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1027064 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1027064 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 1027065 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:39119/solr ready
   [junit4]   2> 1027065 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.ChaosMonkey monkey: init - expire sessions:false cause connection loss:false
   [junit4]   2> 1027065 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params collection.configName=conf1&name=collection1&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=&stateFormat=2&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 1027067 INFO  (OverseerThreadFactory-13835-thread-2-processing-n:127.0.0.1:35629_xyl) [n:127.0.0.1:35629_xyl     ] o.a.s.c.a.c.CreateCollectionCmd Create collection collection1
   [junit4]   2> 1027067 INFO  (OverseerCollectionConfigSetProcessor-72188436999503876-127.0.0.1:35629_xyl-n_0000000000) [n:127.0.0.1:35629_xyl     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000000 doesn't exist. Requestor may have disconnected from ZooKeeper
   [junit4]   2> 1027268 WARN  (OverseerThreadFactory-13835-thread-2-processing-n:127.0.0.1:35629_xyl) [n:127.0.0.1:35629_xyl     ] o.a.s.c.a.c.CreateCollectionCmd It is unusual to create a collection (collection1) without cores.
   [junit4]   2> 1027269 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl     ] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas
   [junit4]   2> 1027269 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=collection1&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=&stateFormat=2&wt=javabin&version=2} status=0 QTime=203
   [junit4]   2> 1027270 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 1027270 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 1027270 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 1027270 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 1027270 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 1027270 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 1027270 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Creating jetty instances pullReplicaCount=0 numOtherReplicas=1
   [junit4]   2> 1027368 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 1 in directory /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J5/temp/solr.index.hdfs.CheckHdfsIndexTest_60B56001348939CA-001/shard-1-001 of type NRT for shard1
   [junit4]   2> 1027369 WARN  (closeThreadPool-13857-thread-1) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 1027369 INFO  (closeThreadPool-13857-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 1027369 INFO  (closeThreadPool-13857-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ...
   [junit4]   2> 1027369 INFO  (closeThreadPool-13857-thread-1) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 12.0.2+10
   [junit4]   2> 1027374 INFO  (closeThreadPool-13857-thread-1) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 1027374 INFO  (closeThreadPool-13857-thread-1) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 1027374 INFO  (closeThreadPool-13857-thread-1) [     ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 1027374 INFO  (closeThreadPool-13857-thread-1) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@7e39aaab{/xyl,null,AVAILABLE}
   [junit4]   2> 1027375 INFO  (closeThreadPool-13857-thread-1) [     ] o.e.j.s.AbstractConnector Started ServerConnector@6e22213c{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:38559}
   [junit4]   2> 1027375 INFO  (closeThreadPool-13857-thread-1) [     ] o.e.j.s.Server Started @1027404ms
   [junit4]   2> 1027375 INFO  (closeThreadPool-13857-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/xyl, solrconfig=solrconfig.xml, solr.data.dir=hdfs://localhost.localdomain:45421/hdfs__localhost.localdomain_45421__home_jenkins_workspace_Lucene-Solr-8.x-Linux_solr_build_solr-core_test_J5_temp_solr.index.hdfs.CheckHdfsIndexTest_60B56001348939CA-001_tempDir-002_jetty1, hostPort=38559, coreRootDirectory=/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J5/temp/solr.index.hdfs.CheckHdfsIndexTest_60B56001348939CA-001/shard-1-001/cores, replicaType=NRT}
   [junit4]   2> 1027375 ERROR (closeThreadPool-13857-thread-1) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 1027375 INFO  (closeThreadPool-13857-thread-1) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 1027375 INFO  (closeThreadPool-13857-thread-1) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 8.6.0
   [junit4]   2> 1027375 INFO  (closeThreadPool-13857-thread-1) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 1027375 INFO  (closeThreadPool-13857-thread-1) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr
   [junit4]   2> 1027375 INFO  (closeThreadPool-13857-thread-1) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-04-30T15:48:09.367466Z
   [junit4]   2> 1027376 INFO  (closeThreadPool-13857-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1027377 INFO  (zkConnectionManagerCallback-13859-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1027377 INFO  (closeThreadPool-13857-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1027478 INFO  (closeThreadPool-13857-thread-1) [     ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 1027478 INFO  (closeThreadPool-13857-thread-1) [     ] o.a.s.c.SolrXmlConfig Loading container configuration from /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J5/temp/solr.index.hdfs.CheckHdfsIndexTest_60B56001348939CA-001/shard-1-001/solr.xml
   [junit4]   2> 1027481 INFO  (closeThreadPool-13857-thread-1) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 1027481 INFO  (closeThreadPool-13857-thread-1) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 1027483 INFO  (closeThreadPool-13857-thread-1) [     ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@65767615, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 1027575 INFO  (closeThreadPool-13857-thread-1) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=false]
   [junit4]   2> 1027577 WARN  (closeThreadPool-13857-thread-1) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@b13e669[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1027577 WARN  (closeThreadPool-13857-thread-1) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@b13e669[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1027596 WARN  (closeThreadPool-13857-thread-1) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@1aebf60d[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1027596 WARN  (closeThreadPool-13857-thread-1) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@1aebf60d[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1027597 INFO  (closeThreadPool-13857-thread-1) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:39119/solr
   [junit4]   2> 1027601 INFO  (closeThreadPool-13857-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1027601 INFO  (zkConnectionManagerCallback-13870-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1027601 INFO  (closeThreadPool-13857-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1027703 INFO  (closeThreadPool-13857-thread-1) [n:127.0.0.1:38559_xyl     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1027710 INFO  (zkConnectionManagerCallback-13872-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1027710 INFO  (closeThreadPool-13857-thread-1) [n:127.0.0.1:38559_xyl     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1027714 INFO  (closeThreadPool-13857-thread-1) [n:127.0.0.1:38559_xyl     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 1027721 INFO  (closeThreadPool-13857-thread-1) [n:127.0.0.1:38559_xyl     ] o.a.s.c.ZkController Publish node=127.0.0.1:38559_xyl as DOWN
   [junit4]   2> 1027722 INFO  (closeThreadPool-13857-thread-1) [n:127.0.0.1:38559_xyl     ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 1027722 INFO  (closeThreadPool-13857-thread-1) [n:127.0.0.1:38559_xyl     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:38559_xyl
   [junit4]   2> 1027722 INFO  (zkCallback-13827-thread-3) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 1027723 INFO  (zkCallback-13871-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 1027724 INFO  (zkCallback-13855-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 1027724 INFO  (closeThreadPool-13857-thread-1) [n:127.0.0.1:38559_xyl     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 1027724 WARN  (closeThreadPool-13857-thread-1) [n:127.0.0.1:38559_xyl     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info
   [junit4]   2> 1027735 INFO  (closeThreadPool-13857-thread-1) [n:127.0.0.1:38559_xyl     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 1027747 INFO  (closeThreadPool-13857-thread-1) [n:127.0.0.1:38559_xyl     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@65767615
   [junit4]   2> 1027753 INFO  (closeThreadPool-13857-thread-1) [n:127.0.0.1:38559_xyl     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@65767615
   [junit4]   2> 1027753 INFO  (closeThreadPool-13857-thread-1) [n:127.0.0.1:38559_xyl     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@65767615
   [junit4]   2> 1027753 INFO  (closeThreadPool-13857-thread-1) [n:127.0.0.1:38559_xyl     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J5/temp/solr.index.hdfs.CheckHdfsIndexTest_60B56001348939CA-001/shard-1-001/cores
   [junit4]   2> 1027763 INFO  (closeThreadPool-13857-thread-1) [     ] o.a.s.c.AbstractFullDistribZkTestBase waitForLiveNode: 127.0.0.1:38559_xyl
   [junit4]   2> 1027765 INFO  (qtp590178597-23673) [n:127.0.0.1:35629_xyl     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :addreplica with params node=127.0.0.1:38559_xyl&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 1027766 INFO  (OverseerCollectionConfigSetProcessor-72188436999503876-127.0.0.1:35629_xyl-n_0000000000) [n:127.0.0.1:35629_xyl     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000002 doesn't exist. Requestor may have disconnected from ZooKeeper
   [junit4]   2> 1027770 INFO  (qtp590178597-23669) [n:127.0.0.1:35629_xyl     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={wt=javabin&version=2&key=solr.core.control_collection.shard1.replica_n1:INDEX.sizeInBytes} status=0 QTime=2
   [junit4]   2> 1027771 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 1027773 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 1027776 INFO  (qtp590178597-23669) [n:127.0.0.1:35629_xyl     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={wt=javabin&version=2&key=solr.core.control_collection.shard1.replica_n1:INDEX.sizeInBytes} status=0 QTime=1
   [junit4]   2> 1027777 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 1027778 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 1027779 INFO  (OverseerThreadFactory-13835-thread-3-processing-n:127.0.0.1:35629_xyl) [n:127.0.0.1:35629_xyl c:collection1 s:shard1   ] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:38559_xyl for creating new replica of shard shard1 for collection collection1
   [junit4]   2> 1027780 INFO  (OverseerThreadFactory-13835-thread-3-processing-n:127.0.0.1:35629_xyl) [n:127.0.0.1:35629_xyl c:collection1 s:shard1   ] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command.
   [junit4]   2> 1027782 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl    x:collection1_shard1_replica_n1 ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&name=collection1_shard1_replica_n1&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT
   [junit4]   2> 1028793 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 8.6.0
   [junit4]   2> 1028808 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.IndexSchema Schema name=test
   [junit4]   2> 1028886 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
   [junit4]   2> 1028903 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.CoreContainer Creating SolrCore 'collection1_shard1_replica_n1' using configuration from configset conf1, trusted=true
   [junit4]   2> 1028904 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.collection1.shard1.replica_n1' (registry 'solr.core.collection1.shard1.replica_n1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@65767615
   [junit4]   2> 1028904 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory solr.hdfs.home=hdfs://localhost.localdomain:45421/solr_hdfs_home
   [junit4]   2> 1028904 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Solr Kerberos Authentication disabled
   [junit4]   2> 1028904 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrCore [[collection1_shard1_replica_n1] ] Opening new SolrCore at [/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J5/temp/solr.index.hdfs.CheckHdfsIndexTest_60B56001348939CA-001/shard-1-001/cores/collection1_shard1_replica_n1], dataDir=[hdfs://localhost.localdomain:45421/solr_hdfs_home/collection1/core_node2/data/]
   [junit4]   2> 1028905 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:45421/solr_hdfs_home/collection1/core_node2/data/snapshot_metadata
   [junit4]   2> 1028911 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 1028911 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [4194304] will allocate [1] slabs and use ~[4194304] bytes
   [junit4]   2> 1028911 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Creating new single instance HDFS BlockCache
   [junit4]   2> 1028914 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 1028915 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:45421/solr_hdfs_home/collection1/core_node2/data
   [junit4]   2> 1028932 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:45421/solr_hdfs_home/collection1/core_node2/data/index
   [junit4]   2> 1028939 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 1028939 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [4194304] will allocate [1] slabs and use ~[4194304] bytes
   [junit4]   2> 1028939 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Creating new single instance HDFS BlockCache
   [junit4]   2> 1028944 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 1028944 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=28, maxMergeAtOnceExplicit=15, maxMergedSegmentMB=55.5341796875, floorSegmentMB=1.0830078125, forceMergeDeletesPctAllowed=28.44677827812134, segmentsPerTier=14.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=1.0, deletesPctAllowed=45.913815406180746
   [junit4]   2> 1029405 WARN  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 1029462 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.HdfsUpdateLog
   [junit4]   2> 1029462 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 1029462 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.HdfsUpdateLog Initializing HdfsUpdateLog: tlogDfsReplication=2
   [junit4]   2> 1029515 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 1029515 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 1029517 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=30, maxMergeAtOnceExplicit=11, maxMergedSegmentMB=38.6669921875, floorSegmentMB=0.42578125, forceMergeDeletesPctAllowed=12.4735312021117, segmentsPerTier=38.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=1.0, deletesPctAllowed=46.76465411586561
   [junit4]   2> 1029527 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.SolrIndexSearcher Opening [Searcher@6f0fcbb6[collection1_shard1_replica_n1] main]
   [junit4]   2> 1029528 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 1029528 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 1029529 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms
   [junit4]   2> 1029530 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1665413091449372672
   [junit4]   2> 1029532 INFO  (searcherExecutor-13883-thread-1-processing-n:127.0.0.1:38559_xyl x:collection1_shard1_replica_n1 c:collection1 s:shard1) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrCore [collection1_shard1_replica_n1] Registered new searcher Searcher@6f0fcbb6[collection1_shard1_replica_n1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 1029534 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection1/terms/shard1 to Terms{values={core_node2=0}, version=0}
   [junit4]   2> 1029534 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/collection1/leaders/shard1
   [junit4]   2> 1029540 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 1029540 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 1029540 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:38559/xyl/collection1_shard1_replica_n1/
   [junit4]   2> 1029541 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 1029544 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SyncStrategy http://127.0.0.1:38559/xyl/collection1_shard1_replica_n1/ has no replicas
   [junit4]   2> 1029544 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/collection1/leaders/shard1/leader after winning as /collections/collection1/leader_elect/shard1/election/72188436999503881-core_node2-n_0000000000
   [junit4]   2> 1029545 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:38559/xyl/collection1_shard1_replica_n1/ shard1
   [junit4]   2> 1029681 INFO  (zkCallback-13871-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json] for collection [collection1] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 1029682 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 1029685 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf1&name=collection1_shard1_replica_n1&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1903
   [junit4]   2> 1029687 INFO  (qtp590178597-23673) [n:127.0.0.1:35629_xyl c:collection1    ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:38559_xyl&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2} status=0 QTime=1922
   [junit4]   2> 1029688 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Waiting to see 1 active replicas in collection: collection1
   [junit4]   2> 1029688 INFO  (zkCallback-13871-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json] for collection [collection1] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 1029768 INFO  (OverseerCollectionConfigSetProcessor-72188436999503876-127.0.0.1:35629_xyl-n_0000000000) [n:127.0.0.1:35629_xyl     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000004 doesn't exist. Requestor may have disconnected from ZooKeeper
   [junit4]   2> 1029785 INFO  (zkCallback-13855-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json] for collection [collection1] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 1029785 INFO  (zkCallback-13871-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json] for collection [collection1] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 1029785 INFO  (zkCallback-13871-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json] for collection [collection1] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 1029787 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.SolrTestCaseJ4 ###Starting testDeletedDocs
   [junit4]   2> 1029789 INFO  (zkCallback-13871-thread-3) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json] for collection [collection1] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 1070518 WARN  (DataStreamer for file /solr/_6_Lucene85TermVectorsIndexfile_pointers_r.tmp) [     ] o.a.h.h.DataStreamer Caught exception
   [junit4]   2>           => java.lang.InterruptedException
   [junit4]   2> 	at java.base/java.lang.Object.wait(Native Method)
   [junit4]   2> java.lang.InterruptedException: null
   [junit4]   2> 	at java.lang.Object.wait(Native Method) ~[?:?]
   [junit4]   2> 	at java.lang.Thread.join(Thread.java:1308) ~[?:?]
   [junit4]   2> 	at java.lang.Thread.join(Thread.java:1375) [?:?]
   [junit4]   2> 	at org.apache.hadoop.hdfs.DataStreamer.closeResponder(DataStreamer.java:986) ~[hadoop-hdfs-client-3.2.0.jar:?]
   [junit4]   2> 	at org.apache.hadoop.hdfs.DataStreamer.endBlock(DataStreamer.java:640) ~[hadoop-hdfs-client-3.2.0.jar:?]
   [junit4]   2> 	at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:810) ~[hadoop-hdfs-client-3.2.0.jar:?]
   [junit4]   2> 1074568 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={wt=javabin&version=2&key=solr.core.collection1.shard1.replica_n1:INDEX.sizeInBytes&key=solr.core.collection1.shard1.replica_n1:QUERY./select.requests&key=solr.core.collection1.shard1.replica_n1:UPDATE./update.requests} status=0 QTime=2
   [junit4]   2> 1074569 INFO  (qtp1780332711-23731) [n:127.0.0.1:38559_xyl     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={wt=javabin&version=2&key=solr.jvm:os.processCpuLoad&key=solr.node:CONTAINER.fs.coreRoot.usableSpace&key=solr.jvm:os.systemLoadAverage&key=solr.jvm:memory.heap.used} status=0 QTime=0
   [junit4]   2> 1074573 INFO  (qtp590178597-23669) [n:127.0.0.1:35629_xyl     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={wt=javabin&version=2&key=solr.core.control_collection.shard1.replica_n1:QUERY./select.requests&key=solr.core.control_collection.shard1.replica_n1:INDEX.sizeInBytes&key=solr.core.control_collection.shard1.replica_n1:UPDATE./update.requests} status=0 QTime=1
   [junit4]   2> 1074574 INFO  (qtp590178597-23672) [n:127.0.0.1:35629_xyl     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={wt=javabin&version=2&key=solr.jvm:os.processCpuLoad&key=solr.node:CONTAINER.fs.coreRoot.usableSpace&key=solr.jvm:os.systemLoadAverage&key=solr.jvm:memory.heap.used} status=0 QTime=0
   [junit4]   2> 1096659 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.s.h.HdfsDirectory Closing hdfs directory hdfs://localhost.localdomain:45421/solr
   [junit4]   2> 1096662 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[60B56001348939CA]) [     ] o.a.s.SolrTestCaseJ4 ###Ending testDeletedDocs
   [junit4]   2> 1096804 INFO  (closeThreadPool-13890-thread-2) [     ] o.a.s.c.CoreContainer Shutting down CoreContainer instance=973784677
   [junit4]   2> 1096804 INFO  (closeThreadPool-13890-thread-2) [     ] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:35629_xyl
   [junit4]   2> 1096809 INFO  (closeThreadPool-13890-thread-3) [     ] o.a.s.c.CoreContainer Shutting down CoreContainer instance=842490194
   [junit4]   2> 1096809 INFO  (closeThreadPool-13890-thread-3) [     ] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:38559_xyl
   [junit4]   2> 1096809 INFO  (closeThreadPool-13890-thread-2) [     ] o.a.s.c.ZkController Publish this node as DOWN...
   [junit4]   2> 1096809 INFO  (closeThreadPool-13890-thread-2) [     ] o.a.s.c.ZkController Publish node=127.0.0.1:35629_xyl as DOWN
   [junit4]   2> 1096810 INFO  (closeThreadPool-13890-thread-3) [     ] o.a.s.c.ZkController Publish this node as DOWN...
   [junit4]   2> 1096810 INFO  (closeThreadPool-13890-thread-3) [     ] o.a.s.c.ZkController Publish node=127.0.0.1:38559_xyl as DOWN
   [junit4]   2> 1096812 INFO  (zkCallback-13827-thread-5) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 1096824 INFO  (zkCallback-13827-thread-5) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 1096825 INFO  (zkCallback-13827-thread-6) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 1096832 INFO  (coreCloseExecutor-13897-thread-1) [n:127.0.0.1:35629_xyl     ] o.a.s.c.SolrCore [control_collection_shard1_replica_n1]  CLOSING SolrCore org.apache.solr.core.SolrCore@390ec88a
   [junit4]   2> 1096832 INFO  (coreCloseExecutor-13897-thread-1) [n:127.0.0.1:35629_xyl     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.control_collection.shard1.replica_n1 tag=SolrCore@390ec88a
   [junit4]   2> 1096832 INFO  (coreCloseExecutor-13897-thread-1) [n:127.0.0.1:35629_xyl     ] o.a.s.m.r.SolrJmxReporter Closing reporter [org.apache.solr.metrics.reporters.SolrJmxReporter@1624f713: rootName = null, domain = solr.core.control_collection.shard1.replica_n1, service url = null, agent id = null] for registry solr.core.control_collection.shard1.replica_n1/com.codahale.metrics.MetricRegistry@4cce2699
   [junit4]   2> 1096841 INFO  (coreCloseExecutor-13897-thread-1) [n:127.0.0.1:35629_xyl     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.control_collection.shard1.leader tag=SolrCore@390ec88a
   [junit4]   2> 1096842 INFO  (coreCloseExecutor-13897-thread-1) [n:127.0.0.1:35629_xyl     ] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close()  ... SKIPPED (unnecessary).
   [junit4]   2> 1096867 INFO  (coreCloseExecutor-13898-thread-1) [n:127.0.0.1:38559_xyl     ] o.a.s.c.SolrCore [collection1_shard1_replica_n1]  CLOSING SolrCore org.apache.solr.core.SolrCore@2c03f054
   [junit4]   2> 1096867 INFO  (coreCloseExecutor-13898-thread-1) [n:127.0.0.1:38559_xyl     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.collection1.shard1.replica_n1 tag=SolrCore@2c03f054
   [junit4]   2> 1096867 INFO  (coreCloseExecutor-13898-thread-1) [n:127.0.0.1:38559_xyl     ] o.a.s.m.r.SolrJmxReporter Closing reporter [org.apache.solr.metrics.reporters.SolrJmxReporter@75b9d3d2: rootName = null, domain = solr.core.collection1.shard1.replica_n1, service url = null, agent id = null] for registry solr.core.collection1.shard1.replica_n1/com.codahale.metrics.MetricRegistry@2304c7b4
   [junit4]   2> 1096876 INFO  (coreCloseExecutor-13897-thread-1) [n:127.0.0.1:35629_xyl     ] o.a.s.s.h.HdfsDi

[...truncated too long message...]

1:45421) [     ] o.a.h.h.s.d.DataNode Ending block pool service for: Block pool BP-833528286-127.0.0.1-1588261675183 (Datanode Uuid c848fd4e-120c-4cd1-b6f4-eaf449ec1326) service to localhost.localdomain/127.0.0.1:45421
   [junit4]   2> 1150876 INFO  (SUITE-CheckHdfsIndexTest-seed#[60B56001348939CA]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.w.WebAppContext@6f5ec2bb{hdfs,/,null,UNAVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/hdfs}
   [junit4]   2> 1150877 INFO  (SUITE-CheckHdfsIndexTest-seed#[60B56001348939CA]-worker) [     ] o.e.j.s.AbstractConnector Stopped ServerConnector@7266150f{HTTP/1.1, (http/1.1)}{localhost.localdomain:0}
   [junit4]   2> 1150877 INFO  (SUITE-CheckHdfsIndexTest-seed#[60B56001348939CA]-worker) [     ] o.e.j.s.session node0 Stopped scavenging
   [junit4]   2> 1150877 INFO  (SUITE-CheckHdfsIndexTest-seed#[60B56001348939CA]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@2045f7a8{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,UNAVAILABLE}
   [junit4]   2> NOTE: leaving temporary files on disk at: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J5/temp/solr.index.hdfs.CheckHdfsIndexTest_60B56001348939CA-001
   [junit4]   2> Apr 30, 2020 3:50:13 PM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks
   [junit4]   2> WARNING: Will linger awaiting termination of 65 leaked thread(s).
   [junit4]   2> NOTE: test params are: codec=Asserting(Lucene84): {date=PostingsFormat(name=Direct), rnd_b=PostingsFormat(name=LuceneFixedGap), field=PostingsFormat(name=Direct), docid=PostingsFormat(name=LuceneFixedGap), multiDefault=PostingsFormat(name=Direct), _root_=BlockTreeOrds(blocksize=128), titleTokenized=TestBloomFilteredLucenePostings(BloomFilteringPostingsFormat(Lucene84)), id=PostingsFormat(name=LuceneFixedGap), body=PostingsFormat(name=Direct), title=BlockTreeOrds(blocksize=128)}, docValues:{docid_intDV=DocValuesFormat(name=Asserting), range_facet_l_dv=DocValuesFormat(name=Direct), _version_=DocValuesFormat(name=Lucene80), range_facet_i_dv=DocValuesFormat(name=Lucene80), intDvoDefault=DocValuesFormat(name=Direct), titleDV=DocValuesFormat(name=Lucene80), timestamp=DocValuesFormat(name=Lucene80)}, maxPointsInLeafNode=761, maxMBSortInHeap=5.244903108881285, sim=Asserting(org.apache.lucene.search.similarities.AssertingSimilarity@7f4c18d4), locale=so-KE, timezone=Etc/GMT-14
   [junit4]   2> NOTE: Linux 5.3.0-46-generic amd64/AdoptOpenJDK 12.0.2 (64-bit)/cpus=16,threads=4,free=258792688,total=534773760
   [junit4]   2> NOTE: All tests run in this JVM: [CollectionPropsTest, TestBinaryResponseWriter, TestDynamicURP, TestJsonFacetsStatsParsing, TestPolicyCloud, TestChildDocTransformer, TestFileDictionaryLookup, TestDistributedMap, TestManagedSynonymFilterFactory, ConcurrentDeleteAndCreateCollectionTest, TestQueryUtils, TestSchemaResource, DistributedQueryComponentCustomSortTest, LeaderFailureAfterFreshStartTest, DocExpirationUpdateProcessorFactoryTest, DistributedFacetPivotLargeTest, ReturnFieldsTest, TestUnifiedSolrHighlighterWithoutStoredId, TestTlogReplica, TestFoldingMultitermQuery, TestSimpleTrackingShardHandler, TestUseDocValuesAsStored, ConnectionReuseTest, XCJFQueryTest, CoreAdminCreateDiscoverTest, HttpPartitionTest, DistributedIntervalFacetingTest, DynamicMapsTest, TestClassicSimilarityFactory, TestPKIAuthenticationPlugin, HealthCheckHandlerTest, TestSolrCloudWithKerberosAlt, SolrCloudReportersTest, SynonymTokenizerTest, TestPullReplicaErrorHandling, ProtectedTermFilterFactoryTest, TestMergePolicyConfig, PrimitiveFieldTypeTest, TestPHPSerializedResponseWriter, XMLAtomicUpdateMultivalueTest, DeleteShardTest, SolrTestCaseJ4Test, CopyFieldTest, TestZkAclsWithHadoopAuth, SolrLogPostToolTest, HdfsDirectoryFactoryTest, CdcrUpdateLogTest, NodeAddedTriggerIntegrationTest, TestOrdValues, AddSchemaFieldsUpdateProcessorFactoryTest, RoutingToNodesWithPropertiesTest, TestLegacyFieldReuse, SolrRequestParserTest, TestSimUtils, StressHdfsTest, TestSimpleQParserPlugin, CreateCollectionCleanupTest, RequestLoggingTest, V2ApiIntegrationTest, RuleEngineTest, TestDefaultStatsCache, ZkControllerTest, TestFieldCache, RandomizedTaggerTest, TestTrieFacet, CSVRequestHandlerTest, TestRecovery, TestCollationFieldDocValues, CursorMarkTest, TestSimComputePlanAction, TestElisionMultitermQuery, TestHdfsBackupRestoreCore, SpellCheckCollatorWithCollapseTest, AnalysisAfterCoreReloadTest, EchoParamsTest, TestCrossCoreJoin, TestCursorMarkWithoutUniqueKey, TestDistributedGrouping, 
TestDistributedMissingSort, TestDistributedSearch, AddReplicaTest, BasicDistributedZk2Test, ChaosMonkeyNothingIsSafeTest, CreateRoutedAliasTest, DistributedQueueTest, MetricsHistoryIntegrationTest, MetricsHistoryWithAuthIntegrationTest, MissingSegmentRecoveryTest, MoveReplicaHDFSFailoverTest, MoveReplicaHDFSTest, MoveReplicaTest, PeerSyncReplicationTest, TestCloudInspectUtil, TestCloudPhrasesIdentificationComponent, TestCloudPseudoReturnFields, TestCloudRecovery, TestCloudSearcherWarming, TestRequestForwarding, TestTolerantUpdateProcessorRandomCloud, TlogReplayBufferedWhileIndexingTest, ZkFailoverTest, AssignTest, AsyncCallRequestStatusResponseTest, HdfsCollectionsAPIDistributedZkTest, ShardSplitTest, BlobRepositoryCloudTest, TestCodecSupport, TestConfigSets, TestCorePropertiesReload, TestQuerySenderNoQuery, TestShardHandlerFactory, TestSolrDeletionPolicy1, SearchHandlerTest, TestRestoreCore, InfoHandlerTest, MBeansHandlerTest, TestHttpShardHandlerFactory, TaggerTest, HighlighterConfigTest, CheckHdfsIndexTest]
   [junit4] Completed [858/907 (1!)] on J5 in 140.82s, 5 tests, 1 error, 1 skipped <<< FAILURES!

[...truncated 38266 lines...]
-ecj-javadoc-lint-src:
    [mkdir] Created dir: /tmp/ecj1610761845
 [ecj-lint] Compiling 931 source files to /tmp/ecj1610761845
 [ecj-lint] ----------
 [ecj-lint] 1. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/codecs/CodecUtil.java (at line 523)
 [ecj-lint] 	throw new CorruptIndexException("misplaced codec footer (file truncated?): length=" + in.length() + " but footerLength==" + footerLength(), input);
 [ecj-lint] 	^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 [ecj-lint] Resource leak: 'in' is not closed at this location
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 2. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingStoredFieldsReader.java (at line 166)
 [ecj-lint] 	FieldsIndexReader fieldsIndexReader = new FieldsIndexReader(d, si.name, segmentSuffix, INDEX_EXTENSION_PREFIX, INDEX_CODEC_NAME, si.getId());
 [ecj-lint] 	                  ^^^^^^^^^^^^^^^^^
 [ecj-lint] Resource leak: 'fieldsIndexReader' is never closed
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 3. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingTermVectorsReader.java (at line 148)
 [ecj-lint] 	FieldsIndexReader fieldsIndexReader = new FieldsIndexReader(d, si.name, segmentSuffix, VECTORS_INDEX_EXTENSION_PREFIX, VECTORS_INDEX_CODEC_NAME, si.getId());
 [ecj-lint] 	                  ^^^^^^^^^^^^^^^^^
 [ecj-lint] Resource leak: 'fieldsIndexReader' is never closed
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 4. ERROR in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/search/IndexSearcher.java (at line 50)
 [ecj-lint] 	import org.apache.lucene.util.automaton.ByteRunAutomaton;
 [ecj-lint] 	       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 [ecj-lint] The import org.apache.lucene.util.automaton.ByteRunAutomaton is never used
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 5. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/util/automaton/Operations.java (at line 742)
 [ecj-lint] 	Integer q = newstate.get(statesSet);
 [ecj-lint] 	                         ^^^^^^^^^
 [ecj-lint] Unlikely argument type SortedIntSet for get(Object) on a Map<SortedIntSet.FrozenIntSet,Integer>
 [ecj-lint] ----------
 [ecj-lint] 5 problems (1 error, 4 warnings)
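The first three warnings all flag the same shape of bug: a resource is constructed (or still held) on a code path that can exit via `throw` or `return` without closing it, and the fifth warning flags `Map.get(Object)` being called with a key of a type the map cannot contain. The sketch below reproduces both patterns with hypothetical stand-in classes — it is not the actual Lucene code, just a minimal illustration of what ecj is complaining about and of the try-with-resources fix:

```java
import java.io.Closeable;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

public class LintSketch {

    // Hypothetical stand-in for an index input stream; not the real Lucene class.
    static final class FakeIndexInput implements Closeable {
        boolean closed = false;
        long length() { return 42; }
        @Override public void close() { closed = true; }
    }

    // The pattern ecj flags: if the throw fires, 'in' is never closed,
    // hence "Resource leak: 'in' is not closed at this location".
    static FakeIndexInput leaky() throws IOException {
        FakeIndexInput in = new FakeIndexInput();
        if (in.length() < 0) {
            throw new IOException("misplaced codec footer (file truncated?)");
        }
        return in; // caller becomes responsible for closing
    }

    // try-with-resources guarantees close() on every exit path,
    // including exceptional ones, which silences the warning.
    static long safeLength() {
        try (FakeIndexInput in = new FakeIndexInput()) {
            return in.length();
        } // in.close() runs here even if an exception was thrown
    }

    // Warning 5's pitfall: Map.get takes Object, so a key of the wrong
    // type compiles cleanly but can never match and always yields null.
    static Integer unlikelyArgument() {
        Map<String, Integer> states = new HashMap<>();
        states.put("q0", 0);
        return states.get(Integer.valueOf(7)); // wrong key type -> null
    }

    public static void main(String[] args) {
        System.out.println("safeLength = " + safeLength());
        System.out.println("unlikelyArgument = " + unlikelyArgument());
    }
}
```

In the real warnings the fix is less mechanical — `FieldsIndexReader` in the `Compressing*Reader` constructors is intentionally kept open and closed later, so those two are likely candidates for suppression rather than restructuring; only the unused-import error actually breaks the build.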

BUILD FAILED
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/build.xml:634: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/build.xml:101: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build.xml:201: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/common-build.xml:2127: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/common-build.xml:2166: Compile failed; see the compiler error output for details.

Total time: 41 minutes 1 second
Build step 'Invoke Ant' marked build as failure
Archiving artifacts
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
[WARNINGS] Skipping publisher since build result is FAILURE
Recording test results
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any

[JENKINS] Lucene-Solr-8.x-Linux (64bit/jdk-12.0.2) - Build # 2921 - Still Failing!

Posted by Policeman Jenkins Server <je...@thetaphi.de>.
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-8.x-Linux/2921/
Java: 64bit/jdk-12.0.2 -XX:+UseCompressedOops -XX:+UseSerialGC

1 tests failed.
FAILED:  org.apache.solr.index.hdfs.CheckHdfsIndexTest.doTest

Error Message:
Error from server at http://127.0.0.1:39439/_t/collection1: java.lang.NullPointerException
	at org.apache.solr.handler.admin.SystemInfoHandler.getSecurityInfo(SystemInfoHandler.java:326)
	at org.apache.solr.handler.admin.SystemInfoHandler.handleRequestBody(SystemInfoHandler.java:146)
	at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:211)
	at org.apache.solr.core.SolrCore.execute(SolrCore.java:2600)
	at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:803)
	at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:582)
	at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:432)
	at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:362)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)
	at org.apache.solr.client.solrj.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:166)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:545)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1610)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1300)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1580)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1215)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:322)
	at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:767)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:500)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:547)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:273)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
	at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938)
	at java.base/java.lang.Thread.run(Thread.java:835)

Stack Trace:
org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at http://127.0.0.1:39439/_t/collection1: java.lang.NullPointerException
	at org.apache.solr.handler.admin.SystemInfoHandler.getSecurityInfo(SystemInfoHandler.java:326)
	at org.apache.solr.handler.admin.SystemInfoHandler.handleRequestBody(SystemInfoHandler.java:146)
	at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:211)
	at org.apache.solr.core.SolrCore.execute(SolrCore.java:2600)
	at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:803)
	at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:582)
	at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:432)
	at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:362)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)
	at org.apache.solr.client.solrj.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:166)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:545)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1610)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1300)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1580)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1215)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:322)
	at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:767)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:500)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:547)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:273)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
	at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938)
	at java.base/java.lang.Thread.run(Thread.java:835)

	at __randomizedtesting.SeedInfo.seed([9DE6106AF9834F4E:3AA2A8CE94385CF7]:0)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:665)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:265)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:248)
	at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:211)
	at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:1003)
	at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:1018)
	at org.apache.solr.index.hdfs.CheckHdfsIndexTest.doTest(CheckHdfsIndexTest.java:120)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:567)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988)
	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1081)
	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:1053)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
	at java.base/java.lang.Thread.run(Thread.java:835)

Build Log:
[...truncated 14969 lines...]
   [junit4] Suite: org.apache.solr.index.hdfs.CheckHdfsIndexTest
   [junit4]   2> 674118 INFO  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.a.s.SolrTestCase Setting 'solr.default.confdir' system property to test-framework derived value of '/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/server/solr/configsets/_default/conf'
   [junit4]   2> 674118 INFO  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.a.s.SolrTestCaseJ4 SecureRandom sanity checks: test.solr.allowed.securerandom=null & java.security.egd=file:/dev/./urandom
   [junit4]   2> 674119 INFO  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.a.s.SolrTestCaseJ4 Created dataDir: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_9DE6106AF9834F4E-001/data-dir-41-001
   [junit4]   2> 674119 WARN  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.a.s.SolrTestCaseJ4 startTrackingSearchers: numOpens=6 numCloses=6
   [junit4]   2> 674119 INFO  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.a.s.SolrTestCaseJ4 Using TrieFields (NUMERIC_POINTS_SYSPROP=false) w/NUMERIC_DOCVALUES_SYSPROP=true
   [junit4]   2> 674119 INFO  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.a.s.SolrTestCaseJ4 Randomized ssl (false) and clientAuth (true) via: @org.apache.solr.util.RandomizeSSL(reason="", value=0.0/0.0, ssl=0.0/0.0, clientAuth=0.0/0.0)
   [junit4]   2> 674119 INFO  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.a.s.BaseDistributedSearchTestCase Setting hostContext system property: /_t/
   [junit4]   1> Formatting using clusterid: testClusterID
   [junit4]   2> 674867 WARN  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
   [junit4]   2> 674881 INFO  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 12.0.2+10
   [junit4]   2> 674887 INFO  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 674887 INFO  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 674887 INFO  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 674889 INFO  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@42d5e6bf{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,AVAILABLE}
   [junit4]   2> 675042 INFO  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.w.WebAppContext@115a46e6{hdfs,/,file:///home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/jetty-localhost_localdomain-41009-hadoop-hdfs-3_2_0-tests_jar-_-any-15861095362240058289.dir/webapp/,AVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/hdfs}
   [junit4]   2> 675042 INFO  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.e.j.s.AbstractConnector Started ServerConnector@29a71165{HTTP/1.1, (http/1.1)}{localhost.localdomain:41009}
   [junit4]   2> 675042 INFO  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.e.j.s.Server Started @675053ms
   [junit4]   2> 675380 WARN  (StorageLocationChecker thread 0) [     ] o.a.h.u.NativeCodeLoader Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
   [junit4]   2> 675415 WARN  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
   [junit4]   2> 675419 INFO  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 12.0.2+10
   [junit4]   2> 675419 INFO  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 675419 INFO  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 675419 INFO  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 675420 INFO  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@1ac38bef{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,AVAILABLE}
   [junit4]   2> 675553 INFO  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.w.WebAppContext@717c8ce2{datanode,/,file:///home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/jetty-localhost-35971-hadoop-hdfs-3_2_0-tests_jar-_-any-15958930941942190994.dir/webapp/,AVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/datanode}
   [junit4]   2> 675553 INFO  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.e.j.s.AbstractConnector Started ServerConnector@1c200a5b{HTTP/1.1, (http/1.1)}{localhost:35971}
   [junit4]   2> 675553 INFO  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.e.j.s.Server Started @675564ms
   [junit4]   2> 675714 WARN  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
   [junit4]   2> 675715 INFO  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 12.0.2+10
   [junit4]   2> 675722 INFO  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 675722 INFO  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 675722 INFO  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.e.j.s.session node0 Scavenging every 600000ms
   [junit4]   2> 675723 INFO  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@48693126{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,AVAILABLE}
   [junit4]   2> 675825 INFO  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.w.WebAppContext@49db7ad{datanode,/,file:///home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/jetty-localhost-36723-hadoop-hdfs-3_2_0-tests_jar-_-any-2128974269637608975.dir/webapp/,AVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/datanode}
   [junit4]   2> 675825 INFO  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.e.j.s.AbstractConnector Started ServerConnector@61b47557{HTTP/1.1, (http/1.1)}{localhost:36723}
   [junit4]   2> 675825 INFO  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.e.j.s.Server Started @675836ms
   [junit4]   2> 676040 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0xc55cd0eb193fa2ba: Processing first storage report for DS-f338b7c2-e479-4089-8209-ad2e8248a862 from datanode 2f26f45f-9aeb-416f-b343-84a2ae9b6198
   [junit4]   2> 676041 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0xc55cd0eb193fa2ba: from storage DS-f338b7c2-e479-4089-8209-ad2e8248a862 node DatanodeRegistration(127.0.0.1:34767, datanodeUuid=2f26f45f-9aeb-416f-b343-84a2ae9b6198, infoPort=36121, infoSecurePort=0, ipcPort=40225, storageInfo=lv=-57;cid=testClusterID;nsid=1598458588;c=1588253928542), blocks: 0, hasStaleStorage: true, processing time: 1 msecs, invalidatedBlocks: 0
   [junit4]   2> 676041 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0x88b29a1e67b7b1c4: Processing first storage report for DS-670f64b3-4f48-4cc9-896d-e8e89f77d231 from datanode 5ecb6f04-0341-419b-b3e2-17fea2d20036
   [junit4]   2> 676041 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0x88b29a1e67b7b1c4: from storage DS-670f64b3-4f48-4cc9-896d-e8e89f77d231 node DatanodeRegistration(127.0.0.1:45387, datanodeUuid=5ecb6f04-0341-419b-b3e2-17fea2d20036, infoPort=33611, infoSecurePort=0, ipcPort=45275, storageInfo=lv=-57;cid=testClusterID;nsid=1598458588;c=1588253928542), blocks: 0, hasStaleStorage: true, processing time: 0 msecs, invalidatedBlocks: 0
   [junit4]   2> 676041 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0xc55cd0eb193fa2ba: Processing first storage report for DS-895d7b38-e6b9-463e-8cf5-bea39393fbed from datanode 2f26f45f-9aeb-416f-b343-84a2ae9b6198
   [junit4]   2> 676041 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0xc55cd0eb193fa2ba: from storage DS-895d7b38-e6b9-463e-8cf5-bea39393fbed node DatanodeRegistration(127.0.0.1:34767, datanodeUuid=2f26f45f-9aeb-416f-b343-84a2ae9b6198, infoPort=36121, infoSecurePort=0, ipcPort=40225, storageInfo=lv=-57;cid=testClusterID;nsid=1598458588;c=1588253928542), blocks: 0, hasStaleStorage: false, processing time: 0 msecs, invalidatedBlocks: 0
   [junit4]   2> 676041 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0x88b29a1e67b7b1c4: Processing first storage report for DS-5ba18a3e-7210-4fa2-b5e0-8e4ae54c253c from datanode 5ecb6f04-0341-419b-b3e2-17fea2d20036
   [junit4]   2> 676041 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0x88b29a1e67b7b1c4: from storage DS-5ba18a3e-7210-4fa2-b5e0-8e4ae54c253c node DatanodeRegistration(127.0.0.1:45387, datanodeUuid=5ecb6f04-0341-419b-b3e2-17fea2d20036, infoPort=33611, infoSecurePort=0, ipcPort=45275, storageInfo=lv=-57;cid=testClusterID;nsid=1598458588;c=1588253928542), blocks: 0, hasStaleStorage: false, processing time: 0 msecs, invalidatedBlocks: 0
   [junit4]   2> 676157 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.ZkTestServer STARTING ZK TEST SERVER
   [junit4]   2> 676157 INFO  (ZkTestServer Run Thread) [     ] o.a.s.c.ZkTestServer client port:0.0.0.0/0.0.0.0:0
   [junit4]   2> 676157 INFO  (ZkTestServer Run Thread) [     ] o.a.s.c.ZkTestServer Starting server
   [junit4]   2> 676257 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.ZkTestServer start zk server on port:35563
   [junit4]   2> 676257 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.ZkTestServer waitForServerUp: 127.0.0.1:35563
   [junit4]   2> 676257 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.ZkTestServer parse host and port list: 127.0.0.1:35563
   [junit4]   2> 676257 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.ZkTestServer connecting to 127.0.0.1 35563
   [junit4]   2> 676259 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 676260 INFO  (zkConnectionManagerCallback-8107-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 676260 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 676262 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 676265 INFO  (zkConnectionManagerCallback-8109-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 676265 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 676266 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/solrconfig-tlog.xml to /configs/conf1/solrconfig.xml
   [junit4]   2> 676272 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/schema.xml to /configs/conf1/schema.xml
   [junit4]   2> 676277 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/solrconfig.snippet.randomindexconfig.xml to /configs/conf1/solrconfig.snippet.randomindexconfig.xml
   [junit4]   2> 676280 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/stopwords.txt to /configs/conf1/stopwords.txt
   [junit4]   2> 676286 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/protwords.txt to /configs/conf1/protwords.txt
   [junit4]   2> 676286 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/currency.xml to /configs/conf1/currency.xml
   [junit4]   2> 676287 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/enumsConfig.xml to /configs/conf1/enumsConfig.xml
   [junit4]   2> 676288 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/open-exchange-rates.json to /configs/conf1/open-exchange-rates.json
   [junit4]   2> 676290 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/mapping-ISOLatin1Accent.txt to /configs/conf1/mapping-ISOLatin1Accent.txt
   [junit4]   2> 676291 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/old_synonyms.txt to /configs/conf1/old_synonyms.txt
   [junit4]   2> 676292 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/synonyms.txt to /configs/conf1/synonyms.txt
   [junit4]   2> 676292 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Will use NRT replicas unless explicitly asked otherwise
   [junit4]   2> 676848 WARN  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 676848 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 676848 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ...
   [junit4]   2> 676848 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 12.0.2+10
   [junit4]   2> 676850 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 676850 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 676850 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.e.j.s.session node0 Scavenging every 600000ms
   [junit4]   2> 676851 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@17c36778{/_t,null,AVAILABLE}
   [junit4]   2> 676853 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.e.j.s.AbstractConnector Started ServerConnector@74f2fa14{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:32797}
   [junit4]   2> 676853 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.e.j.s.Server Started @676864ms
   [junit4]   2> 676853 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/_t, solr.data.dir=hdfs://localhost.localdomain:46161/hdfs__localhost.localdomain_46161__home_jenkins_workspace_Lucene-Solr-8.x-Linux_solr_build_solr-core_test_J3_temp_solr.index.hdfs.CheckHdfsIndexTest_9DE6106AF9834F4E-001_tempDir-002_control_data, hostPort=32797, coreRootDirectory=/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_9DE6106AF9834F4E-001/control-001/cores}
   [junit4]   2> 676853 ERROR (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 676853 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 676853 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 8.6.0
   [junit4]   2> 676853 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 676853 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr
   [junit4]   2> 676853 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-04-30T13:38:50.710981Z
   [junit4]   2> 676855 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 676856 INFO  (zkConnectionManagerCallback-8111-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 676856 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 676957 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 676957 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.SolrXmlConfig Loading container configuration from /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_9DE6106AF9834F4E-001/control-001/solr.xml
   [junit4]   2> 676961 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 676961 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 676962 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@2600d2f8, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 677019 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=false]
   [junit4]   2> 677020 WARN  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@13f0772a[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 677020 WARN  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@13f0772a[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 677024 WARN  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@51baeb42[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 677024 WARN  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@51baeb42[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 677025 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:35563/solr
   [junit4]   2> 677026 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 677027 INFO  (zkConnectionManagerCallback-8122-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 677027 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 677129 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [n:127.0.0.1:32797__t     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 677130 INFO  (zkConnectionManagerCallback-8124-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 677130 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [n:127.0.0.1:32797__t     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 677187 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [n:127.0.0.1:32797__t     ] o.a.s.c.OverseerElectionContext I am going to be the leader 127.0.0.1:32797__t
   [junit4]   2> 677187 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [n:127.0.0.1:32797__t     ] o.a.s.c.Overseer Overseer (id=72187929365774340-127.0.0.1:32797__t-n_0000000000) starting
   [junit4]   2> 677191 INFO  (OverseerStateUpdate-72187929365774340-127.0.0.1:32797__t-n_0000000000) [n:127.0.0.1:32797__t     ] o.a.s.c.Overseer Starting to work on the main queue : 127.0.0.1:32797__t
   [junit4]   2> 677193 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [n:127.0.0.1:32797__t     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:32797__t
   [junit4]   2> 677198 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [n:127.0.0.1:32797__t     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 677198 WARN  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [n:127.0.0.1:32797__t     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication:disabled authorization:disabled.  Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info
   [junit4]   2> 677205 INFO  (zkCallback-8123-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 677211 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [n:127.0.0.1:32797__t     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 677230 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [n:127.0.0.1:32797__t     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@2600d2f8
   [junit4]   2> 677238 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [n:127.0.0.1:32797__t     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@2600d2f8
   [junit4]   2> 677238 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [n:127.0.0.1:32797__t     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@2600d2f8
   [junit4]   2> 677240 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [n:127.0.0.1:32797__t     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_9DE6106AF9834F4E-001/control-001/cores
   [junit4]   2> 677246 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 677247 INFO  (zkConnectionManagerCallback-8141-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 677247 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 677247 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 677248 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:35563/solr ready
   [junit4]   2> 677249 INFO  (qtp1389335085-15048) [n:127.0.0.1:32797__t     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params collection.configName=conf1&name=control_collection&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:32797__t&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 677251 INFO  (OverseerThreadFactory-8131-thread-1-processing-n:127.0.0.1:32797__t) [n:127.0.0.1:32797__t     ] o.a.s.c.a.c.CreateCollectionCmd Create collection control_collection
   [junit4]   2> 677356 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 677357 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 677360 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t    x:control_collection_shard1_replica_n1 ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&collection=control_collection&shard=shard1&wt=javabin&version=2&replicaType=NRT
   [junit4]   2> 677360 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t    x:control_collection_shard1_replica_n1 ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 678368 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 8.6.0
   [junit4]   2> 678381 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.IndexSchema Schema name=test
   [junit4]   2> 678470 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
   [junit4]   2> 678479 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.CoreContainer Creating SolrCore 'control_collection_shard1_replica_n1' using configuration from configset conf1, trusted=true
   [junit4]   2> 678480 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.control_collection.shard1.replica_n1' (registry 'solr.core.control_collection.shard1.replica_n1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@2600d2f8
   [junit4]   2> 678482 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory solr.hdfs.home=hdfs://localhost.localdomain:46161/solr_hdfs_home
   [junit4]   2> 678482 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Solr Kerberos Authentication disabled
   [junit4]   2> 678482 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrCore [[control_collection_shard1_replica_n1] ] Opening new SolrCore at [/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_9DE6106AF9834F4E-001/control-001/cores/control_collection_shard1_replica_n1], dataDir=[hdfs://localhost.localdomain:46161/solr_hdfs_home/control_collection/core_node2/data/]
   [junit4]   2> 678483 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:46161/solr_hdfs_home/control_collection/core_node2/data/snapshot_metadata
   [junit4]   2> 678492 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 678492 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [33554432] will allocate [1] slabs and use ~[33554432] bytes
   [junit4]   2> 678492 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Creating new global HDFS BlockCache
   [junit4]   2> 678530 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 678534 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:46161/solr_hdfs_home/control_collection/core_node2/data
   [junit4]   2> 678555 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:46161/solr_hdfs_home/control_collection/core_node2/data/index
   [junit4]   2> 678560 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 678560 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [33554432] will allocate [1] slabs and use ~[33554432] bytes
   [junit4]   2> 678567 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 678568 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=18, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=false, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.7290290228651891]
   [junit4]   2> 679070 WARN  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 679127 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.HdfsUpdateLog
   [junit4]   2> 679127 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 679127 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.HdfsUpdateLog Initializing HdfsUpdateLog: tlogDfsReplication=2
   [junit4]   2> 679141 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 679141 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 679144 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogDocMergePolicy: [LogDocMergePolicy: minMergeSize=1000, mergeFactor=27, maxMergeSize=9223372036854775807, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=true, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.0]
   [junit4]   2> 679208 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.SolrIndexSearcher Opening [Searcher@7094d199[control_collection_shard1_replica_n1] main]
   [junit4]   2> 679215 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 679217 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 679219 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms
   [junit4]   2> 679235 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1665404956145876992
   [junit4]   2> 679240 INFO  (searcherExecutor-8143-thread-1-processing-n:127.0.0.1:32797__t x:control_collection_shard1_replica_n1 c:control_collection s:shard1) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrCore [control_collection_shard1_replica_n1] Registered new searcher Searcher@7094d199[control_collection_shard1_replica_n1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 679242 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ZkShardTerms Successful update of terms at /collections/control_collection/terms/shard1 to Terms{values={core_node2=0}, version=0}
   [junit4]   2> 679242 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/control_collection/leaders/shard1
   [junit4]   2> 679244 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 679244 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 679244 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:32797/_t/control_collection_shard1_replica_n1/
   [junit4]   2> 679244 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 679244 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy http://127.0.0.1:32797/_t/control_collection_shard1_replica_n1/ has no replicas
   [junit4]   2> 679244 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/control_collection/leaders/shard1/leader after winning as /collections/control_collection/leader_elect/shard1/election/72187929365774340-core_node2-n_0000000000
   [junit4]   2> 679246 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:32797/_t/control_collection_shard1_replica_n1/ shard1
   [junit4]   2> 679347 INFO  (zkCallback-8123-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 679347 INFO  (zkCallback-8123-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 679347 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 679349 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf1&newCollection=true&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&collection=control_collection&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1989
   [junit4]   2> 679351 INFO  (qtp1389335085-15048) [n:127.0.0.1:32797__t     ] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas
   [junit4]   2> 679449 INFO  (zkCallback-8123-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 679449 INFO  (zkCallback-8123-thread-3) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 679449 INFO  (zkCallback-8123-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 679450 INFO  (qtp1389335085-15048) [n:127.0.0.1:32797__t     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=control_collection&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:32797__t&wt=javabin&version=2} status=0 QTime=2201
   [junit4]   2> 679450 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Waiting to see 1 active replicas in collection: control_collection
   [junit4]   2> 679556 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 679557 INFO  (zkConnectionManagerCallback-8152-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 679557 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 679557 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 679558 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:35563/solr ready
   [junit4]   2> 679558 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.ChaosMonkey monkey: init - expire sessions:false cause connection loss:false
   [junit4]   2> 679559 INFO  (qtp1389335085-15048) [n:127.0.0.1:32797__t     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params collection.configName=conf1&name=collection1&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=&stateFormat=1&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 679563 INFO  (OverseerThreadFactory-8131-thread-2-processing-n:127.0.0.1:32797__t) [n:127.0.0.1:32797__t     ] o.a.s.c.a.c.CreateCollectionCmd Create collection collection1
   [junit4]   2> 679563 INFO  (OverseerCollectionConfigSetProcessor-72187929365774340-127.0.0.1:32797__t-n_0000000000) [n:127.0.0.1:32797__t     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000000 doesn't exist. Requestor may have disconnected from ZooKeeper
   [junit4]   2> 679765 WARN  (OverseerThreadFactory-8131-thread-2-processing-n:127.0.0.1:32797__t) [n:127.0.0.1:32797__t     ] o.a.s.c.a.c.CreateCollectionCmd It is unusual to create a collection (collection1) without cores.
   [junit4]   2> 679766 INFO  (qtp1389335085-15048) [n:127.0.0.1:32797__t     ] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas
   [junit4]   2> 679766 INFO  (qtp1389335085-15048) [n:127.0.0.1:32797__t     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=collection1&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=&stateFormat=1&wt=javabin&version=2} status=0 QTime=207
   [junit4]   2> 679767 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 679767 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 679767 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 679767 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 679767 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 679767 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 679767 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Creating jetty instances pullReplicaCount=0 numOtherReplicas=1
   [junit4]   2> 679867 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 1 in directory /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_9DE6106AF9834F4E-001/shard-1-001 of type NRT for shard1
   [junit4]   2> 679869 WARN  (closeThreadPool-8153-thread-1) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 679869 INFO  (closeThreadPool-8153-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 679869 INFO  (closeThreadPool-8153-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ...
   [junit4]   2> 679869 INFO  (closeThreadPool-8153-thread-1) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 12.0.2+10
   [junit4]   2> 679870 INFO  (closeThreadPool-8153-thread-1) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 679870 INFO  (closeThreadPool-8153-thread-1) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 679870 INFO  (closeThreadPool-8153-thread-1) [     ] o.e.j.s.session node0 Scavenging every 600000ms
   [junit4]   2> 679871 INFO  (closeThreadPool-8153-thread-1) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@288343c2{/_t,null,AVAILABLE}
   [junit4]   2> 679872 INFO  (closeThreadPool-8153-thread-1) [     ] o.e.j.s.AbstractConnector Started ServerConnector@71fc6f74{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:43233}
   [junit4]   2> 679872 INFO  (closeThreadPool-8153-thread-1) [     ] o.e.j.s.Server Started @679883ms
   [junit4]   2> 679872 INFO  (closeThreadPool-8153-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/_t, solrconfig=solrconfig.xml, solr.data.dir=hdfs://localhost.localdomain:46161/hdfs__localhost.localdomain_46161__home_jenkins_workspace_Lucene-Solr-8.x-Linux_solr_build_solr-core_test_J3_temp_solr.index.hdfs.CheckHdfsIndexTest_9DE6106AF9834F4E-001_tempDir-002_jetty1, hostPort=43233, coreRootDirectory=/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_9DE6106AF9834F4E-001/shard-1-001/cores}
   [junit4]   2> 679873 ERROR (closeThreadPool-8153-thread-1) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 679873 INFO  (closeThreadPool-8153-thread-1) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 679873 INFO  (closeThreadPool-8153-thread-1) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 8.6.0
   [junit4]   2> 679873 INFO  (closeThreadPool-8153-thread-1) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 679873 INFO  (closeThreadPool-8153-thread-1) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr
   [junit4]   2> 679873 INFO  (closeThreadPool-8153-thread-1) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-04-30T13:38:53.730246Z
   [junit4]   2> 679874 INFO  (closeThreadPool-8153-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 679875 INFO  (zkConnectionManagerCallback-8155-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 679875 INFO  (closeThreadPool-8153-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 679976 INFO  (closeThreadPool-8153-thread-1) [     ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 679976 INFO  (closeThreadPool-8153-thread-1) [     ] o.a.s.c.SolrXmlConfig Loading container configuration from /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_9DE6106AF9834F4E-001/shard-1-001/solr.xml
   [junit4]   2> 679979 INFO  (closeThreadPool-8153-thread-1) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 679979 INFO  (closeThreadPool-8153-thread-1) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 679980 INFO  (closeThreadPool-8153-thread-1) [     ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@2600d2f8, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 680034 INFO  (closeThreadPool-8153-thread-1) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=false]
   [junit4]   2> 680035 WARN  (closeThreadPool-8153-thread-1) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@6e4e20f3[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 680035 WARN  (closeThreadPool-8153-thread-1) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@6e4e20f3[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 680037 WARN  (closeThreadPool-8153-thread-1) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@ca8dd0e[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 680037 WARN  (closeThreadPool-8153-thread-1) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@ca8dd0e[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 680038 INFO  (closeThreadPool-8153-thread-1) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:35563/solr
   [junit4]   2> 680039 INFO  (closeThreadPool-8153-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 680040 INFO  (zkConnectionManagerCallback-8166-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 680040 INFO  (closeThreadPool-8153-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 680144 INFO  (closeThreadPool-8153-thread-1) [n:127.0.0.1:43233__t     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 680146 INFO  (zkConnectionManagerCallback-8168-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 680147 INFO  (closeThreadPool-8153-thread-1) [n:127.0.0.1:43233__t     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 680150 INFO  (closeThreadPool-8153-thread-1) [n:127.0.0.1:43233__t     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 680158 INFO  (closeThreadPool-8153-thread-1) [n:127.0.0.1:43233__t     ] o.a.s.c.ZkController Publish node=127.0.0.1:43233__t as DOWN
   [junit4]   2> 680159 INFO  (closeThreadPool-8153-thread-1) [n:127.0.0.1:43233__t     ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 680159 INFO  (closeThreadPool-8153-thread-1) [n:127.0.0.1:43233__t     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:43233__t
   [junit4]   2> 680160 INFO  (zkCallback-8151-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 680163 INFO  (zkCallback-8123-thread-2) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 680163 INFO  (zkCallback-8167-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 680164 INFO  (closeThreadPool-8153-thread-1) [n:127.0.0.1:43233__t     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 680165 WARN  (closeThreadPool-8153-thread-1) [n:127.0.0.1:43233__t     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info. Solr is only as secure as you make it.
   [junit4]   2> 680187 INFO  (closeThreadPool-8153-thread-1) [n:127.0.0.1:43233__t     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 680203 INFO  (closeThreadPool-8153-thread-1) [n:127.0.0.1:43233__t     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@2600d2f8
   [junit4]   2> 680211 INFO  (closeThreadPool-8153-thread-1) [n:127.0.0.1:43233__t     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@2600d2f8
   [junit4]   2> 680211 INFO  (closeThreadPool-8153-thread-1) [n:127.0.0.1:43233__t     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@2600d2f8
   [junit4]   2> 680212 INFO  (closeThreadPool-8153-thread-1) [n:127.0.0.1:43233__t     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_9DE6106AF9834F4E-001/shard-1-001/cores
   [junit4]   2> 680222 INFO  (closeThreadPool-8153-thread-1) [     ] o.a.s.c.AbstractFullDistribZkTestBase waitForLiveNode: 127.0.0.1:43233__t
   [junit4]   2> 680231 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :addreplica with params node=127.0.0.1:43233__t&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 680235 INFO  (OverseerCollectionConfigSetProcessor-72187929365774340-127.0.0.1:32797__t-n_0000000000) [n:127.0.0.1:32797__t     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000002 doesn't exist. Requestor may have disconnected from ZooKeeper
   [junit4]   2> 680242 INFO  (qtp1389335085-15048) [n:127.0.0.1:32797__t     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={wt=javabin&version=2&key=solr.core.control_collection.shard1.replica_n1:INDEX.sizeInBytes} status=0 QTime=4
   [junit4]   2> 680244 INFO  (qtp1389335085-15048) [n:127.0.0.1:32797__t     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 680249 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 680254 INFO  (qtp1389335085-15048) [n:127.0.0.1:32797__t     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={wt=javabin&version=2&key=solr.core.control_collection.shard1.replica_n1:INDEX.sizeInBytes} status=0 QTime=3
   [junit4]   2> 680261 INFO  (qtp1389335085-15048) [n:127.0.0.1:32797__t     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=1
   [junit4]   2> 680263 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 680263 INFO  (OverseerThreadFactory-8131-thread-3-processing-n:127.0.0.1:32797__t) [n:127.0.0.1:32797__t c:collection1 s:shard1   ] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:43233__t for creating new replica of shard shard1 for collection collection1
   [junit4]   2> 680269 INFO  (OverseerThreadFactory-8131-thread-3-processing-n:127.0.0.1:32797__t) [n:127.0.0.1:32797__t c:collection1 s:shard1   ] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command.
   [junit4]   2> 680284 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t    x:collection1_shard1_replica_n1 ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&name=collection1_shard1_replica_n1&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT
   [junit4]   2> 681291 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 8.6.0
   [junit4]   2> 681320 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.IndexSchema Schema name=test
   [junit4]   2> 681425 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
   [junit4]   2> 681437 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.CoreContainer Creating SolrCore 'collection1_shard1_replica_n1' using configuration from configset conf1, trusted=true
   [junit4]   2> 681438 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.collection1.shard1.replica_n1' (registry 'solr.core.collection1.shard1.replica_n1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@2600d2f8
   [junit4]   2> 681438 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory solr.hdfs.home=hdfs://localhost.localdomain:46161/solr_hdfs_home
   [junit4]   2> 681438 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Solr Kerberos Authentication disabled
   [junit4]   2> 681438 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrCore [[collection1_shard1_replica_n1] ] Opening new SolrCore at [/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_9DE6106AF9834F4E-001/shard-1-001/cores/collection1_shard1_replica_n1], dataDir=[hdfs://localhost.localdomain:46161/solr_hdfs_home/collection1/core_node2/data/]
   [junit4]   2> 681439 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:46161/solr_hdfs_home/collection1/core_node2/data/snapshot_metadata
   [junit4]   2> 681444 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 681444 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [33554432] will allocate [1] slabs and use ~[33554432] bytes
   [junit4]   2> 681447 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 681448 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:46161/solr_hdfs_home/collection1/core_node2/data
   [junit4]   2> 681464 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:46161/solr_hdfs_home/collection1/core_node2/data/index
   [junit4]   2> 681469 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 681469 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [33554432] will allocate [1] slabs and use ~[33554432] bytes
   [junit4]   2> 681473 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 681473 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=18, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=false, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.7290290228651891]
   [junit4]   2> 681515 WARN  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 681559 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.HdfsUpdateLog
   [junit4]   2> 681559 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 681559 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.HdfsUpdateLog Initializing HdfsUpdateLog: tlogDfsReplication=2
   [junit4]   2> 681576 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 681576 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 681578 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogDocMergePolicy: [LogDocMergePolicy: minMergeSize=1000, mergeFactor=27, maxMergeSize=9223372036854775807, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=true, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.0]
   [junit4]   2> 681588 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.SolrIndexSearcher Opening [Searcher@681e9b11[collection1_shard1_replica_n1] main]
   [junit4]   2> 681589 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 681589 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 681590 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms
   [junit4]   2> 681590 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1665404958615273472
   [junit4]   2> 681592 INFO  (searcherExecutor-8179-thread-1-processing-n:127.0.0.1:43233__t x:collection1_shard1_replica_n1 c:collection1 s:shard1) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrCore [collection1_shard1_replica_n1] Registered new searcher Searcher@681e9b11[collection1_shard1_replica_n1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 681593 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection1/terms/shard1 to Terms{values={core_node2=0}, version=0}
   [junit4]   2> 681593 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/collection1/leaders/shard1
   [junit4]   2> 681594 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 681594 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 681594 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:43233/_t/collection1_shard1_replica_n1/
   [junit4]   2> 681594 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 681594 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SyncStrategy http://127.0.0.1:43233/_t/collection1_shard1_replica_n1/ has no replicas
   [junit4]   2> 681594 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/collection1/leaders/shard1/leader after winning as /collections/collection1/leader_elect/shard1/election/72187929365774345-core_node2-n_0000000000
   [junit4]   2> 681595 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:43233/_t/collection1_shard1_replica_n1/ shard1
   [junit4]   2> 681696 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 681698 INFO  (qtp1975772436-15112) [n:127.0.0.1:43233__t     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf1&name=collection1_shard1_replica_n1&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1414
   [junit4]   2> 681699 INFO  (qtp1389335085-15051) [n:127.0.0.1:32797__t c:collection1    ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:43233__t&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2} status=0 QTime=1468
   [junit4]   2> 681700 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Waiting to see 1 active replicas in collection: collection1
   [junit4]   2> 681798 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.SolrTestCaseJ4 ###Starting testDeletedDocs
   [junit4]   2> 682173 WARN  (DataStreamer for file /solr/_0_BlockTreeOrds_0.doc) [     ] o.a.h.h.DataStreamer Caught exception
   [junit4]   2>           => java.lang.InterruptedException
   [junit4]   2> 	at java.base/java.lang.Object.wait(Native Method)
   [junit4]   2> java.lang.InterruptedException: null
   [junit4]   2> 	at java.lang.Object.wait(Native Method) ~[?:?]
   [junit4]   2> 	at java.lang.Thread.join(Thread.java:1308) ~[?:?]
   [junit4]   2> 	at java.lang.Thread.join(Thread.java:1375) [?:?]
   [junit4]   2> 	at org.apache.hadoop.hdfs.DataStreamer.closeResponder(DataStreamer.java:986) ~[hadoop-hdfs-client-3.2.0.jar:?]
   [junit4]   2> 	at org.apache.hadoop.hdfs.DataStreamer.endBlock(DataStreamer.java:640) ~[hadoop-hdfs-client-3.2.0.jar:?]
   [junit4]   2> 	at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:810) ~[hadoop-hdfs-client-3.2.0.jar:?]
   [junit4]   2> 682239 INFO  (OverseerCollectionConfigSetProcessor-72187929365774340-127.0.0.1:32797__t-n_0000000000) [n:127.0.0.1:32797__t     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000004 doesn't exist. Requestor may have disconnected from ZooKeeper
   [junit4]   2> 692716 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.s.h.HdfsDirectory Closing hdfs directory hdfs://localhost.localdomain:46161/solr
   [junit4]   2> 692717 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[9DE6106AF9834F4E]) [     ] o.a.s.SolrTestCaseJ4 ###Ending testDeletedDocs
   [junit4]   2> 692821 INFO  (closeThreadPool-8186-thread-2) [     ] o.a.s.c.CoreContainer Shutting down CoreContainer instance=1486934822
   [junit4]   2> 692821 INFO  (closeThreadPool-8186-thread-2) [     ] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:32797__t
   [junit4]   2> 692821 INFO  (closeThreadPool-8186-thread-2) [     ] o.a.s.c.ZkController Publish this node as DOWN...
   [junit4]   2> 692821 INFO  (closeThreadPool-8186-thread-2) [     ] o.a.s.c.ZkController Publish node=127.0.0.1:32797__t as DOWN
   [junit4]   2> 692822 INFO  (closeThreadPool-8186-thread-1) [     ] o.a.s.c.CoreContainer Shutting down CoreContainer instance=589154035
   [junit4]   2> 692822 INFO  (closeThreadPool-8186-thread-1) [     ] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:43233__t
   [junit4]   2> 692823 INFO  (closeThreadPool-8186-thread-1) [     ] o.a.s.c.ZkController Publish this node as DOWN...
   [junit4]   2> 692823 INFO  (closeThreadPool-8186-thread-1) [     ] o.a.s.c.ZkController Publish node=127.0.0.1:43233__t as DOWN
   [junit4]   2> 692823 INFO  (zkCallback-8123-thread-3) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 692823 INFO  (zkCallback-8123-thread-4) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 692823 INFO  (zkCallback-8123-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 692827 INFO  (coreCloseExecutor-8192-thread-1) [n:127.0.0.1:32797__t     ] o.a.s.c.SolrCore [control_collection_shard1_replica_n1]  CLOSING SolrCore org.apache.solr.core.SolrCore@4fac6f62
   [junit4]   2> 692827 INFO  (coreCloseExecutor-8192-thread-1) [n:127.0.0.1:32797__t     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.control_collection.shard1.replica_n1 tag=SolrCore@4fac6f62
   [junit4]   2> 692827 INFO  (coreCloseExecutor-8192-thread-1) [n:127.0.0.1:32797__t     ] o.a.s.m.r.SolrJmxReporter Closing reporter [org.apache.solr.metrics.reporters.SolrJmxReporter@3f8284f: rootName = null, domain = solr.core.control_collection.shard1.replica_n1, service url = null, agent id = null] for registry solr.core.control_collection.shard1.replica_n1/com.codahale.metrics.MetricRegistry@390d375b
   [junit4]   2> 692828 INFO  (coreCloseExecutor-8194-thread-1) [n:127.0.0.1:43233__t     ] o.a.s.c.SolrCore [collection1_shard1_replica_n1]  CLOSING SolrCore org.apache.solr.core.SolrCore@275e8cc5
   [junit4]   2> 692828 INFO  (coreCloseExecutor-8194-thread-1) [n:127.0.0.1:43233__t     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.collection1.shard1.replica_n1 tag=SolrCore@275e8cc5
   [junit4]   2> 692828 INFO  (coreCloseExecutor-8194-thread-1) [n:127.0.0.1:43233__t     ] o.a.s.m.r.SolrJmxReporter Closing reporter [org.apache.solr.metrics.reporters.SolrJmxReporter@61f0e7df: rootName = null, domain = solr.core.collection1.shard1.replica_n1, service url = null, agent id = null] for registry solr.core.collection1.shard1.replica_n1/com.codahale.metrics.MetricRegistry@428e033b
   [junit4]   2> 692843 INFO  (coreCloseExecutor-8194-thread-1) [n:127.0.0.1:43233__t     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.collection1.shard1.leader tag=SolrCore@275e8cc5
   [junit4]   2> 692843 INFO  (coreCloseExecutor-8192-thread-1) [n:127.0.0.1:32797__t     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.control_collection.shard1.leader tag=SolrCore@4fac6f62
   [junit4]   2> 692848 INFO  (coreCloseExecutor-8194-thread-1) [n:127.0.0.1:43233__t     ] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close()  ... SKIPPED (unnecessary).
   [junit4]   2> 692849 INFO  (coreCloseExecutor-8192-thread-1) [n:127.0.0.1:32797__t     ] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close()  ... SKIPPED (unnecessary).
   [junit4]   2> 692854 INFO  (coreCloseExecutor-8192-thread-1) [n:127.0.0.1:32797__t     ] o.a.s.s.h.HdfsDirectory Closing hdfs directory hdfs://localhost.localdomain:46161/solr_hdfs_home/control_collection/core_node2/data/snapshot_metadata
   [junit4]   2> 692861 INFO  (coreCloseExecutor-8192-thread-1) [n:127.0.0.1:32797__t     ] o.a.s.s.h.HdfsDirectory Closing hdfs directory hdfs://localhost.localdomain:46161/solr_hdfs_home/control_collection/core_node2/data/index
   [junit4]   2> 692861 INFO  (coreCloseExecutor-8194-thread-1) [n:127.0.0.1:43233__t     ] o.a.s.s.h.HdfsDirectory Closing hdfs directory hdfs://localhost.localdomain:46161/solr_hdfs_home/collection1/core_node2/data/index
   [junit4]   2> 692863 INFO  (coreCloseExecutor-8192

[...truncated too long message...]

{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/datanode}
   [junit4]   2> 712204 INFO  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.e.j.s.AbstractConnector Stopped ServerConnector@1c200a5b{HTTP/1.1, (http/1.1)}{localhost:0}
   [junit4]   2> 712204 INFO  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.e.j.s.session node0 Stopped scavenging
   [junit4]   2> 712205 INFO  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@1ac38bef{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,UNAVAILABLE}
   [junit4]   2> 712206 WARN  (BP-1390888650-127.0.0.1-1588253928542 heartbeating to localhost.localdomain/127.0.0.1:46161) [     ] o.a.h.h.s.d.IncrementalBlockReportManager IncrementalBlockReportManager interrupted
   [junit4]   2> 712206 WARN  (BP-1390888650-127.0.0.1-1588253928542 heartbeating to localhost.localdomain/127.0.0.1:46161) [     ] o.a.h.h.s.d.DataNode Ending block pool service for: Block pool BP-1390888650-127.0.0.1-1588253928542 (Datanode Uuid 5ecb6f04-0341-419b-b3e2-17fea2d20036) service to localhost.localdomain/127.0.0.1:46161
   [junit4]   2> 712224 INFO  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.w.WebAppContext@115a46e6{hdfs,/,null,UNAVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/hdfs}
   [junit4]   2> 712225 INFO  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.e.j.s.AbstractConnector Stopped ServerConnector@29a71165{HTTP/1.1, (http/1.1)}{localhost.localdomain:0}
   [junit4]   2> 712225 INFO  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.e.j.s.session node0 Stopped scavenging
   [junit4]   2> 712225 INFO  (SUITE-CheckHdfsIndexTest-seed#[9DE6106AF9834F4E]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@42d5e6bf{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,UNAVAILABLE}
   [junit4]   2> NOTE: leaving temporary files on disk at: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_9DE6106AF9834F4E-001
   [junit4]   2> Apr 30, 2020 1:39:26 PM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks
   [junit4]   2> WARNING: Will linger awaiting termination of 129 leaked thread(s).
   [junit4]   2> NOTE: test params are: codec=Asserting(Lucene84): {date=BlockTreeOrds(blocksize=128), range_facet_l_dv=Lucene84, n_l1=BlockTreeOrds(blocksize=128), multiDefault=FST50, intDefault=BlockTreeOrds(blocksize=128), titleTokenized=PostingsFormat(name=LuceneVarGapFixedInterval), n_td1=FST50, body=BlockTreeOrds(blocksize=128), title=FST50, n_d1=BlockTreeOrds(blocksize=128), n_f1=FST50, range_facet_l=BlockTreeOrds(blocksize=128), n_tl1=FST50, n_tf1=BlockTreeOrds(blocksize=128), id=Lucene84, timestamp=BlockTreeOrds(blocksize=128), docid=Lucene84, _root_=BlockTreeOrds(blocksize=128), n_dt1=FST50, n_ti1=Lucene84, rnd_b=Lucene84, field=BlockTreeOrds(blocksize=128), n_tdt1=BlockTreeOrds(blocksize=128), id_i1=FST50, range_facet_i_dv=BlockTreeOrds(blocksize=128)}, docValues:{range_facet_l_dv=DocValuesFormat(name=Asserting), n_l1=DocValuesFormat(name=Lucene80), intDefault=DocValuesFormat(name=Lucene80), n_dt1=DocValuesFormat(name=Lucene80), n_td1=DocValuesFormat(name=Lucene80), n_d1=DocValuesFormat(name=Lucene80), range_facet_l=DocValuesFormat(name=Lucene80), n_f1=DocValuesFormat(name=Lucene80), n_ti1=DocValuesFormat(name=Asserting), docid_intDV=DocValuesFormat(name=Direct), n_tl1=DocValuesFormat(name=Lucene80), _version_=DocValuesFormat(name=Lucene80), n_tf1=DocValuesFormat(name=Lucene80), n_tdt1=DocValuesFormat(name=Lucene80), id_i1=DocValuesFormat(name=Lucene80), range_facet_i_dv=DocValuesFormat(name=Lucene80), intDvoDefault=DocValuesFormat(name=Direct), titleDV=DocValuesFormat(name=Lucene80), timestamp=DocValuesFormat(name=Lucene80)}, maxPointsInLeafNode=336, maxMBSortInHeap=6.735296865723733, sim=Asserting(org.apache.lucene.search.similarities.AssertingSimilarity@11c3a25c), locale=mua-CM, timezone=Africa/Bangui
   [junit4]   2> NOTE: Linux 5.3.0-46-generic amd64/AdoptOpenJDK 12.0.2 (64-bit)/cpus=16,threads=18,free=288629544,total=518979584
   [junit4]   2> NOTE: All tests run in this JVM: [TestManagedSchemaThreadSafety, TestAddFieldRealTimeGet, TestStandardQParsers, SampleTest, TestRemoteStreaming, TestScoreJoinQPNoScore, TestRetrieveFieldsOptimizer, HdfsUnloadDistributedZkTest, TestDocTermOrds, TestStressLiveNodes, TestReloadAndDeleteDocs, TestMaxTokenLenTokenizer, JSONWriterTest, TestDistributedStatsComponentCardinality, SparseHLLTest, AddBlockUpdateTest, TestPseudoReturnFields, SplitShardTest, DistribJoinFromCollectionTest, LeaderElectionTest, RecoveryZkTest, TestCopyFieldCollectionResource, MetricsHistoryIntegrationTest, AdminHandlersProxyTest, ChaosMonkeySafeLeaderTest, TestGraphMLResponseWriter, ShardRoutingTest, TestSimScenario, TestCloudSearcherWarming, AssignBackwardCompatibilityTest, BitVectorTest, RequestHandlersTest, TestCustomSort, ZkSolrClientTest, TestHashPartitioner, TestFieldCacheVsDocValues, TestWaitForStateWithJettyShutdowns, TestNumericRangeQuery32, SolrJmxReporterTest, MetricTriggerIntegrationTest, TestSimExtremeIndexing, TriggerSetPropertiesIntegrationTest, CategoryRoutedAliasUpdateProcessorTest, ScriptEngineTest, TestSolrIndexConfig, ConfigSetsAPITest, CheckHdfsIndexTest]
   [junit4] Completed [470/907 (1!)] on J3 in 41.08s, 5 tests, 1 error, 1 skipped <<< FAILURES!

[...truncated 39573 lines...]
-ecj-javadoc-lint-src:
    [mkdir] Created dir: /tmp/ecj1003107468
 [ecj-lint] Compiling 931 source files to /tmp/ecj1003107468
 [ecj-lint] ----------
 [ecj-lint] 1. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/codecs/CodecUtil.java (at line 523)
 [ecj-lint] 	throw new CorruptIndexException("misplaced codec footer (file truncated?): length=" + in.length() + " but footerLength==" + footerLength(), input);
 [ecj-lint] 	^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 [ecj-lint] Resource leak: 'in' is not closed at this location
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 2. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingStoredFieldsReader.java (at line 166)
 [ecj-lint] 	FieldsIndexReader fieldsIndexReader = new FieldsIndexReader(d, si.name, segmentSuffix, INDEX_EXTENSION_PREFIX, INDEX_CODEC_NAME, si.getId());
 [ecj-lint] 	                  ^^^^^^^^^^^^^^^^^
 [ecj-lint] Resource leak: 'fieldsIndexReader' is never closed
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 3. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingTermVectorsReader.java (at line 148)
 [ecj-lint] 	FieldsIndexReader fieldsIndexReader = new FieldsIndexReader(d, si.name, segmentSuffix, VECTORS_INDEX_EXTENSION_PREFIX, VECTORS_INDEX_CODEC_NAME, si.getId());
 [ecj-lint] 	                  ^^^^^^^^^^^^^^^^^
 [ecj-lint] Resource leak: 'fieldsIndexReader' is never closed
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 4. ERROR in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/search/IndexSearcher.java (at line 50)
 [ecj-lint] 	import org.apache.lucene.util.automaton.ByteRunAutomaton;
 [ecj-lint] 	       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 [ecj-lint] The import org.apache.lucene.util.automaton.ByteRunAutomaton is never used
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 5. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/util/automaton/Operations.java (at line 742)
 [ecj-lint] 	Integer q = newstate.get(statesSet);
 [ecj-lint] 	                         ^^^^^^^^^
 [ecj-lint] Unlikely argument type SortedIntSet for get(Object) on a Map<SortedIntSet.FrozenIntSet,Integer>
 [ecj-lint] ----------
 [ecj-lint] 5 problems (1 error, 4 warnings)
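The recurring "Resource leak: '…' is not closed" warnings above all follow the same ecj pattern: a Closeable is constructed, and at least one path (typically a thrown exception) escapes the method without closing it. As a minimal, hypothetical sketch (the class and method names below are illustrative stand-ins, not the actual Lucene code), the pattern and the usual try-with-resources fix look like this:

```java
import java.io.Closeable;

// Hypothetical Closeable standing in for IndexInput/FieldsIndexReader.
class Resource implements Closeable {
    int length() { return 42; }
    @Override public void close() { /* release the underlying handle */ }
}

public class ResourceLeakDemo {
    // The pattern ecj flags: when 'fail' is true, 'r' escapes via the
    // exception without being closed ("Resource leak: 'r' is not closed").
    static int leaky(boolean fail) {
        Resource r = new Resource();
        if (fail) {
            throw new IllegalStateException("misplaced codec footer (file truncated?)");
        }
        int len = r.length();
        r.close();
        return len;
    }

    // try-with-resources closes 'r' on every exit path, including the throw,
    // which silences the warning without changing behavior on the happy path.
    static int safe(boolean fail) {
        try (Resource r = new Resource()) {
            if (fail) {
                throw new IllegalStateException("misplaced codec footer (file truncated?)");
            }
            return r.length();
        }
    }
}
```

Note that only problem 4 (the unused ByteRunAutomaton import in IndexSearcher) is an ERROR and is what actually fails the ecj-javadoc-lint target; the resource-leak and "Unlikely argument type" entries are warnings.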

BUILD FAILED
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/build.xml:634: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/build.xml:101: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build.xml:201: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/common-build.xml:2127: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/common-build.xml:2166: Compile failed; see the compiler error output for details.

Total time: 37 minutes 53 seconds
Build step 'Invoke Ant' marked build as failure
Archiving artifacts
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
[WARNINGS] Skipping publisher since build result is FAILURE
Recording test results
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2

[JENKINS] Lucene-Solr-8.x-Linux (64bit/jdk-11.0.6) - Build # 2920 - Failure!

Posted by Policeman Jenkins Server <je...@thetaphi.de>.
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-8.x-Linux/2920/
Java: 64bit/jdk-11.0.6 -XX:+UseCompressedOops -XX:+UseG1GC

2 tests failed.
FAILED:  org.apache.lucene.index.TestIndexWriterDelete.testDeleteAllNoDeadLock

Error Message:


Stack Trace:
java.lang.AssertionError
	at __randomizedtesting.SeedInfo.seed([8C0EBC8A112C5540:BF17539491EFEF16]:0)
	at org.apache.lucene.index.IndexWriter.abortMerges(IndexWriter.java:2514)
	at org.apache.lucene.index.IndexWriter.deleteAll(IndexWriter.java:2441)
	at org.apache.lucene.index.RandomIndexWriter.deleteAll(RandomIndexWriter.java:373)
	at org.apache.lucene.index.TestIndexWriterDelete.testDeleteAllNoDeadLock(TestIndexWriterDelete.java:348)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988)
	at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
	at java.base/java.lang.Thread.run(Thread.java:834)


FAILED:  org.apache.solr.index.hdfs.CheckHdfsIndexTest.doTest

Error Message:
Error from server at https://127.0.0.1:44111/collection1: java.lang.NullPointerException  at org.apache.solr.handler.admin.SystemInfoHandler.getSecurityInfo(SystemInfoHandler.java:326)  at org.apache.solr.handler.admin.SystemInfoHandler.handleRequestBody(SystemInfoHandler.java:146)  at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:211)  at org.apache.solr.core.SolrCore.execute(SolrCore.java:2600)  at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:803)  at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:582)  at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:432)  at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:362)  at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)  at org.apache.solr.client.solrj.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:166)  at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)  at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:545)  at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)  at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1610)  at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)  at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1300)  at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)  at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485)  at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1580)  at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)  at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1215)  at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)  at 
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)  at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:322)  at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:767)  at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)  at org.eclipse.jetty.server.Server.handle(Server.java:500)  at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383)  at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:547)  at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375)  at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:273)  at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)  at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)  at org.eclipse.jetty.io.ssl.SslConnection$1.run(SslConnection.java:146)  at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806)  at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938)  at java.base/java.lang.Thread.run(Thread.java:834) 

Stack Trace:
org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at https://127.0.0.1:44111/collection1: java.lang.NullPointerException
	at org.apache.solr.handler.admin.SystemInfoHandler.getSecurityInfo(SystemInfoHandler.java:326)
	at org.apache.solr.handler.admin.SystemInfoHandler.handleRequestBody(SystemInfoHandler.java:146)
	at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:211)
	at org.apache.solr.core.SolrCore.execute(SolrCore.java:2600)
	at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:803)
	at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:582)
	at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:432)
	at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:362)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)
	at org.apache.solr.client.solrj.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:166)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:545)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1610)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1300)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1580)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1215)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:322)
	at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:767)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:500)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:547)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:273)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
	at org.eclipse.jetty.io.ssl.SslConnection$1.run(SslConnection.java:146)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938)
	at java.base/java.lang.Thread.run(Thread.java:834)

	at __randomizedtesting.SeedInfo.seed([64EADBDE58807702:C3AE637A353B64BB]:0)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:665)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:265)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:248)
	at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:211)
	at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:1003)
	at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:1018)
	at org.apache.solr.index.hdfs.CheckHdfsIndexTest.doTest(CheckHdfsIndexTest.java:120)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988)
	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1081)
	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:1053)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
	at java.base/java.lang.Thread.run(Thread.java:834)
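The server-side NullPointerException above is raised inside SystemInfoHandler.getSecurityInfo. Without the 8.x source at hand the exact null field is a guess, but the failure mode is the common one of dereferencing an optional value (for example an authentication principal on an unauthenticated request) without a guard. A hypothetical sketch of the defensive pattern (all names below are illustrative, not Solr's actual API):

```java
import java.util.Optional;

public class NullGuardDemo {
    // Hypothetical stand-in for a request whose user principal may be absent.
    static class Request {
        private final String principalName; // may be null on unauthenticated requests
        Request(String principalName) { this.principalName = principalName; }
        String getPrincipalName() { return principalName; }
    }

    // Unsafe: throws NullPointerException when no principal is attached.
    static int unsafeLength(Request req) {
        return req.getPrincipalName().length();
    }

    // Guarded: substitute a placeholder instead of dereferencing null.
    static String safeName(Request req) {
        return Optional.ofNullable(req.getPrincipalName()).orElse("(anonymous)");
    }
}
```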




Build Log:
[...truncated 1506 lines...]
   [junit4] Suite: org.apache.lucene.index.TestIndexWriterDelete
   [junit4] IGNOR/A 0.00s J5 | TestIndexWriterDelete.testUpdatesOnDiskFull
   [junit4]    > Assumption #1: 'nightly' test group is disabled (@Nightly())
   [junit4] IGNOR/A 0.00s J5 | TestIndexWriterDelete.testDeletesOnDiskFull
   [junit4]    > Assumption #1: 'nightly' test group is disabled (@Nightly())
   [junit4]   2> NOTE: reproduce with: ant test  -Dtestcase=TestIndexWriterDelete -Dtests.method=testDeleteAllNoDeadLock -Dtests.seed=8C0EBC8A112C5540 -Dtests.multiplier=3 -Dtests.slow=true -Dtests.locale=os-RU -Dtests.timezone=Africa/Ndjamena -Dtests.asserts=true -Dtests.file.encoding=US-ASCII
   [junit4] FAILURE 0.13s J5 | TestIndexWriterDelete.testDeleteAllNoDeadLock <<<
   [junit4]    > Throwable #1: java.lang.AssertionError
   [junit4]    > 	at __randomizedtesting.SeedInfo.seed([8C0EBC8A112C5540:BF17539491EFEF16]:0)
   [junit4]    > 	at org.apache.lucene.index.IndexWriter.abortMerges(IndexWriter.java:2514)
   [junit4]    > 	at org.apache.lucene.index.IndexWriter.deleteAll(IndexWriter.java:2441)
   [junit4]    > 	at org.apache.lucene.index.RandomIndexWriter.deleteAll(RandomIndexWriter.java:373)
   [junit4]    > 	at org.apache.lucene.index.TestIndexWriterDelete.testDeleteAllNoDeadLock(TestIndexWriterDelete.java:348)
   [junit4]    > 	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
   [junit4]    > 	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
   [junit4]    > 	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
   [junit4]    > 	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
   [junit4]    > 	at java.base/java.lang.Thread.run(Thread.java:834)
   [junit4] IGNOR/A 0.00s J5 | TestIndexWriterDelete.testApplyDeletesOnFlush
   [junit4]    > Assumption #1: 'nightly' test group is disabled (@Nightly())
   [junit4] IGNOR/A 0.00s J5 | TestIndexWriterDelete.testIndexingThenDeleting
   [junit4]    > Assumption #1: 'nightly' test group is disabled (@Nightly())
   [junit4]   2> NOTE: test params are: codec=Asserting(Lucene84): {field=Lucene84, city=BlockTreeOrds(blocksize=128), contents=Lucene84, id=BlockTreeOrds(blocksize=128), value=Lucene84, content=Lucene84}, docValues:{dv=DocValuesFormat(name=Lucene80)}, maxPointsInLeafNode=1578, maxMBSortInHeap=7.949798835052155, sim=Asserting(org.apache.lucene.search.similarities.AssertingSimilarity@1786b16c), locale=os-RU, timezone=Africa/Ndjamena
   [junit4]   2> NOTE: Linux 5.3.0-46-generic amd64/AdoptOpenJDK 11.0.6 (64-bit)/cpus=16,threads=1,free=385208256,total=536870912
   [junit4]   2> NOTE: All tests run in this JVM: [TestLatLonShape, TestSearchForDuplicates, TestDisjunctionMaxQuery, TestPoint2D, TestMergedIterator, TestSegmentReader, TestXYMultiPointShapeQueries, TestLevenshteinAutomata, TestPointValues, TestTermVectorsReader, TestBooleanScorer, TestBKD, TestNGramPhraseQuery, TestHighCompressionMode, TestLucene50TermVectorsFormat, TestCharacterUtils, TestConjunctions, TestRoaringDocIdSet, TestSort, TestDirectory, TestTopDocsCollector, TestDocCount, TestReaderWrapperDVTypeCheck, TestLucene80NormsFormat, TestBoolean2ScorerSupplier, TestNoMergeScheduler, TestIndexingSequenceNumbers, TestFloatRange, TestOneMergeWrappingMergePolicy, TestFlex, TestLatLonMultiPolygonShapeQueries, TestLatLonDocValuesQueries, Test2BDocs, TestSubScorerFreqs, TestNRTReaderWithThreads, TestIndexFileDeleter, TestPackedInts, TestMatchNoDocsQuery, TestAllFilesDetectTruncation, TestTwoPhaseCommitTool, TestMmapDirectory, TestQueryRescorer, TestBytesRefHash, TestFloatRangeFieldQueries, TestDuelingCodecsAtNight, TestCodecs, TestSoftDeletesRetentionMergePolicy, TestCustomNorms, TestDocsAndPositions, TestFieldInvertState, TestIndexManyDocuments, TestIndexTooManyDocs, TestIndexWriterConfig, TestIndexWriterDelete]
   [junit4] Completed [332/564 (1!)] on J5 in 1.76s, 26 tests, 1 failure, 5 skipped <<< FAILURES!

[...truncated 12494 lines...]
   [junit4] Suite: org.apache.solr.index.hdfs.CheckHdfsIndexTest
   [junit4]   2> 82700 INFO  (SUITE-CheckHdfsIndexTest-seed#[64EADBDE58807702]-worker) [     ] o.a.s.SolrTestCase Setting 'solr.default.confdir' system property to test-framework derived value of '/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/server/solr/configsets/_default/conf'
   [junit4]   2> 82701 INFO  (SUITE-CheckHdfsIndexTest-seed#[64EADBDE58807702]-worker) [     ] o.a.s.SolrTestCaseJ4 Created dataDir: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_64EADBDE58807702-001/data-dir-10-001
   [junit4]   2> 82701 WARN  (SUITE-CheckHdfsIndexTest-seed#[64EADBDE58807702]-worker) [     ] o.a.s.SolrTestCaseJ4 startTrackingSearchers: numOpens=14 numCloses=14
   [junit4]   2> 82701 INFO  (SUITE-CheckHdfsIndexTest-seed#[64EADBDE58807702]-worker) [     ] o.a.s.SolrTestCaseJ4 Using PointFields (NUMERIC_POINTS_SYSPROP=true) w/NUMERIC_DOCVALUES_SYSPROP=false
   [junit4]   2> 82701 INFO  (SUITE-CheckHdfsIndexTest-seed#[64EADBDE58807702]-worker) [     ] o.a.s.SolrTestCaseJ4 Randomized ssl (true) and clientAuth (false) via: @org.apache.solr.util.RandomizeSSL(reason="", value=0.0/0.0, ssl=0.0/0.0, clientAuth=0.0/0.0)
   [junit4]   2> 82702 INFO  (SUITE-CheckHdfsIndexTest-seed#[64EADBDE58807702]-worker) [     ] o.a.s.SolrTestCaseJ4 SecureRandom sanity checks: test.solr.allowed.securerandom=null & java.security.egd=file:/dev/./urandom
   [junit4]   2> 82702 INFO  (SUITE-CheckHdfsIndexTest-seed#[64EADBDE58807702]-worker) [     ] o.a.s.BaseDistributedSearchTestCase Setting hostContext system property: /
   [junit4]   1> Formatting using clusterid: testClusterID
   [junit4]   2> 83305 WARN  (SUITE-CheckHdfsIndexTest-seed#[64EADBDE58807702]-worker) [     ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
   [junit4]   2> 83316 INFO  (SUITE-CheckHdfsIndexTest-seed#[64EADBDE58807702]-worker) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 11.0.6+10
   [junit4]   2> 83319 INFO  (SUITE-CheckHdfsIndexTest-seed#[64EADBDE58807702]-worker) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 83319 INFO  (SUITE-CheckHdfsIndexTest-seed#[64EADBDE58807702]-worker) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 83319 INFO  (SUITE-CheckHdfsIndexTest-seed#[64EADBDE58807702]-worker) [     ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 83321 INFO  (SUITE-CheckHdfsIndexTest-seed#[64EADBDE58807702]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@56a1c6f6{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,AVAILABLE}
   [junit4]   2> 83463 INFO  (SUITE-CheckHdfsIndexTest-seed#[64EADBDE58807702]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.w.WebAppContext@31ff1b72{hdfs,/,file:///home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/jetty-localhost_localdomain-33309-hadoop-hdfs-3_2_0-tests_jar-_-any-16660361855838484855.dir/webapp/,AVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/hdfs}
   [junit4]   2> 83464 INFO  (SUITE-CheckHdfsIndexTest-seed#[64EADBDE58807702]-worker) [     ] o.e.j.s.AbstractConnector Started ServerConnector@1c1fc6a3{HTTP/1.1, (http/1.1)}{localhost.localdomain:33309}
   [junit4]   2> 83464 INFO  (SUITE-CheckHdfsIndexTest-seed#[64EADBDE58807702]-worker) [     ] o.e.j.s.Server Started @83495ms
   [junit4]   2> 83764 WARN  (StorageLocationChecker thread 0) [     ] o.a.h.u.NativeCodeLoader Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
   [junit4]   2> 83800 WARN  (SUITE-CheckHdfsIndexTest-seed#[64EADBDE58807702]-worker) [     ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
   [junit4]   2> 83805 INFO  (SUITE-CheckHdfsIndexTest-seed#[64EADBDE58807702]-worker) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 11.0.6+10
   [junit4]   2> 83811 INFO  (SUITE-CheckHdfsIndexTest-seed#[64EADBDE58807702]-worker) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 83811 INFO  (SUITE-CheckHdfsIndexTest-seed#[64EADBDE58807702]-worker) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 83811 INFO  (SUITE-CheckHdfsIndexTest-seed#[64EADBDE58807702]-worker) [     ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 83811 INFO  (SUITE-CheckHdfsIndexTest-seed#[64EADBDE58807702]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@724f0ba1{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,AVAILABLE}
   [junit4]   2> 83907 INFO  (SUITE-CheckHdfsIndexTest-seed#[64EADBDE58807702]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.w.WebAppContext@188c3d3b{datanode,/,file:///home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/jetty-localhost-40623-hadoop-hdfs-3_2_0-tests_jar-_-any-8761691933914317438.dir/webapp/,AVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/datanode}
   [junit4]   2> 83908 INFO  (SUITE-CheckHdfsIndexTest-seed#[64EADBDE58807702]-worker) [     ] o.e.j.s.AbstractConnector Started ServerConnector@66e6e8f0{HTTP/1.1, (http/1.1)}{localhost:40623}
   [junit4]   2> 83908 INFO  (SUITE-CheckHdfsIndexTest-seed#[64EADBDE58807702]-worker) [     ] o.e.j.s.Server Started @83939ms
   [junit4]   2> 84571 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0x56214ff99be15813: Processing first storage report for DS-68215172-8e25-48d6-b425-06ee6f8b83e0 from datanode d0edb8f9-d6da-49bc-a30e-273f547c28f0
   [junit4]   2> 84573 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0x56214ff99be15813: from storage DS-68215172-8e25-48d6-b425-06ee6f8b83e0 node DatanodeRegistration(127.0.0.1:38191, datanodeUuid=d0edb8f9-d6da-49bc-a30e-273f547c28f0, infoPort=35887, infoSecurePort=0, ipcPort=45521, storageInfo=lv=-57;cid=testClusterID;nsid=45195356;c=1588246322449), blocks: 0, hasStaleStorage: true, processing time: 2 msecs, invalidatedBlocks: 0
   [junit4]   2> 84573 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0x56214ff99be15813: Processing first storage report for DS-5730121a-b5bd-4a68-bac7-8bcbc1f819a7 from datanode d0edb8f9-d6da-49bc-a30e-273f547c28f0
   [junit4]   2> 84573 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0x56214ff99be15813: from storage DS-5730121a-b5bd-4a68-bac7-8bcbc1f819a7 node DatanodeRegistration(127.0.0.1:38191, datanodeUuid=d0edb8f9-d6da-49bc-a30e-273f547c28f0, infoPort=35887, infoSecurePort=0, ipcPort=45521, storageInfo=lv=-57;cid=testClusterID;nsid=45195356;c=1588246322449), blocks: 0, hasStaleStorage: false, processing time: 0 msecs, invalidatedBlocks: 0
   [junit4]   2> 84741 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.ZkTestServer STARTING ZK TEST SERVER
   [junit4]   2> 84742 INFO  (ZkTestServer Run Thread) [     ] o.a.s.c.ZkTestServer client port:0.0.0.0/0.0.0.0:0
   [junit4]   2> 84742 INFO  (ZkTestServer Run Thread) [     ] o.a.s.c.ZkTestServer Starting server
   [junit4]   2> 84842 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.ZkTestServer start zk server on port:45083
   [junit4]   2> 84842 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.ZkTestServer waitForServerUp: 127.0.0.1:45083
   [junit4]   2> 84842 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.ZkTestServer parse host and port list: 127.0.0.1:45083
   [junit4]   2> 84842 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.ZkTestServer connecting to 127.0.0.1 45083
   [junit4]   2> 84854 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 84868 INFO  (zkConnectionManagerCallback-933-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 84868 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 84903 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 84927 INFO  (zkConnectionManagerCallback-935-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 84927 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 84955 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/solrconfig-tlog.xml to /configs/conf1/solrconfig.xml
   [junit4]   2> 85003 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/schema.xml to /configs/conf1/schema.xml
   [junit4]   2> 85037 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/solrconfig.snippet.randomindexconfig.xml to /configs/conf1/solrconfig.snippet.randomindexconfig.xml
   [junit4]   2> 85079 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/stopwords.txt to /configs/conf1/stopwords.txt
   [junit4]   2> 85081 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/protwords.txt to /configs/conf1/protwords.txt
   [junit4]   2> 85081 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/currency.xml to /configs/conf1/currency.xml
   [junit4]   2> 85086 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/enumsConfig.xml to /configs/conf1/enumsConfig.xml
   [junit4]   2> 85091 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/open-exchange-rates.json to /configs/conf1/open-exchange-rates.json
   [junit4]   2> 85093 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/mapping-ISOLatin1Accent.txt to /configs/conf1/mapping-ISOLatin1Accent.txt
   [junit4]   2> 85095 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/old_synonyms.txt to /configs/conf1/old_synonyms.txt
   [junit4]   2> 85096 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/synonyms.txt to /configs/conf1/synonyms.txt
   [junit4]   2> 85100 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 85103 INFO  (zkConnectionManagerCallback-939-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 85103 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 85208 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Will use NRT replicas unless explicitly asked otherwise
   [junit4]   2> 85316 WARN  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 85316 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 85316 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ...
   [junit4]   2> 85316 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 11.0.6+10
   [junit4]   2> 85319 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 85319 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 85320 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.e.j.s.session node0 Scavenging every 600000ms
   [junit4]   2> 85331 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@3be5ea11{/,null,AVAILABLE}
   [junit4]   2> 85343 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.e.j.s.AbstractConnector Started ServerConnector@e161331{ssl, (ssl, alpn, http/1.1, h2)}{127.0.0.1:43071}
   [junit4]   2> 85343 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.e.j.s.Server Started @85374ms
   [junit4]   2> 85343 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/, solr.data.dir=hdfs://localhost.localdomain:43729/hdfs__localhost.localdomain_43729__home_jenkins_workspace_Lucene-Solr-8.x-Linux_solr_build_solr-core_test_J3_temp_solr.index.hdfs.CheckHdfsIndexTest_64EADBDE58807702-001_tempDir-002_control_data, hostPort=43071, coreRootDirectory=/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_64EADBDE58807702-001/control-001/cores, replicaType=NRT}
   [junit4]   2> 85343 ERROR (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 85343 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 85343 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 8.6.0
   [junit4]   2> 85343 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 85343 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr
   [junit4]   2> 85343 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-04-30T11:32:04.672630Z
   [junit4]   2> 85346 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 85347 INFO  (zkConnectionManagerCallback-941-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 85347 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 85448 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 85448 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.SolrXmlConfig Loading container configuration from /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_64EADBDE58807702-001/control-001/solr.xml
   [junit4]   2> 85451 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 85451 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 85452 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@2e2cd77d, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 85499 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=false]
   [junit4]   2> 85500 WARN  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@1b1abe4e[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 85500 WARN  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@1b1abe4e[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 85503 WARN  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@471fffbb[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 85503 WARN  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@471fffbb[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 85505 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:45083/solr
   [junit4]   2> 85507 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 85515 INFO  (zkConnectionManagerCallback-952-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 85515 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 85617 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [n:127.0.0.1:43071_     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 85618 INFO  (zkConnectionManagerCallback-954-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 85618 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [n:127.0.0.1:43071_     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 85675 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [n:127.0.0.1:43071_     ] o.a.s.c.OverseerElectionContext I am going to be the leader 127.0.0.1:43071_
   [junit4]   2> 85676 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [n:127.0.0.1:43071_     ] o.a.s.c.Overseer Overseer (id=72187430904528901-127.0.0.1:43071_-n_0000000000) starting
   [junit4]   2> 85679 INFO  (OverseerStateUpdate-72187430904528901-127.0.0.1:43071_-n_0000000000) [n:127.0.0.1:43071_     ] o.a.s.c.Overseer Starting to work on the main queue : 127.0.0.1:43071_
   [junit4]   2> 85684 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [n:127.0.0.1:43071_     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:43071_
   [junit4]   2> 85688 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [n:127.0.0.1:43071_     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 85690 WARN  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [n:127.0.0.1:43071_     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info.  Solr is only as secure as you make it.
   [junit4]   2> 85699 INFO  (zkCallback-953-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 85738 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [n:127.0.0.1:43071_     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 85756 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [n:127.0.0.1:43071_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@2e2cd77d
   [junit4]   2> 85763 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [n:127.0.0.1:43071_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@2e2cd77d
   [junit4]   2> 85764 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [n:127.0.0.1:43071_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@2e2cd77d
   [junit4]   2> 85765 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [n:127.0.0.1:43071_     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_64EADBDE58807702-001/control-001/cores
   [junit4]   2> 85803 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 85807 INFO  (zkConnectionManagerCallback-971-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 85807 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 85816 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 85818 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:45083/solr ready
   [junit4]   2> 85851 INFO  (qtp1578561805-1736) [n:127.0.0.1:43071_     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params collection.configName=conf1&name=control_collection&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:43071_&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 85855 INFO  (OverseerThreadFactory-961-thread-1-processing-n:127.0.0.1:43071_) [n:127.0.0.1:43071_     ] o.a.s.c.a.c.CreateCollectionCmd Create collection control_collection
   [junit4]   2> 85981 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 85997 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=4
   [junit4]   2> 86014 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_    x:control_collection_shard1_replica_n1 ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&collection=control_collection&shard=shard1&wt=javabin&version=2&replicaType=NRT
   [junit4]   2> 86015 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_    x:control_collection_shard1_replica_n1 ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 87033 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 8.6.0
   [junit4]   2> 87051 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.IndexSchema Schema name=test
   [junit4]   2> 87173 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
   [junit4]   2> 87195 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.CoreContainer Creating SolrCore 'control_collection_shard1_replica_n1' using configuration from configset conf1, trusted=true
   [junit4]   2> 87196 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.control_collection.shard1.replica_n1' (registry 'solr.core.control_collection.shard1.replica_n1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@2e2cd77d
   [junit4]   2> 87199 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory solr.hdfs.home=hdfs://localhost.localdomain:43729/solr_hdfs_home
   [junit4]   2> 87200 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Solr Kerberos Authentication disabled
   [junit4]   2> 87200 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrCore [[control_collection_shard1_replica_n1] ] Opening new SolrCore at [/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_64EADBDE58807702-001/control-001/cores/control_collection_shard1_replica_n1], dataDir=[hdfs://localhost.localdomain:43729/solr_hdfs_home/control_collection/core_node2/data/]
   [junit4]   2> 87201 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:43729/solr_hdfs_home/control_collection/core_node2/data/snapshot_metadata
   [junit4]   2> 87214 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 87214 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [4194304] will allocate [1] slabs and use ~[4194304] bytes
   [junit4]   2> 87214 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Creating new single instance HDFS BlockCache
   [junit4]   2> 87278 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
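(Aside on the HdfsDirectoryFactory block cache lines above: the "target memory usage" figure is just slab size times slab count. A minimal sketch of that arithmetic, using the values as logged; the formula is inferred from the logged wording, not taken from Solr source:)

```python
# Values taken from the log lines above.
slab_size_bytes = 4_194_304   # "slab size of [4194304]"
number_of_slabs = 1           # "will allocate [1] slabs"

# Inferred sizing rule: total cache memory ~= slab size * slab count.
total_cache_bytes = slab_size_bytes * number_of_slabs
print(total_cache_bytes)  # 4194304, matching "use ~[4194304] bytes"
```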
   [junit4]   2> 87287 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:43729/solr_hdfs_home/control_collection/core_node2/data
   [junit4]   2> 87351 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:43729/solr_hdfs_home/control_collection/core_node2/data/index
   [junit4]   2> 87360 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 87360 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [4194304] will allocate [1] slabs and use ~[4194304] bytes
   [junit4]   2> 87360 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Creating new single instance HDFS BlockCache
   [junit4]   2> 87383 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 87383 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=29, maxMergeAtOnceExplicit=26, maxMergedSegmentMB=89.9130859375, floorSegmentMB=0.998046875, forceMergeDeletesPctAllowed=8.538998266362153, segmentsPerTier=19.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.7033202577378823, deletesPctAllowed=37.63721210728515
   [junit4]   2> 87963 WARN  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 88047 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.HdfsUpdateLog
   [junit4]   2> 88047 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 88047 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.HdfsUpdateLog Initializing HdfsUpdateLog: tlogDfsReplication=2
   [junit4]   2> 88073 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 88073 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 88079 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogDocMergePolicy: [LogDocMergePolicy: minMergeSize=1000, mergeFactor=11, maxMergeSize=9223372036854775807, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=true, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.0]
   [junit4]   2> 88146 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.SolrIndexSearcher Opening [Searcher@3fb33ae3[control_collection_shard1_replica_n1] main]
   [junit4]   2> 88148 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 88149 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 88151 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms
   [junit4]   2> 88152 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1665396981084717056
   [junit4]   2> 88156 INFO  (searcherExecutor-973-thread-1-processing-n:127.0.0.1:43071_ x:control_collection_shard1_replica_n1 c:control_collection s:shard1) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrCore [control_collection_shard1_replica_n1] Registered new searcher Searcher@3fb33ae3[control_collection_shard1_replica_n1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 88164 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ZkShardTerms Successful update of terms at /collections/control_collection/terms/shard1 to Terms{values={core_node2=0}, version=0}
   [junit4]   2> 88164 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/control_collection/leaders/shard1
   [junit4]   2> 88166 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 88166 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 88166 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync replicas to https://127.0.0.1:43071/control_collection_shard1_replica_n1/
   [junit4]   2> 88168 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 88168 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy https://127.0.0.1:43071/control_collection_shard1_replica_n1/ has no replicas
   [junit4]   2> 88168 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/control_collection/leaders/shard1/leader after winning as /collections/control_collection/leader_elect/shard1/election/72187430904528901-core_node2-n_0000000000
   [junit4]   2> 88170 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I am the new leader: https://127.0.0.1:43071/control_collection_shard1_replica_n1/ shard1
   [junit4]   2> 88172 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 88175 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf1&newCollection=true&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&collection=control_collection&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=2161
   [junit4]   2> 88178 INFO  (qtp1578561805-1736) [n:127.0.0.1:43071_     ] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas
   [junit4]   2> 88274 INFO  (zkCallback-953-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 88274 INFO  (zkCallback-953-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 88274 INFO  (qtp1578561805-1736) [n:127.0.0.1:43071_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=control_collection&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:43071_&wt=javabin&version=2} status=0 QTime=2423
   [junit4]   2> 88275 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Waiting to see 1 active replicas in collection: control_collection
   [junit4]   2> 88384 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 88387 INFO  (zkConnectionManagerCallback-982-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 88387 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 88388 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 88389 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:45083/solr ready
   [junit4]   2> 88391 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.ChaosMonkey monkey: init - expire sessions:false cause connection loss:false
   [junit4]   2> 88407 INFO  (qtp1578561805-1736) [n:127.0.0.1:43071_     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params collection.configName=conf1&name=collection1&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=&stateFormat=1&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 88415 INFO  (OverseerCollectionConfigSetProcessor-72187430904528901-127.0.0.1:43071_-n_0000000000) [n:127.0.0.1:43071_     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000000 doesn't exist. Requestor may have disconnected from ZooKeeper
   [junit4]   2> 88422 INFO  (OverseerThreadFactory-961-thread-2-processing-n:127.0.0.1:43071_) [n:127.0.0.1:43071_     ] o.a.s.c.a.c.CreateCollectionCmd Create collection collection1
   [junit4]   2> 88624 WARN  (OverseerThreadFactory-961-thread-2-processing-n:127.0.0.1:43071_) [n:127.0.0.1:43071_     ] o.a.s.c.a.c.CreateCollectionCmd It is unusual to create a collection (collection1) without cores.
   [junit4]   2> 88625 INFO  (qtp1578561805-1736) [n:127.0.0.1:43071_     ] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas
   [junit4]   2> 88626 INFO  (qtp1578561805-1736) [n:127.0.0.1:43071_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=collection1&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=&stateFormat=1&wt=javabin&version=2} status=0 QTime=218
   [junit4]   2> 88632 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 88632 INFO  (watches-979-thread-1) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 88632 INFO  (watches-979-thread-1) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 88632 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 88632 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 88632 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 88632 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 88632 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 88632 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Creating jetty instances pullReplicaCount=0 numOtherReplicas=1
   [junit4]   2> 88736 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 1 in directory /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_64EADBDE58807702-001/shard-1-001 of type NRT for shard1
   [junit4]   2> 88740 WARN  (closeThreadPool-983-thread-1) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 88740 INFO  (closeThreadPool-983-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 88740 INFO  (closeThreadPool-983-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ...
   [junit4]   2> 88740 INFO  (closeThreadPool-983-thread-1) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 11.0.6+10
   [junit4]   2> 88743 INFO  (closeThreadPool-983-thread-1) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 88743 INFO  (closeThreadPool-983-thread-1) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 88743 INFO  (closeThreadPool-983-thread-1) [     ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 88744 INFO  (closeThreadPool-983-thread-1) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@276d101f{/,null,AVAILABLE}
   [junit4]   2> 88807 INFO  (closeThreadPool-983-thread-1) [     ] o.e.j.s.AbstractConnector Started ServerConnector@49b0dc05{ssl, (ssl, alpn, http/1.1, h2)}{127.0.0.1:45589}
   [junit4]   2> 88807 INFO  (closeThreadPool-983-thread-1) [     ] o.e.j.s.Server Started @88838ms
   [junit4]   2> 88807 INFO  (closeThreadPool-983-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/, solrconfig=solrconfig.xml, solr.data.dir=hdfs://localhost.localdomain:43729/hdfs__localhost.localdomain_43729__home_jenkins_workspace_Lucene-Solr-8.x-Linux_solr_build_solr-core_test_J3_temp_solr.index.hdfs.CheckHdfsIndexTest_64EADBDE58807702-001_tempDir-002_jetty1, hostPort=45589, coreRootDirectory=/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_64EADBDE58807702-001/shard-1-001/cores, replicaType=NRT}
   [junit4]   2> 88807 ERROR (closeThreadPool-983-thread-1) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 88807 INFO  (closeThreadPool-983-thread-1) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 88807 INFO  (closeThreadPool-983-thread-1) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 8.6.0
   [junit4]   2> 88807 INFO  (closeThreadPool-983-thread-1) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 88807 INFO  (closeThreadPool-983-thread-1) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr
   [junit4]   2> 88807 INFO  (closeThreadPool-983-thread-1) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-04-30T11:32:08.136798Z
   [junit4]   2> 88822 INFO  (closeThreadPool-983-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 88826 INFO  (zkConnectionManagerCallback-985-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 88826 INFO  (closeThreadPool-983-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 88928 INFO  (closeThreadPool-983-thread-1) [     ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 88928 INFO  (closeThreadPool-983-thread-1) [     ] o.a.s.c.SolrXmlConfig Loading container configuration from /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_64EADBDE58807702-001/shard-1-001/solr.xml
   [junit4]   2> 88931 INFO  (closeThreadPool-983-thread-1) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 88931 INFO  (closeThreadPool-983-thread-1) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 88932 INFO  (closeThreadPool-983-thread-1) [     ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@2e2cd77d, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 89230 INFO  (closeThreadPool-983-thread-1) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=false]
   [junit4]   2> 89232 WARN  (closeThreadPool-983-thread-1) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@c2262d8[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 89232 WARN  (closeThreadPool-983-thread-1) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@c2262d8[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 89236 WARN  (closeThreadPool-983-thread-1) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@74d4f8c9[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 89236 WARN  (closeThreadPool-983-thread-1) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@74d4f8c9[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 89237 INFO  (closeThreadPool-983-thread-1) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:45083/solr
   [junit4]   2> 89240 INFO  (closeThreadPool-983-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 89242 INFO  (zkConnectionManagerCallback-996-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 89243 INFO  (closeThreadPool-983-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 89351 INFO  (closeThreadPool-983-thread-1) [n:127.0.0.1:45589_     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 89352 INFO  (zkConnectionManagerCallback-998-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 89352 INFO  (closeThreadPool-983-thread-1) [n:127.0.0.1:45589_     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 89362 INFO  (closeThreadPool-983-thread-1) [n:127.0.0.1:45589_     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 89366 INFO  (closeThreadPool-983-thread-1) [n:127.0.0.1:45589_     ] o.a.s.c.ZkController Publish node=127.0.0.1:45589_ as DOWN
   [junit4]   2> 89367 INFO  (closeThreadPool-983-thread-1) [n:127.0.0.1:45589_     ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 89367 INFO  (closeThreadPool-983-thread-1) [n:127.0.0.1:45589_     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:45589_
   [junit4]   2> 89375 INFO  (zkCallback-953-thread-2) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 89375 INFO  (zkCallback-981-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 89375 INFO  (zkCallback-997-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 89379 INFO  (closeThreadPool-983-thread-1) [n:127.0.0.1:45589_     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 89380 WARN  (closeThreadPool-983-thread-1) [n:127.0.0.1:45589_     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info.  Solr is only as secure as you make it.
   [junit4]   2> 89392 INFO  (closeThreadPool-983-thread-1) [n:127.0.0.1:45589_     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 89429 INFO  (closeThreadPool-983-thread-1) [n:127.0.0.1:45589_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@2e2cd77d
   [junit4]   2> 89436 INFO  (closeThreadPool-983-thread-1) [n:127.0.0.1:45589_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@2e2cd77d
   [junit4]   2> 89436 INFO  (closeThreadPool-983-thread-1) [n:127.0.0.1:45589_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@2e2cd77d
   [junit4]   2> 89437 INFO  (closeThreadPool-983-thread-1) [n:127.0.0.1:45589_     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_64EADBDE58807702-001/shard-1-001/cores
   [junit4]   2> 89447 INFO  (closeThreadPool-983-thread-1) [     ] o.a.s.c.AbstractFullDistribZkTestBase waitForLiveNode: 127.0.0.1:45589_
   [junit4]   2> 89472 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :addreplica with params node=127.0.0.1:45589_&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 89474 INFO  (OverseerCollectionConfigSetProcessor-72187430904528901-127.0.0.1:43071_-n_0000000000) [n:127.0.0.1:43071_     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000002 doesn't exist. Requestor may have disconnected from ZooKeeper
   [junit4]   2> 89494 INFO  (qtp1578561805-1736) [n:127.0.0.1:43071_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={wt=javabin&version=2&key=solr.core.control_collection.shard1.replica_n1:INDEX.sizeInBytes} status=0 QTime=8
   [junit4]   2> 89500 INFO  (qtp1578561805-1736) [n:127.0.0.1:43071_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=1
   [junit4]   2> 89522 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 89528 INFO  (qtp1578561805-1736) [n:127.0.0.1:43071_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={wt=javabin&version=2&key=solr.core.control_collection.shard1.replica_n1:INDEX.sizeInBytes} status=0 QTime=3
   [junit4]   2> 89531 INFO  (qtp1578561805-1736) [n:127.0.0.1:43071_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=1
   [junit4]   2> 89534 INFO  (qtp143513150-1798) [n:127.0.0.1:45589_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 89536 INFO  (OverseerThreadFactory-961-thread-3-processing-n:127.0.0.1:43071_) [n:127.0.0.1:43071_ c:collection1 s:shard1   ] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:45589_ for creating new replica of shard shard1 for collection collection1
   [junit4]   2> 89537 INFO  (OverseerThreadFactory-961-thread-3-processing-n:127.0.0.1:43071_) [n:127.0.0.1:43071_ c:collection1 s:shard1   ] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command.
   [junit4]   2> 89581 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_    x:collection1_shard1_replica_n1 ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&name=collection1_shard1_replica_n1&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT
   [junit4]   2> 90591 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 8.6.0
   [junit4]   2> 90605 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.IndexSchema Schema name=test
   [junit4]   2> 90682 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
   [junit4]   2> 90700 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.CoreContainer Creating SolrCore 'collection1_shard1_replica_n1' using configuration from configset conf1, trusted=true
   [junit4]   2> 90700 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.collection1.shard1.replica_n1' (registry 'solr.core.collection1.shard1.replica_n1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@2e2cd77d
   [junit4]   2> 90700 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory solr.hdfs.home=hdfs://localhost.localdomain:43729/solr_hdfs_home
   [junit4]   2> 90701 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Solr Kerberos Authentication disabled
   [junit4]   2> 90701 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrCore [[collection1_shard1_replica_n1] ] Opening new SolrCore at [/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_64EADBDE58807702-001/shard-1-001/cores/collection1_shard1_replica_n1], dataDir=[hdfs://localhost.localdomain:43729/solr_hdfs_home/collection1/core_node2/data/]
   [junit4]   2> 90702 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:43729/solr_hdfs_home/collection1/core_node2/data/snapshot_metadata
   [junit4]   2> 90709 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 90709 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [4194304] will allocate [1] slabs and use ~[4194304] bytes
   [junit4]   2> 90709 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Creating new single instance HDFS BlockCache
   [junit4]   2> 90720 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 90723 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:43729/solr_hdfs_home/collection1/core_node2/data
   [junit4]   2> 90742 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:43729/solr_hdfs_home/collection1/core_node2/data/index
   [junit4]   2> 90749 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 90749 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [4194304] will allocate [1] slabs and use ~[4194304] bytes
   [junit4]   2> 90749 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Creating new single instance HDFS BlockCache
   [junit4]   2> 90754 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 90755 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=29, maxMergeAtOnceExplicit=26, maxMergedSegmentMB=89.9130859375, floorSegmentMB=0.998046875, forceMergeDeletesPctAllowed=8.538998266362153, segmentsPerTier=19.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.7033202577378823, deletesPctAllowed=37.63721210728515]
   [junit4]   2> 91192 WARN  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 91257 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.HdfsUpdateLog
   [junit4]   2> 91257 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 91257 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.HdfsUpdateLog Initializing HdfsUpdateLog: tlogDfsReplication=2
   [junit4]   2> 91280 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 91280 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 91283 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogDocMergePolicy: [LogDocMergePolicy: minMergeSize=1000, mergeFactor=11, maxMergeSize=9223372036854775807, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=true, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.0]
   [junit4]   2> 91313 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.SolrIndexSearcher Opening [Searcher@107f69da[collection1_shard1_replica_n1] main]
   [junit4]   2> 91315 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 91315 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 91316 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms
   [junit4]   2> 91317 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1665396984403460096
   [junit4]   2> 91319 INFO  (searcherExecutor-1009-thread-1-processing-n:127.0.0.1:45589_ x:collection1_shard1_replica_n1 c:collection1 s:shard1) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrCore [collection1_shard1_replica_n1] Registered new searcher Searcher@107f69da[collection1_shard1_replica_n1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 91335 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection1/terms/shard1 to Terms{values={core_node2=0}, version=0}
   [junit4]   2> 91335 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/collection1/leaders/shard1
   [junit4]   2> 91337 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 91337 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 91337 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync replicas to https://127.0.0.1:45589/collection1_shard1_replica_n1/
   [junit4]   2> 91337 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 91337 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SyncStrategy https://127.0.0.1:45589/collection1_shard1_replica_n1/ has no replicas
   [junit4]   2> 91337 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/collection1/leaders/shard1/leader after winning as /collections/collection1/leader_elect/shard1/election/72187430904528906-core_node2-n_0000000000
   [junit4]   2> 91339 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I am the new leader: https://127.0.0.1:45589/collection1_shard1_replica_n1/ shard1
   [junit4]   2> 91441 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 91446 INFO  (qtp143513150-1796) [n:127.0.0.1:45589_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf1&name=collection1_shard1_replica_n1&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1865
   [junit4]   2> 91451 INFO  (qtp1578561805-1738) [n:127.0.0.1:43071_ c:collection1    ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:45589_&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2} status=0 QTime=1979
   [junit4]   2> 91455 INFO  (OverseerCollectionConfigSetProcessor-72187430904528901-127.0.0.1:43071_-n_0000000000) [n:127.0.0.1:43071_     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000004 doesn't exist. Requestor may have disconnected from ZooKeeper
   [junit4]   2> 91455 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Waiting to see 1 active replicas in collection: collection1
   [junit4]   2> 91547 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.SolrTestCaseJ4 ###Starting testChecksumsOnlyVerbose
   [junit4]   2> 104797 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.s.h.HdfsDirectory Closing hdfs directory hdfs://localhost.localdomain:43729/solr
   [junit4]   2> 104799 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[64EADBDE58807702]) [     ] o.a.s.SolrTestCaseJ4 ###Ending testChecksumsOnlyVerbose
   [junit4]   2> 104931 INFO  (closeThreadPool-1016-thread-2) [     ] o.a.s.c.CoreContainer Shutting down CoreContainer instance=1779402994
   [junit4]   2> 104931 INFO  (closeThreadPool-1016-thread-2) [     ] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:43071_
   [junit4]   2> 104931 INFO  (closeThreadPool-1016-thread-2) [     ] o.a.s.c.ZkController Publish this node as DOWN...
   [junit4]   2> 104931 INFO  (closeThreadPool-1016-thread-2) [     ] o.a.s.c.ZkController Publish node=127.0.0.1:43071_ as DOWN
   [junit4]   2> 104931 INFO  (closeThreadPool-1016-thread-1) [     ] o.a.s.c.CoreContainer Shutting down CoreContainer instance=1679806780
   [junit4]   2> 104931 INFO  (closeThreadPool-1016-thread-1) [     ] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:45589_
   [junit4]   2> 104932 INFO  (closeThreadPool-1016-thread-1) [     ] o.a.s.c.ZkController Publish this node as DOWN...
   [junit4]   2> 104932 INFO  (closeThreadPool-1016-thread-1) [     ] o.a.s.c.ZkController Publish node=127.0.0.1:45589_ as DOWN
   [junit4]   2> 104933 INFO  (coreCloseExecutor-1023-thread-1) [n:127.0.0.1:43071_     ] o.a.s.c.SolrCore [control_collection_shard1_replica_n1]  CLOSING SolrCore org.apache.solr.core.SolrCore@6b79dc38
   [junit4]   2> 104933 INFO  (coreCloseExecutor-1023-thread-1) [n:127.0.0.1:43071_     ] o.a.s.m.SolrMet

[...truncated too long message...]

dowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) ~[java/:?]
   [junit4]   2> 	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) ~[java/:?]
   [junit4]   2> 	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) ~[java/:?]
   [junit4]   2> 	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54) ~[java/:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at java.lang.Thread.run(Thread.java:834) [?:?]
   [junit4]   2> 214647 WARN  (SUITE-CheckHdfsIndexTest-seed#[64EADBDE58807702]-worker) [     ] o.a.h.h.s.d.DirectoryScanner DirectoryScanner: shutdown has been called
   [junit4]   2> 214723 INFO  (SUITE-CheckHdfsIndexTest-seed#[64EADBDE58807702]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.w.WebAppContext@188c3d3b{datanode,/,null,UNAVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/datanode}
   [junit4]   2> 214723 INFO  (SUITE-CheckHdfsIndexTest-seed#[64EADBDE58807702]-worker) [     ] o.e.j.s.AbstractConnector Stopped ServerConnector@66e6e8f0{HTTP/1.1, (http/1.1)}{localhost:0}
   [junit4]   2> 214723 INFO  (SUITE-CheckHdfsIndexTest-seed#[64EADBDE58807702]-worker) [     ] o.e.j.s.session node0 Stopped scavenging
   [junit4]   2> 214723 INFO  (SUITE-CheckHdfsIndexTest-seed#[64EADBDE58807702]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@724f0ba1{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,UNAVAILABLE}
   [junit4]   2> 214724 WARN  (BP-1219874417-127.0.0.1-1588246322449 heartbeating to localhost.localdomain/127.0.0.1:43729) [     ] o.a.h.h.s.d.IncrementalBlockReportManager IncrementalBlockReportManager interrupted
   [junit4]   2> 214725 WARN  (BP-1219874417-127.0.0.1-1588246322449 heartbeating to localhost.localdomain/127.0.0.1:43729) [     ] o.a.h.h.s.d.DataNode Ending block pool service for: Block pool BP-1219874417-127.0.0.1-1588246322449 (Datanode Uuid d0edb8f9-d6da-49bc-a30e-273f547c28f0) service to localhost.localdomain/127.0.0.1:43729
   [junit4]   2> 214758 INFO  (SUITE-CheckHdfsIndexTest-seed#[64EADBDE58807702]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.w.WebAppContext@31ff1b72{hdfs,/,null,UNAVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/hdfs}
   [junit4]   2> 214759 INFO  (SUITE-CheckHdfsIndexTest-seed#[64EADBDE58807702]-worker) [     ] o.e.j.s.AbstractConnector Stopped ServerConnector@1c1fc6a3{HTTP/1.1, (http/1.1)}{localhost.localdomain:0}
   [junit4]   2> 214759 INFO  (SUITE-CheckHdfsIndexTest-seed#[64EADBDE58807702]-worker) [     ] o.e.j.s.session node0 Stopped scavenging
   [junit4]   2> 214759 INFO  (SUITE-CheckHdfsIndexTest-seed#[64EADBDE58807702]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@56a1c6f6{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,UNAVAILABLE}
   [junit4]   2> NOTE: leaving temporary files on disk at: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_64EADBDE58807702-001
   [junit4]   2> Apr 30, 2020 11:34:14 AM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks
   [junit4]   2> WARNING: Will linger awaiting termination of 65 leaked thread(s).
   [junit4]   2> NOTE: test params are: codec=Asserting(Lucene84): {date=PostingsFormat(name=LuceneVarGapDocFreqInterval), rnd_b=PostingsFormat(name=Asserting), field=PostingsFormat(name=LuceneVarGapDocFreqInterval), docid=PostingsFormat(name=Asserting), multiDefault=PostingsFormat(name=LuceneVarGapDocFreqInterval), _root_=BlockTreeOrds(blocksize=128), titleTokenized=Lucene84, id=PostingsFormat(name=Asserting), body=PostingsFormat(name=LuceneVarGapDocFreqInterval), title=BlockTreeOrds(blocksize=128)}, docValues:{docid_intDV=DocValuesFormat(name=Direct), range_facet_l_dv=DocValuesFormat(name=Lucene80), _version_=DocValuesFormat(name=Lucene80), range_facet_i_dv=DocValuesFormat(name=Asserting), intDvoDefault=DocValuesFormat(name=Lucene80), titleDV=DocValuesFormat(name=Lucene80), timestamp=DocValuesFormat(name=Asserting)}, maxPointsInLeafNode=1897, maxMBSortInHeap=7.262450966517059, sim=Asserting(org.apache.lucene.search.similarities.AssertingSimilarity@41c270eb), locale=es-PE, timezone=Asia/Tehran
   [junit4]   2> NOTE: Linux 5.3.0-46-generic amd64/AdoptOpenJDK 11.0.6 (64-bit)/cpus=16,threads=10,free=356638672,total=536870912
   [junit4]   2> NOTE: All tests run in this JVM: [TestInitQParser, DeleteNodeTest, TestDistributedStatsComponentCardinality, SolrCoreMetricManagerTest, TestImpersonationWithHadoopAuth, TestSkipOverseerOperations, CheckHdfsIndexTest]
   [junit4] Completed [172/907 (1!)] on J3 in 135.00s, 5 tests, 1 error, 1 skipped <<< FAILURES!

[...truncated 40528 lines...]
-ecj-javadoc-lint-src:
    [mkdir] Created dir: /tmp/ecj1215559777
 [ecj-lint] Compiling 931 source files to /tmp/ecj1215559777
 [ecj-lint] ----------
 [ecj-lint] 1. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/codecs/CodecUtil.java (at line 523)
 [ecj-lint] 	throw new CorruptIndexException("misplaced codec footer (file truncated?): length=" + in.length() + " but footerLength==" + footerLength(), input);
 [ecj-lint] 	^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 [ecj-lint] Resource leak: 'in' is not closed at this location
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 2. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingStoredFieldsReader.java (at line 166)
 [ecj-lint] 	FieldsIndexReader fieldsIndexReader = new FieldsIndexReader(d, si.name, segmentSuffix, INDEX_EXTENSION_PREFIX, INDEX_CODEC_NAME, si.getId());
 [ecj-lint] 	                  ^^^^^^^^^^^^^^^^^
 [ecj-lint] Resource leak: 'fieldsIndexReader' is never closed
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 3. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingTermVectorsReader.java (at line 148)
 [ecj-lint] 	FieldsIndexReader fieldsIndexReader = new FieldsIndexReader(d, si.name, segmentSuffix, VECTORS_INDEX_EXTENSION_PREFIX, VECTORS_INDEX_CODEC_NAME, si.getId());
 [ecj-lint] 	                  ^^^^^^^^^^^^^^^^^
 [ecj-lint] Resource leak: 'fieldsIndexReader' is never closed
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 4. ERROR in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/search/IndexSearcher.java (at line 50)
 [ecj-lint] 	import org.apache.lucene.util.automaton.ByteRunAutomaton;
 [ecj-lint] 	       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 [ecj-lint] The import org.apache.lucene.util.automaton.ByteRunAutomaton is never used
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 5. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/util/automaton/Operations.java (at line 742)
 [ecj-lint] 	Integer q = newstate.get(statesSet);
 [ecj-lint] 	                         ^^^^^^^^^
 [ecj-lint] Unlikely argument type SortedIntSet for get(Object) on a Map<SortedIntSet.FrozenIntSet,Integer>
 [ecj-lint] ----------
 [ecj-lint] 5 problems (1 error, 4 warnings)
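The three resource-leak warnings above all flag the same shape: a Closeable (an IndexInput or FieldsIndexReader) is opened, and an exception between open and close would leak it. A minimal, hypothetical sketch of the pattern ecj-lint is asking for — illustrative names only, not Lucene's actual code — closes the resource on the exception path before rethrowing:

```java
import java.io.Closeable;

public class LeakSketch {
    // Hypothetical stand-in for the Closeable the linter flags
    // (e.g. FieldsIndexReader, IndexInput); names are illustrative.
    static class Resource implements Closeable {
        boolean closed = false;
        @Override public void close() { closed = true; }
    }

    // The shape ecj-lint accepts: if initialization after open()
    // fails, close the resource before rethrowing, so it cannot
    // leak on any exit path. On success the caller takes ownership.
    static Resource openSafely(boolean failDuringInit) {
        Resource r = new Resource();
        try {
            if (failDuringInit) {
                throw new RuntimeException("init failed");
            }
            return r; // caller now owns r and must close it
        } catch (RuntimeException e) {
            r.close(); // release on the exception path
            throw e;
        }
    }
}
```

In the real code, plain try-with-resources is not always applicable because the successfully-opened resource is returned to the caller; the catch-close-rethrow form above (Lucene typically routes this through a utility such as IOUtils) is the usual workaround.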

BUILD FAILED
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/build.xml:634: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/build.xml:101: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build.xml:201: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/common-build.xml:2127: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/common-build.xml:2166: Compile failed; see the compiler error output for details.

Total time: 39 minutes 50 seconds
Build step 'Invoke Ant' marked build as failure
Archiving artifacts
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
[WARNINGS] Skipping publisher since build result is FAILURE
Recording test results
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2

[JENKINS] Lucene-Solr-8.x-Linux (64bit/jdk-14) - Build # 2919 - Still unstable!

Posted by Policeman Jenkins Server <je...@thetaphi.de>.
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-8.x-Linux/2919/
Java: 64bit/jdk-14 -XX:-UseCompressedOops -XX:+UseSerialGC

2 tests failed.
FAILED:  org.apache.solr.index.hdfs.CheckHdfsIndexTest.doTest

Error Message:
Error from server at https://127.0.0.1:33569/collection1: java.lang.NullPointerException  at org.apache.solr.handler.admin.SystemInfoHandler.getSecurityInfo(SystemInfoHandler.java:326)  at org.apache.solr.handler.admin.SystemInfoHandler.handleRequestBody(SystemInfoHandler.java:146)  at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:211)  at org.apache.solr.core.SolrCore.execute(SolrCore.java:2600)  at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:803)  at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:582)  at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:432)  at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:362)  at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)  at org.apache.solr.client.solrj.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:166)  at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)  at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:545)  at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)  at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1610)  at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)  at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1300)  at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)  at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485)  at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1580)  at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)  at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1215)  at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)  at 
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)  at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:322)  at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:767)  at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)  at org.eclipse.jetty.server.Server.handle(Server.java:500)  at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383)  at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:547)  at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375)  at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:273)  at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)  at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)  at org.eclipse.jetty.io.ssl.SslConnection$1.run(SslConnection.java:146)  at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806)  at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938)  at java.base/java.lang.Thread.run(Thread.java:832) 

Stack Trace:
org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at https://127.0.0.1:33569/collection1: java.lang.NullPointerException
	at org.apache.solr.handler.admin.SystemInfoHandler.getSecurityInfo(SystemInfoHandler.java:326)
	at org.apache.solr.handler.admin.SystemInfoHandler.handleRequestBody(SystemInfoHandler.java:146)
	at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:211)
	at org.apache.solr.core.SolrCore.execute(SolrCore.java:2600)
	at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:803)
	at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:582)
	at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:432)
	at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:362)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)
	at org.apache.solr.client.solrj.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:166)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:545)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1610)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1300)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1580)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1215)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:322)
	at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:767)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:500)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:547)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:273)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
	at org.eclipse.jetty.io.ssl.SslConnection$1.run(SslConnection.java:146)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938)
	at java.base/java.lang.Thread.run(Thread.java:832)

	at __randomizedtesting.SeedInfo.seed([4F3637E4BDEF750D:E8728F40D05466B4]:0)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:665)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:265)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:248)
	at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:211)
	at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:1003)
	at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:1018)
	at org.apache.solr.index.hdfs.CheckHdfsIndexTest.doTest(CheckHdfsIndexTest.java:120)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:564)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988)
	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1081)
	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:1053)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
	at java.base/java.lang.Thread.run(Thread.java:832)
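Both doTest failures in this build report the same root cause: a NullPointerException at SystemInfoHandler.getSecurityInfo(SystemInfoHandler.java:326), i.e. a dereference of a value that can legitimately be null when no security context is configured. As a generic, hypothetical sketch of the guard such a spot needs — not Solr's actual code, names invented for illustration:

```java
public class SecurityInfoSketch {
    // Hypothetical stand-in: the real handler reads authentication
    // state from the request, which is null when security is not
    // configured — the situation these test failures hit.
    static String describe(Object authPrincipal) {
        // Guard the nullable input instead of dereferencing it
        // directly (the shape of the NPE at getSecurityInfo:326).
        if (authPrincipal == null) {
            return "unauthenticated";
        }
        return authPrincipal.toString();
    }
}
```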


FAILED:  org.apache.solr.index.hdfs.CheckHdfsIndexTest.doTest

Error Message:
Error from server at https://127.0.0.1:38417/collection1: java.lang.NullPointerException  at org.apache.solr.handler.admin.SystemInfoHandler.getSecurityInfo(SystemInfoHandler.java:326)  at org.apache.solr.handler.admin.SystemInfoHandler.handleRequestBody(SystemInfoHandler.java:146)  at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:211)  at org.apache.solr.core.SolrCore.execute(SolrCore.java:2600)  at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:803)  at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:582)  at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:432)  at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:362)  at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)  at org.apache.solr.client.solrj.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:166)  at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)  at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:545)  at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)  at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1610)  at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)  at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1300)  at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)  at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485)  at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1580)  at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)  at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1215)  at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)  at 
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)  at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:322)  at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:767)  at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)  at org.eclipse.jetty.server.Server.handle(Server.java:500)  at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383)  at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:547)  at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375)  at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:273)  at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)  at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)  at org.eclipse.jetty.io.ssl.SslConnection$1.run(SslConnection.java:146)  at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806)  at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938)  at java.base/java.lang.Thread.run(Thread.java:832) 

Stack Trace:
org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at https://127.0.0.1:38417/collection1: java.lang.NullPointerException
	at org.apache.solr.handler.admin.SystemInfoHandler.getSecurityInfo(SystemInfoHandler.java:326)
	at org.apache.solr.handler.admin.SystemInfoHandler.handleRequestBody(SystemInfoHandler.java:146)
	at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:211)
	at org.apache.solr.core.SolrCore.execute(SolrCore.java:2600)
	at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:803)
	at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:582)
	at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:432)
	at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:362)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)
	at org.apache.solr.client.solrj.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:166)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:545)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1610)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1300)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1580)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1215)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:322)
	at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:767)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:500)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:547)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:273)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
	at org.eclipse.jetty.io.ssl.SslConnection$1.run(SslConnection.java:146)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938)
	at java.base/java.lang.Thread.run(Thread.java:832)

	at __randomizedtesting.SeedInfo.seed([4F3637E4BDEF750D:E8728F40D05466B4]:0)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:665)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:265)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:248)
	at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:211)
	at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:1003)
	at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:1018)
	at org.apache.solr.index.hdfs.CheckHdfsIndexTest.doTest(CheckHdfsIndexTest.java:120)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:564)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988)
	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1081)
	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:1053)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
	at java.base/java.lang.Thread.run(Thread.java:832)

Build Log:
[...truncated 16511 lines...]
   [junit4] Suite: org.apache.solr.index.hdfs.CheckHdfsIndexTest
   [junit4]   2> 1097948 INFO  (SUITE-CheckHdfsIndexTest-seed#[4F3637E4BDEF750D]-worker) [     ] o.a.s.SolrTestCase Setting 'solr.default.confdir' system property to test-framework derived value of '/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/server/solr/configsets/_default/conf'
   [junit4]   2> 1097948 INFO  (SUITE-CheckHdfsIndexTest-seed#[4F3637E4BDEF750D]-worker) [     ] o.a.s.SolrTestCaseJ4 SecureRandom sanity checks: test.solr.allowed.securerandom=null & java.security.egd=file:/dev/./urandom
   [junit4]   2> 1097948 INFO  (SUITE-CheckHdfsIndexTest-seed#[4F3637E4BDEF750D]-worker) [     ] o.a.s.SolrTestCaseJ4 Created dataDir: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J0/temp/solr.index.hdfs.CheckHdfsIndexTest_4F3637E4BDEF750D-001/data-dir-124-001
   [junit4]   2> 1097948 WARN  (SUITE-CheckHdfsIndexTest-seed#[4F3637E4BDEF750D]-worker) [     ] o.a.s.SolrTestCaseJ4 startTrackingSearchers: numOpens=38 numCloses=38
   [junit4]   2> 1097948 INFO  (SUITE-CheckHdfsIndexTest-seed#[4F3637E4BDEF750D]-worker) [     ] o.a.s.SolrTestCaseJ4 Using PointFields (NUMERIC_POINTS_SYSPROP=true) w/NUMERIC_DOCVALUES_SYSPROP=true
   [junit4]   2> 1097949 INFO  (SUITE-CheckHdfsIndexTest-seed#[4F3637E4BDEF750D]-worker) [     ] o.a.s.SolrTestCaseJ4 Randomized ssl (true) and clientAuth (true) via: @org.apache.solr.util.RandomizeSSL(reason="", ssl=0.0/0.0, value=0.0/0.0, clientAuth=0.0/0.0)
   [junit4]   2> 1097949 INFO  (SUITE-CheckHdfsIndexTest-seed#[4F3637E4BDEF750D]-worker) [     ] o.a.s.BaseDistributedSearchTestCase Setting hostContext system property: /
   [junit4]   1> Formatting using clusterid: testClusterID
   [junit4]   2> 1097976 WARN  (SUITE-CheckHdfsIndexTest-seed#[4F3637E4BDEF750D]-worker) [     ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
   [junit4]   2> 1097977 INFO  (SUITE-CheckHdfsIndexTest-seed#[4F3637E4BDEF750D]-worker) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 14+36
   [junit4]   2> 1097980 INFO  (SUITE-CheckHdfsIndexTest-seed#[4F3637E4BDEF750D]-worker) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 1097980 INFO  (SUITE-CheckHdfsIndexTest-seed#[4F3637E4BDEF750D]-worker) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 1097980 INFO  (SUITE-CheckHdfsIndexTest-seed#[4F3637E4BDEF750D]-worker) [     ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 1097980 INFO  (SUITE-CheckHdfsIndexTest-seed#[4F3637E4BDEF750D]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@39cba97b{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,AVAILABLE}
   [junit4]   2> 1098077 INFO  (SUITE-CheckHdfsIndexTest-seed#[4F3637E4BDEF750D]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.w.WebAppContext@5267de1e{hdfs,/,file:///home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J0/temp/jetty-localhost_localdomain-33255-hadoop-hdfs-3_2_0-tests_jar-_-any-16032205707297707498.dir/webapp/,AVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/hdfs}
   [junit4]   2> 1098078 INFO  (SUITE-CheckHdfsIndexTest-seed#[4F3637E4BDEF750D]-worker) [     ] o.e.j.s.AbstractConnector Started ServerConnector@1ea547f4{HTTP/1.1, (http/1.1)}{localhost.localdomain:33255}
   [junit4]   2> 1098078 INFO  (SUITE-CheckHdfsIndexTest-seed#[4F3637E4BDEF750D]-worker) [     ] o.e.j.s.Server Started @1098104ms
   [junit4]   2> 1098114 WARN  (SUITE-CheckHdfsIndexTest-seed#[4F3637E4BDEF750D]-worker) [     ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
   [junit4]   2> 1098116 INFO  (SUITE-CheckHdfsIndexTest-seed#[4F3637E4BDEF750D]-worker) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 14+36
   [junit4]   2> 1098119 INFO  (SUITE-CheckHdfsIndexTest-seed#[4F3637E4BDEF750D]-worker) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 1098119 INFO  (SUITE-CheckHdfsIndexTest-seed#[4F3637E4BDEF750D]-worker) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 1098119 INFO  (SUITE-CheckHdfsIndexTest-seed#[4F3637E4BDEF750D]-worker) [     ] o.e.j.s.session node0 Scavenging every 600000ms
   [junit4]   2> 1098119 INFO  (SUITE-CheckHdfsIndexTest-seed#[4F3637E4BDEF750D]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@5455e157{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,AVAILABLE}
   [junit4]   2> 1098226 INFO  (SUITE-CheckHdfsIndexTest-seed#[4F3637E4BDEF750D]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.w.WebAppContext@5d2cd11f{datanode,/,file:///home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J0/temp/jetty-localhost-43665-hadoop-hdfs-3_2_0-tests_jar-_-any-9079678006642981233.dir/webapp/,AVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/datanode}
   [junit4]   2> 1098226 INFO  (SUITE-CheckHdfsIndexTest-seed#[4F3637E4BDEF750D]-worker) [     ] o.e.j.s.AbstractConnector Started ServerConnector@165f6{HTTP/1.1, (http/1.1)}{localhost:43665}
   [junit4]   2> 1098226 INFO  (SUITE-CheckHdfsIndexTest-seed#[4F3637E4BDEF750D]-worker) [     ] o.e.j.s.Server Started @1098252ms
   [junit4]   2> 1098285 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0x89b09fda0c8e3ca: Processing first storage report for DS-c5715ccd-ceef-4acc-ab04-f5c272614b6d from datanode 73ee8c01-863d-47ce-8e6d-8b7702521383
   [junit4]   2> 1098285 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0x89b09fda0c8e3ca: from storage DS-c5715ccd-ceef-4acc-ab04-f5c272614b6d node DatanodeRegistration(127.0.0.1:45649, datanodeUuid=73ee8c01-863d-47ce-8e6d-8b7702521383, infoPort=45519, infoSecurePort=0, ipcPort=44111, storageInfo=lv=-57;cid=testClusterID;nsid=1638332147;c=1588239780838), blocks: 0, hasStaleStorage: true, processing time: 0 msecs, invalidatedBlocks: 0
   [junit4]   2> 1098285 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0x89b09fda0c8e3ca: Processing first storage report for DS-1594f0af-19d3-4c06-9e62-0b934a01491f from datanode 73ee8c01-863d-47ce-8e6d-8b7702521383
   [junit4]   2> 1098285 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0x89b09fda0c8e3ca: from storage DS-1594f0af-19d3-4c06-9e62-0b934a01491f node DatanodeRegistration(127.0.0.1:45649, datanodeUuid=73ee8c01-863d-47ce-8e6d-8b7702521383, infoPort=45519, infoSecurePort=0, ipcPort=44111, storageInfo=lv=-57;cid=testClusterID;nsid=1638332147;c=1588239780838), blocks: 0, hasStaleStorage: false, processing time: 0 msecs, invalidatedBlocks: 0
   [junit4]   2> 1098360 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.ZkTestServer STARTING ZK TEST SERVER
   [junit4]   2> 1098361 INFO  (ZkTestServer Run Thread) [     ] o.a.s.c.ZkTestServer client port:0.0.0.0/0.0.0.0:0
   [junit4]   2> 1098361 INFO  (ZkTestServer Run Thread) [     ] o.a.s.c.ZkTestServer Starting server
   [junit4]   2> 1098461 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.ZkTestServer start zk server on port:42987
   [junit4]   2> 1098461 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.ZkTestServer waitForServerUp: 127.0.0.1:42987
   [junit4]   2> 1098461 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.ZkTestServer parse host and port list: 127.0.0.1:42987
   [junit4]   2> 1098461 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.ZkTestServer connecting to 127.0.0.1 42987
   [junit4]   2> 1098462 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1098463 INFO  (zkConnectionManagerCallback-12563-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1098463 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1098464 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1098465 INFO  (zkConnectionManagerCallback-12565-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1098465 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1098465 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/solrconfig-tlog.xml to /configs/conf1/solrconfig.xml
   [junit4]   2> 1098466 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/schema.xml to /configs/conf1/schema.xml
   [junit4]   2> 1098467 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/solrconfig.snippet.randomindexconfig.xml to /configs/conf1/solrconfig.snippet.randomindexconfig.xml
   [junit4]   2> 1098468 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/stopwords.txt to /configs/conf1/stopwords.txt
   [junit4]   2> 1098469 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/protwords.txt to /configs/conf1/protwords.txt
   [junit4]   2> 1098469 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/currency.xml to /configs/conf1/currency.xml
   [junit4]   2> 1098470 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/enumsConfig.xml to /configs/conf1/enumsConfig.xml
   [junit4]   2> 1098470 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/open-exchange-rates.json to /configs/conf1/open-exchange-rates.json
   [junit4]   2> 1098470 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/mapping-ISOLatin1Accent.txt to /configs/conf1/mapping-ISOLatin1Accent.txt
   [junit4]   2> 1098471 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/old_synonyms.txt to /configs/conf1/old_synonyms.txt
   [junit4]   2> 1098471 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/synonyms.txt to /configs/conf1/synonyms.txt
   [junit4]   2> 1098472 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1098473 INFO  (zkConnectionManagerCallback-12569-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1098473 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1098574 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Will use NRT replicas unless explicitly asked otherwise
   [junit4]   2> 1098652 WARN  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 1098652 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 1098652 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ...
   [junit4]   2> 1098652 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 14+36
   [junit4]   2> 1098652 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 1098652 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 1098652 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 1098653 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@3349044c{/,null,AVAILABLE}
   [junit4]   2> 1098653 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.e.j.s.AbstractConnector Started ServerConnector@4fe0057{ssl, (ssl, alpn, http/1.1, h2)}{127.0.0.1:35133}
   [junit4]   2> 1098653 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.e.j.s.Server Started @1098679ms
   [junit4]   2> 1098653 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/, solr.data.dir=hdfs://localhost.localdomain:41635/hdfs__localhost.localdomain_41635__home_jenkins_workspace_Lucene-Solr-8.x-Linux_solr_build_solr-core_test_J0_temp_solr.index.hdfs.CheckHdfsIndexTest_4F3637E4BDEF750D-001_tempDir-002_control_data, hostPort=35133, coreRootDirectory=/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J0/temp/solr.index.hdfs.CheckHdfsIndexTest_4F3637E4BDEF750D-001/control-001/cores, replicaType=NRT}
   [junit4]   2> 1098654 ERROR (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 1098654 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 1098654 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 8.6.0
   [junit4]   2> 1098654 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 1098654 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr
   [junit4]   2> 1098654 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-04-30T09:43:01.528325Z
   [junit4]   2> 1098654 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1098655 INFO  (zkConnectionManagerCallback-12571-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1098655 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1098756 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 1098756 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.SolrXmlConfig Loading container configuration from /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J0/temp/solr.index.hdfs.CheckHdfsIndexTest_4F3637E4BDEF750D-001/control-001/solr.xml
   [junit4]   2> 1098758 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 1098758 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 1098759 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@7dbcc125, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 1098839 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=false]
   [junit4]   2> 1098840 WARN  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@1e0ea788[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1098840 WARN  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@1e0ea788[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1098841 WARN  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@1247bd7f[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1098841 WARN  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@1247bd7f[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1098842 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:42987/solr
   [junit4]   2> 1098842 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1098843 INFO  (zkConnectionManagerCallback-12582-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1098843 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1098945 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [n:127.0.0.1:35133_     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1098945 INFO  (zkConnectionManagerCallback-12584-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1098945 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [n:127.0.0.1:35133_     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1098974 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [n:127.0.0.1:35133_     ] o.a.s.c.OverseerElectionContext I am going to be the leader 127.0.0.1:35133_
   [junit4]   2> 1098974 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [n:127.0.0.1:35133_     ] o.a.s.c.Overseer Overseer (id=72187002111328261-127.0.0.1:35133_-n_0000000000) starting
   [junit4]   2> 1098976 INFO  (OverseerStateUpdate-72187002111328261-127.0.0.1:35133_-n_0000000000) [n:127.0.0.1:35133_     ] o.a.s.c.Overseer Starting to work on the main queue : 127.0.0.1:35133_
   [junit4]   2> 1098976 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [n:127.0.0.1:35133_     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:35133_
   [junit4]   2> 1098977 INFO  (zkCallback-12583-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 1098977 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [n:127.0.0.1:35133_     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 1098977 WARN  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [n:127.0.0.1:35133_     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info
   [junit4]   2> 1098985 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [n:127.0.0.1:35133_     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 1099000 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [n:127.0.0.1:35133_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@7dbcc125
   [junit4]   2> 1099008 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [n:127.0.0.1:35133_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@7dbcc125
   [junit4]   2> 1099008 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [n:127.0.0.1:35133_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@7dbcc125
   [junit4]   2> 1099009 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [n:127.0.0.1:35133_     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J0/temp/solr.index.hdfs.CheckHdfsIndexTest_4F3637E4BDEF750D-001/control-001/cores
   [junit4]   2> 1099015 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1099015 INFO  (zkConnectionManagerCallback-12601-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1099015 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1099016 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 1099016 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:42987/solr ready
   [junit4]   2> 1099030 INFO  (qtp978044763-23590) [n:127.0.0.1:35133_     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params collection.configName=conf1&name=control_collection&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:35133_&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 1099032 INFO  (OverseerThreadFactory-12591-thread-1-processing-n:127.0.0.1:35133_) [n:127.0.0.1:35133_     ] o.a.s.c.a.c.CreateCollectionCmd Create collection control_collection
   [junit4]   2> 1099141 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 1099143 INFO  (qtp978044763-23594) [n:127.0.0.1:35133_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 1099144 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_    x:control_collection_shard1_replica_n1 ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&collection=control_collection&shard=shard1&wt=javabin&version=2&replicaType=NRT
   [junit4]   2> 1099144 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_    x:control_collection_shard1_replica_n1 ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 1100150 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 8.6.0
   [junit4]   2> 1100167 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.IndexSchema Schema name=test
   [junit4]   2> 1100226 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
   [junit4]   2> 1100234 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.CoreContainer Creating SolrCore 'control_collection_shard1_replica_n1' using configuration from configset conf1, trusted=true
   [junit4]   2> 1100234 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.control_collection.shard1.replica_n1' (registry 'solr.core.control_collection.shard1.replica_n1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@7dbcc125
   [junit4]   2> 1100234 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory solr.hdfs.home=hdfs://localhost.localdomain:41635/solr_hdfs_home
   [junit4]   2> 1100234 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Solr Kerberos Authentication disabled
   [junit4]   2> 1100234 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrCore [[control_collection_shard1_replica_n1] ] Opening new SolrCore at [/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J0/temp/solr.index.hdfs.CheckHdfsIndexTest_4F3637E4BDEF750D-001/control-001/cores/control_collection_shard1_replica_n1], dataDir=[hdfs://localhost.localdomain:41635/solr_hdfs_home/control_collection/core_node2/data/]
   [junit4]   2> 1100235 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:41635/solr_hdfs_home/control_collection/core_node2/data/snapshot_metadata
   [junit4]   2> 1100239 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 1100239 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [4194304] will allocate [1] slabs and use ~[4194304] bytes
   [junit4]   2> 1100239 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Creating new single instance HDFS BlockCache
   [junit4]   2> 1100242 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 1100243 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:41635/solr_hdfs_home/control_collection/core_node2/data
   [junit4]   2> 1100254 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:41635/solr_hdfs_home/control_collection/core_node2/data/index
   [junit4]   2> 1100258 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 1100258 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [4194304] will allocate [1] slabs and use ~[4194304] bytes
   [junit4]   2> 1100258 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Creating new single instance HDFS BlockCache
   [junit4]   2> 1100262 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 1100262 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogDocMergePolicy: [LogDocMergePolicy: minMergeSize=1000, mergeFactor=49, maxMergeSize=9223372036854775807, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=false, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.4104206170339654]
   [junit4]   2> 1100670 WARN  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 1100704 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.HdfsUpdateLog
   [junit4]   2> 1100704 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 1100704 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.HdfsUpdateLog Initializing HdfsUpdateLog: tlogDfsReplication=2
   [junit4]   2> 1100711 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 1100711 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 1100712 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=38, maxMergeAtOnceExplicit=21, maxMergedSegmentMB=40.283203125, floorSegmentMB=0.6416015625, forceMergeDeletesPctAllowed=16.857507177679242, segmentsPerTier=17.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.0, deletesPctAllowed=45.57208994436813]
   [junit4]   2> 1100715 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.SolrIndexSearcher Opening [Searcher@6a94c55a[control_collection_shard1_replica_n1] main]
   [junit4]   2> 1100716 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 1100716 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 1100716 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms
   [junit4]   2> 1100716 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1665390119317667840
   [junit4]   2> 1100718 INFO  (searcherExecutor-12603-thread-1-processing-n:127.0.0.1:35133_ x:control_collection_shard1_replica_n1 c:control_collection s:shard1) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrCore [control_collection_shard1_replica_n1] Registered new searcher Searcher@6a94c55a[control_collection_shard1_replica_n1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 1100718 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ZkShardTerms Successful update of terms at /collections/control_collection/terms/shard1 to Terms{values={core_node2=0}, version=0}
   [junit4]   2> 1100718 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/control_collection/leaders/shard1
   [junit4]   2> 1100719 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 1100719 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 1100719 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync replicas to https://127.0.0.1:35133/control_collection_shard1_replica_n1/
   [junit4]   2> 1100719 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 1100719 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy https://127.0.0.1:35133/control_collection_shard1_replica_n1/ has no replicas
   [junit4]   2> 1100719 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/control_collection/leaders/shard1/leader after winning as /collections/control_collection/leader_elect/shard1/election/72187002111328261-core_node2-n_0000000000
   [junit4]   2> 1100720 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I am the new leader: https://127.0.0.1:35133/control_collection_shard1_replica_n1/ shard1
   [junit4]   2> 1100821 INFO  (zkCallback-12583-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 1100821 INFO  (zkCallback-12583-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 1100821 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 1100822 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf1&newCollection=true&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&collection=control_collection&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1678
   [junit4]   2> 1100823 INFO  (qtp978044763-23590) [n:127.0.0.1:35133_     ] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas
   [junit4]   2> 1100922 INFO  (zkCallback-12583-thread-3) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 1100922 INFO  (zkCallback-12583-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 1100922 INFO  (zkCallback-12583-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 1100922 INFO  (qtp978044763-23590) [n:127.0.0.1:35133_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=control_collection&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:35133_&wt=javabin&version=2} status=0 QTime=1892
   [junit4]   2> 1100923 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Waiting to see 1 active replicas in collection: control_collection
   [junit4]   2> 1101025 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1101026 INFO  (zkConnectionManagerCallback-12612-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1101026 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1101026 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 1101027 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:42987/solr ready
   [junit4]   2> 1101027 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.ChaosMonkey monkey: init - expire sessions:false cause connection loss:false
   [junit4]   2> 1101032 INFO  (OverseerCollectionConfigSetProcessor-72187002111328261-127.0.0.1:35133_-n_0000000000) [n:127.0.0.1:35133_     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000000 doesn't exist. Requestor may have disconnected from ZooKeeper
   [junit4]   2> 1101040 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params collection.configName=conf1&name=collection1&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=&stateFormat=1&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 1101041 INFO  (OverseerThreadFactory-12591-thread-2-processing-n:127.0.0.1:35133_) [n:127.0.0.1:35133_     ] o.a.s.c.a.c.CreateCollectionCmd Create collection collection1
   [junit4]   2> 1101243 WARN  (OverseerThreadFactory-12591-thread-2-processing-n:127.0.0.1:35133_) [n:127.0.0.1:35133_     ] o.a.s.c.a.c.CreateCollectionCmd It is unusual to create a collection (collection1) without cores.
   [junit4]   2> 1101243 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_     ] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas
   [junit4]   2> 1101244 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=collection1&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=&stateFormat=1&wt=javabin&version=2} status=0 QTime=204
   [junit4]   2> 1101244 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 1101244 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 1101244 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 1101244 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 1101244 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 1101244 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 1101244 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Creating jetty instances pullReplicaCount=0 numOtherReplicas=1
   [junit4]   2> 1101322 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[4F3637E4BDEF750D]) [     ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 1 in directory /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J0/temp/solr.index.hdfs.CheckHdfsIndexTest_4F3637E4BDEF750D-001/shard-1-001 of type NRT for shard1
   [junit4]   2> 1101324 WARN  (closeThreadPool-12613-thread-1) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 1101324 INFO  (closeThreadPool-12613-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 1101324 INFO  (closeThreadPool-12613-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ...
   [junit4]   2> 1101324 INFO  (closeThreadPool-12613-thread-1) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 14+36
   [junit4]   2> 1101325 INFO  (closeThreadPool-12613-thread-1) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 1101325 INFO  (closeThreadPool-12613-thread-1) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 1101325 INFO  (closeThreadPool-12613-thread-1) [     ] o.e.j.s.session node0 Scavenging every 600000ms
   [junit4]   2> 1101325 INFO  (closeThreadPool-12613-thread-1) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@19edf36f{/,null,AVAILABLE}
   [junit4]   2> 1101329 INFO  (closeThreadPool-12613-thread-1) [     ] o.e.j.s.AbstractConnector Started ServerConnector@3aa71db3{ssl, (ssl, alpn, http/1.1, h2)}{127.0.0.1:43121}
   [junit4]   2> 1101329 INFO  (closeThreadPool-12613-thread-1) [     ] o.e.j.s.Server Started @1101355ms
   [junit4]   2> 1101329 INFO  (closeThreadPool-12613-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/, solrconfig=solrconfig.xml, solr.data.dir=hdfs://localhost.localdomain:41635/hdfs__localhost.localdomain_41635__home_jenkins_workspace_Lucene-Solr-8.x-Linux_solr_build_solr-core_test_J0_temp_solr.index.hdfs.CheckHdfsIndexTest_4F3637E4BDEF750D-001_tempDir-002_jetty1, hostPort=43121, coreRootDirectory=/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J0/temp/solr.index.hdfs.CheckHdfsIndexTest_4F3637E4BDEF750D-001/shard-1-001/cores}
   [junit4]   2> 1101329 ERROR (closeThreadPool-12613-thread-1) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 1101329 INFO  (closeThreadPool-12613-thread-1) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 1101329 INFO  (closeThreadPool-12613-thread-1) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 8.6.0
   [junit4]   2> 1101329 INFO  (closeThreadPool-12613-thread-1) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 1101329 INFO  (closeThreadPool-12613-thread-1) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr
   [junit4]   2> 1101329 INFO  (closeThreadPool-12613-thread-1) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-04-30T09:43:04.203834Z
   [junit4]   2> 1101330 INFO  (closeThreadPool-12613-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1101331 INFO  (zkConnectionManagerCallback-12615-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1101331 INFO  (closeThreadPool-12613-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1101431 INFO  (closeThreadPool-12613-thread-1) [     ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 1101431 INFO  (closeThreadPool-12613-thread-1) [     ] o.a.s.c.SolrXmlConfig Loading container configuration from /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J0/temp/solr.index.hdfs.CheckHdfsIndexTest_4F3637E4BDEF750D-001/shard-1-001/solr.xml
   [junit4]   2> 1101433 INFO  (closeThreadPool-12613-thread-1) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 1101433 INFO  (closeThreadPool-12613-thread-1) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 1101434 INFO  (closeThreadPool-12613-thread-1) [     ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@7dbcc125, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 1101531 INFO  (closeThreadPool-12613-thread-1) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=false]
   [junit4]   2> 1101532 WARN  (closeThreadPool-12613-thread-1) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@52d133d6[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1101532 WARN  (closeThreadPool-12613-thread-1) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@52d133d6[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1101533 WARN  (closeThreadPool-12613-thread-1) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@5704b102[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1101533 WARN  (closeThreadPool-12613-thread-1) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@5704b102[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1101534 INFO  (closeThreadPool-12613-thread-1) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:42987/solr
   [junit4]   2> 1101534 INFO  (closeThreadPool-12613-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1101535 INFO  (zkConnectionManagerCallback-12626-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1101535 INFO  (closeThreadPool-12613-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1101636 INFO  (closeThreadPool-12613-thread-1) [n:127.0.0.1:43121_     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1101637 INFO  (zkConnectionManagerCallback-12628-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1101637 INFO  (closeThreadPool-12613-thread-1) [n:127.0.0.1:43121_     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1101638 INFO  (closeThreadPool-12613-thread-1) [n:127.0.0.1:43121_     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 1101639 INFO  (closeThreadPool-12613-thread-1) [n:127.0.0.1:43121_     ] o.a.s.c.ZkController Publish node=127.0.0.1:43121_ as DOWN
   [junit4]   2> 1101639 INFO  (closeThreadPool-12613-thread-1) [n:127.0.0.1:43121_     ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 1101639 INFO  (closeThreadPool-12613-thread-1) [n:127.0.0.1:43121_     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:43121_
   [junit4]   2> 1101640 INFO  (zkCallback-12583-thread-2) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 1101640 INFO  (zkCallback-12611-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 1101640 INFO  (zkCallback-12627-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 1101640 INFO  (closeThreadPool-12613-thread-1) [n:127.0.0.1:43121_     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 1101640 WARN  (closeThreadPool-12613-thread-1) [n:127.0.0.1:43121_     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info. Solr is only as secure as you make it.
   [junit4]   2> 1101647 INFO  (closeThreadPool-12613-thread-1) [n:127.0.0.1:43121_     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 1101657 INFO  (closeThreadPool-12613-thread-1) [n:127.0.0.1:43121_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@7dbcc125
   [junit4]   2> 1101672 INFO  (closeThreadPool-12613-thread-1) [n:127.0.0.1:43121_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@7dbcc125
   [junit4]   2> 1101672 INFO  (closeThreadPool-12613-thread-1) [n:127.0.0.1:43121_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@7dbcc125
   [junit4]   2> 1101672 INFO  (closeThreadPool-12613-thread-1) [n:127.0.0.1:43121_     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J0/temp/solr.index.hdfs.CheckHdfsIndexTest_4F3637E4BDEF750D-001/shard-1-001/cores
   [junit4]   2> 1101676 INFO  (closeThreadPool-12613-thread-1) [     ] o.a.s.c.AbstractFullDistribZkTestBase waitForLiveNode: 127.0.0.1:43121_
   [junit4]   2> 1101678 INFO  (qtp978044763-23590) [n:127.0.0.1:35133_     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :addreplica with params node=127.0.0.1:43121_&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 1101679 INFO  (OverseerCollectionConfigSetProcessor-72187002111328261-127.0.0.1:35133_-n_0000000000) [n:127.0.0.1:35133_     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000002 doesn't exist. Requestor may have disconnected from ZooKeeper
   [junit4]   2> 1101681 INFO  (qtp978044763-23594) [n:127.0.0.1:35133_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={wt=javabin&version=2&key=solr.core.control_collection.shard1.replica_n1:INDEX.sizeInBytes} status=0 QTime=1
   [junit4]   2> 1101682 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 1101693 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 1101695 INFO  (qtp978044763-23594) [n:127.0.0.1:35133_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={wt=javabin&version=2&key=solr.core.control_collection.shard1.replica_n1:INDEX.sizeInBytes} status=0 QTime=1
   [junit4]   2> 1101696 INFO  (qtp978044763-23592) [n:127.0.0.1:35133_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 1101697 INFO  (qtp406408057-23653) [n:127.0.0.1:43121_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 1101698 INFO  (OverseerThreadFactory-12591-thread-3-processing-n:127.0.0.1:35133_) [n:127.0.0.1:35133_ c:collection1 s:shard1   ] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:43121_ for creating new replica of shard shard1 for collection collection1
   [junit4]   2> 1101698 INFO  (OverseerThreadFactory-12591-thread-3-processing-n:127.0.0.1:35133_) [n:127.0.0.1:35133_ c:collection1 s:shard1   ] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command.
   [junit4]   2> 1101715 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_    x:collection1_shard1_replica_n1 ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&name=collection1_shard1_replica_n1&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT
   [junit4]   2> 1102721 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 8.6.0
   [junit4]   2> 1102734 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.IndexSchema Schema name=test
   [junit4]   2> 1102810 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
   [junit4]   2> 1102816 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.CoreContainer Creating SolrCore 'collection1_shard1_replica_n1' using configuration from configset conf1, trusted=true
   [junit4]   2> 1102816 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.collection1.shard1.replica_n1' (registry 'solr.core.collection1.shard1.replica_n1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@7dbcc125
   [junit4]   2> 1102816 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory solr.hdfs.home=hdfs://localhost.localdomain:41635/solr_hdfs_home
   [junit4]   2> 1102816 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Solr Kerberos Authentication disabled
   [junit4]   2> 1102816 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrCore [[collection1_shard1_replica_n1] ] Opening new SolrCore at [/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J0/temp/solr.index.hdfs.CheckHdfsIndexTest_4F3637E4BDEF750D-001/shard-1-001/cores/collection1_shard1_replica_n1], dataDir=[hdfs://localhost.localdomain:41635/solr_hdfs_home/collection1/core_node2/data/]
   [junit4]   2> 1102816 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:41635/solr_hdfs_home/collection1/core_node2/data/snapshot_metadata
   [junit4]   2> 1102821 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 1102821 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [4194304] will allocate [1] slabs and use ~[4194304] bytes
   [junit4]   2> 1102821 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Creating new single instance HDFS BlockCache
   [junit4]   2> 1102823 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 1102823 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:41635/solr_hdfs_home/collection1/core_node2/data
   [junit4]   2> 1102833 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:41635/solr_hdfs_home/collection1/core_node2/data/index
   [junit4]   2> 1102838 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 1102838 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [4194304] will allocate [1] slabs and use ~[4194304] bytes
   [junit4]   2> 1102838 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Creating new single instance HDFS BlockCache
   [junit4]   2> 1102840 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 1102840 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogDocMergePolicy: [LogDocMergePolicy: minMergeSize=1000, mergeFactor=49, maxMergeSize=9223372036854775807, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=false, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.4104206170339654]
   [junit4]   2> 1102847 WARN  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 1102874 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.HdfsUpdateLog
   [junit4]   2> 1102874 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 1102874 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.HdfsUpdateLog Initializing HdfsUpdateLog: tlogDfsReplication=2
   [junit4]   2> 1102883 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 1102883 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 1102884 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=38, maxMergeAtOnceExplicit=21, maxMergedSegmentMB=40.283203125, floorSegmentMB=0.6416015625, forceMergeDeletesPctAllowed=16.857507177679242, segmentsPerTier=17.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.0, deletesPctAllowed=45.57208994436813]
   [junit4]   2> 1102886 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.SolrIndexSearcher Opening [Searcher@1b86ed56[collection1_shard1_replica_n1] main]
   [junit4]   2> 1102887 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 1102887 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 1102887 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms
   [junit4]   2> 1102888 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1665390121595174912
   [junit4]   2> 1102889 INFO  (searcherExecutor-12639-thread-1-processing-n:127.0.0.1:43121_ x:collection1_shard1_replica_n1 c:collection1 s:shard1) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrCore [collection1_shard1_replica_n1] Registered new searcher Searcher@1b86ed56[collection1_shard1_replica_n1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 1102890 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection1/terms/shard1 to Terms{values={core_node2=0}, version=0}
   [junit4]   2> 1102890 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/collection1/leaders/shard1
   [junit4]   2> 1102890 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 1102890 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 1102890 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync replicas to https://127.0.0.1:43121/collection1_shard1_replica_n1/
   [junit4]   2> 1102890 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 1102890 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SyncStrategy https://127.0.0.1:43121/collection1_shard1_replica_n1/ has no replicas
   [junit4]   2> 1102890 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/collection1/leaders/shard1/leader after winning as /collections/collection1/leader_elect/shard1/election/72187002111328266-core_node2-n_0000000000
   [junit4]   2> 1102891 INFO  (qtp406408057-23651) [n:127.0.0.1:43121_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I am the new leader: https://1

[...truncated too long message...]

HdfsIndexTest.java:71) ~[test/:?]
   [junit4]   2> 	at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
   [junit4]   2> 	at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:?]
   [junit4]   2> 	at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
   [junit4]   2> 	at java.lang.reflect.Method.invoke(Method.java:564) ~[?:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:901) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) ~[java/:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41) ~[java/:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) ~[java/:?]
   [junit4]   2> 	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) ~[java/:?]
   [junit4]   2> 	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) ~[java/:?]
   [junit4]   2> 	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54) ~[java/:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at java.lang.Thread.run(Thread.java:832) [?:?]
   [junit4]   2> 183377 WARN  (SUITE-CheckHdfsIndexTest-seed#[4F3637E4BDEF750D]-worker) [     ] o.a.h.h.s.d.DirectoryScanner DirectoryScanner: shutdown has been called
   [junit4]   2> 183394 INFO  (SUITE-CheckHdfsIndexTest-seed#[4F3637E4BDEF750D]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.w.WebAppContext@393428c5{datanode,/,null,UNAVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/datanode}
   [junit4]   2> 183394 INFO  (SUITE-CheckHdfsIndexTest-seed#[4F3637E4BDEF750D]-worker) [     ] o.e.j.s.AbstractConnector Stopped ServerConnector@6f122141{HTTP/1.1, (http/1.1)}{localhost:0}
   [junit4]   2> 183394 INFO  (SUITE-CheckHdfsIndexTest-seed#[4F3637E4BDEF750D]-worker) [     ] o.e.j.s.session node0 Stopped scavenging
   [junit4]   2> 183394 INFO  (SUITE-CheckHdfsIndexTest-seed#[4F3637E4BDEF750D]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@4d2dee6c{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,UNAVAILABLE}
   [junit4]   2> 183395 WARN  (BP-966566496-127.0.0.1-1588240416127 heartbeating to localhost.localdomain/127.0.0.1:34933) [     ] o.a.h.h.s.d.IncrementalBlockReportManager IncrementalBlockReportManager interrupted
   [junit4]   2> 183395 WARN  (BP-966566496-127.0.0.1-1588240416127 heartbeating to localhost.localdomain/127.0.0.1:34933) [     ] o.a.h.h.s.d.DataNode Ending block pool service for: Block pool BP-966566496-127.0.0.1-1588240416127 (Datanode Uuid fa36b63f-087d-4d54-90f7-1efd6a61812e) service to localhost.localdomain/127.0.0.1:34933
   [junit4]   2> 183416 INFO  (SUITE-CheckHdfsIndexTest-seed#[4F3637E4BDEF750D]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.w.WebAppContext@5c259364{hdfs,/,null,UNAVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/hdfs}
   [junit4]   2> 183416 INFO  (SUITE-CheckHdfsIndexTest-seed#[4F3637E4BDEF750D]-worker) [     ] o.e.j.s.AbstractConnector Stopped ServerConnector@8fefc7a{HTTP/1.1, (http/1.1)}{localhost.localdomain:0}
   [junit4]   2> 183416 INFO  (SUITE-CheckHdfsIndexTest-seed#[4F3637E4BDEF750D]-worker) [     ] o.e.j.s.session node0 Stopped scavenging
   [junit4]   2> 183417 INFO  (SUITE-CheckHdfsIndexTest-seed#[4F3637E4BDEF750D]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@7972613{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,UNAVAILABLE}
   [junit4]   2> NOTE: leaving temporary files on disk at: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J0/temp/solr.index.hdfs.CheckHdfsIndexTest_4F3637E4BDEF750D-001
   [junit4]   2> Apr 30, 2020 9:56:36 AM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks
   [junit4]   2> WARNING: Will linger awaiting termination of 65 leaked thread(s).
   [junit4]   2> NOTE: test params are: codec=Asserting(Lucene84): {date=BlockTreeOrds(blocksize=128), rnd_b=TestBloomFilteredLucenePostings(BloomFilteringPostingsFormat(Lucene84)), field=BlockTreeOrds(blocksize=128), docid=TestBloomFilteredLucenePostings(BloomFilteringPostingsFormat(Lucene84)), multiDefault=BlockTreeOrds(blocksize=128), _root_=FST50, titleTokenized=PostingsFormat(name=MockRandom), id=TestBloomFilteredLucenePostings(BloomFilteringPostingsFormat(Lucene84)), body=BlockTreeOrds(blocksize=128), title=FST50}, docValues:{range_facet_l_dv=DocValuesFormat(name=Lucene80), n_l1=DocValuesFormat(name=Lucene80), intDefault=DocValuesFormat(name=Asserting), n_dt1=DocValuesFormat(name=Asserting), n_td1=DocValuesFormat(name=Asserting), n_d1=DocValuesFormat(name=Lucene80), range_facet_l=DocValuesFormat(name=Lucene80), n_f1=DocValuesFormat(name=Asserting), n_ti1=DocValuesFormat(name=Lucene80), docid_intDV=DocValuesFormat(name=Direct), n_tl1=DocValuesFormat(name=Asserting), _version_=DocValuesFormat(name=Asserting), n_tf1=DocValuesFormat(name=Lucene80), n_tdt1=DocValuesFormat(name=Asserting), id_i1=DocValuesFormat(name=Asserting), range_facet_i_dv=DocValuesFormat(name=Lucene80), intDvoDefault=DocValuesFormat(name=Lucene80), titleDV=DocValuesFormat(name=Asserting), timestamp=DocValuesFormat(name=Lucene80)}, maxPointsInLeafNode=52, maxMBSortInHeap=5.8905437800842995, sim=Asserting(org.apache.lucene.search.similarities.AssertingSimilarity@5d64bc3c), locale=ln-CD, timezone=Asia/Pontianak
   [junit4]   2> NOTE: Linux 5.3.0-46-generic amd64/AdoptOpenJDK 14 (64-bit)/cpus=16,threads=10,free=305082528,total=518979584
   [junit4]   2> NOTE: All tests run in this JVM: [CheckHdfsIndexTest]
   [junit4] Completed [5/5 (5!)] on J0 in 185.76s, 5 tests, 1 error, 1 skipped <<< FAILURES!

[...truncated 17 lines...]
BUILD FAILED
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/common-build.xml:1599: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/common-build.xml:1126: There were test failures: 5 suites, 25 tests, 5 errors, 5 ignored [seed: 4F3637E4BDEF750D]

Total time: 3 minutes 8 seconds

[repro] Setting last failure code to 256

[repro] Failures w/original seeds:
[repro]   5/5 failed: org.apache.solr.index.hdfs.CheckHdfsIndexTest
[repro] Exiting with code 256
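
The [repro] lines above report that all five reruns with the original seed failed. Those reruns are driven by the seed printed in the failure summary; purely as a sketch of the usual invocation (property names per lucene/common-build.xml; the module directory is an assumption), rerunning this suite by hand looks like:

```shell
# Re-run the failing suite with the reported master seed so the
# randomized test framework reproduces the same codec/locale/params.
cd solr/core
ant test -Dtestcase=CheckHdfsIndexTest -Dtests.seed=4F3637E4BDEF750D
```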
+ mv lucene/build lucene/build.repro
+ mv solr/build solr/build.repro
+ mv lucene/build.orig lucene/build
+ mv solr/build.orig solr/build
Archiving artifacts
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
[WARNINGS] Parsing warnings in console log with parser Java Compiler (javac)
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
No credentials specified
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
<Git Blamer> Using GitBlamer to create author and commit information for all warnings.
<Git Blamer> GIT_COMMIT=92a7d56ada9ff3213aad2e238868b00dc7f9be06, workspace=/home/jenkins/workspace/Lucene-Solr-8.x-Linux
[WARNINGS] Computing warning deltas based on reference build #2915
Recording test results
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Build step 'Publish JUnit test result report' changed build result to UNSTABLE
Email was triggered for: Unstable (Test Failures)
Sending email for trigger: Unstable (Test Failures)
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2

[JENKINS] Lucene-Solr-8.x-Linux (64bit/jdk1.8.0_201) - Build # 2918 - Still Failing!

Posted by Policeman Jenkins Server <je...@thetaphi.de>.
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-8.x-Linux/2918/
Java: 64bit/jdk1.8.0_201 -XX:+UseCompressedOops -XX:+UseConcMarkSweepGC

1 tests failed.
FAILED:  org.apache.solr.index.hdfs.CheckHdfsIndexTest.doTest

Error Message:
Error from server at https://127.0.0.1:40045/collection1: java.lang.NullPointerException

Stack Trace:
org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at https://127.0.0.1:40045/collection1: java.lang.NullPointerException
	at org.apache.solr.handler.admin.SystemInfoHandler.getSecurityInfo(SystemInfoHandler.java:326)
	at org.apache.solr.handler.admin.SystemInfoHandler.handleRequestBody(SystemInfoHandler.java:146)
	at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:211)
	at org.apache.solr.core.SolrCore.execute(SolrCore.java:2600)
	at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:803)
	at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:582)
	at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:432)
	at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:362)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)
	at org.apache.solr.client.solrj.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:166)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:545)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1610)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1300)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1580)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1215)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:322)
	at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:767)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:500)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:547)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:273)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
	at org.eclipse.jetty.io.ssl.SslConnection$DecryptedEndPoint.onFillable(SslConnection.java:543)
	at org.eclipse.jetty.io.ssl.SslConnection.onFillable(SslConnection.java:398)
	at org.eclipse.jetty.io.ssl.SslConnection$2.succeeded(SslConnection.java:161)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
	at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938)
	at java.lang.Thread.run(Thread.java:748)

	at __randomizedtesting.SeedInfo.seed([FF5368A94D977D5C:5817D00D202C6EE5]:0)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:665)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:265)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:248)
	at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:211)
	at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:1003)
	at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:1018)
	at org.apache.solr.index.hdfs.CheckHdfsIndexTest.doTest(CheckHdfsIndexTest.java:120)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988)
	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1081)
	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:1053)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
	at java.lang.Thread.run(Thread.java:748)
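
The NullPointerException surfaces in SystemInfoHandler.getSecurityInfo (SystemInfoHandler.java:326) while the test queries an unsecured cluster, which suggests the handler dereferences some security-related state that is null when no authentication is configured. The actual Solr source is not reproduced in this log; purely as an illustrative sketch (the class, field, and map keys below are hypothetical stand-ins, not the real 8.x code), the defensive pattern that avoids this class of NPE looks like:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical stand-in for the handler; NOT the actual Solr source.
public class SecurityInfoSketch {

    // Suspected-null state: no authentication plugin is configured in
    // the failing test cluster, so this reference is null.
    static Object authenticationPlugin = null;

    // Guarded variant: check for null before dereferencing, and report
    // an empty security section instead of throwing an NPE.
    static Map<String, Object> getSecurityInfo() {
        Map<String, Object> info = new LinkedHashMap<>();
        if (authenticationPlugin != null) {
            info.put("authenticationPlugin",
                     authenticationPlugin.getClass().getName());
        }
        return info;
    }

    public static void main(String[] args) {
        // With no plugin configured, the guarded call returns an empty map.
        System.out.println(getSecurityInfo().isEmpty()); // prints "true"
    }
}
```

Whether line 326 actually dereferences the authentication plugin, the request context, or something else would need the 8.x source to confirm; the sketch only shows the shape of the guard.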




Build Log:
[...truncated 14888 lines...]
   [junit4] Suite: org.apache.solr.index.hdfs.CheckHdfsIndexTest
   [junit4]   2> 575727 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.a.s.SolrTestCase Setting 'solr.default.confdir' system property to test-framework derived value of '/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/server/solr/configsets/_default/conf'
   [junit4]   2> 575727 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.a.s.SolrTestCaseJ4 SecureRandom sanity checks: test.solr.allowed.securerandom=null & java.security.egd=file:/dev/./urandom
   [junit4]   2> 575727 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.a.s.SolrTestCaseJ4 Created dataDir: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_FF5368A94D977D5C-001/data-dir-64-001
   [junit4]   2> 575728 WARN  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.a.s.SolrTestCaseJ4 startTrackingSearchers: numOpens=1 numCloses=1
   [junit4]   2> 575728 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.a.s.SolrTestCaseJ4 Using PointFields (NUMERIC_POINTS_SYSPROP=true) w/NUMERIC_DOCVALUES_SYSPROP=false
   [junit4]   2> 575729 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.a.s.SolrTestCaseJ4 Randomized ssl (true) and clientAuth (false) via: @org.apache.solr.util.RandomizeSSL(reason=, value=NaN, ssl=NaN, clientAuth=NaN)
   [junit4]   2> 575729 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.a.s.BaseDistributedSearchTestCase Setting hostContext system property: /
   [junit4]   1> Formatting using clusterid: testClusterID
   [junit4]   2> 576568 WARN  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
   [junit4]   2> 576591 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 1.8.0_201-b09
   [junit4]   2> 576600 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 576600 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 576600 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.e.j.s.session node0 Scavenging every 600000ms
   [junit4]   2> 576602 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@4efed1ef{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,AVAILABLE}
   [junit4]   2> 576774 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.w.WebAppContext@c4e54ac{hdfs,/,file:///home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/jetty-localhost_localdomain-37331-hadoop-hdfs-3_2_0-tests_jar-_-any-5521039022114509602.dir/webapp/,AVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/hdfs}
   [junit4]   2> 576775 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.e.j.s.AbstractConnector Started ServerConnector@7b3f6488{HTTP/1.1, (http/1.1)}{localhost.localdomain:37331}
   [junit4]   2> 576775 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.e.j.s.Server Started @576802ms
   [junit4]   2> 577263 WARN  (StorageLocationChecker thread 0) [     ] o.a.h.u.NativeCodeLoader Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
   [junit4]   2> 577305 WARN  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
   [junit4]   2> 577309 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 1.8.0_201-b09
   [junit4]   2> 577309 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 577309 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 577309 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 577309 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@1ac6dc6c{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,AVAILABLE}
   [junit4]   2> 577388 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.w.WebAppContext@3e27d41{datanode,/,file:///home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/jetty-localhost-42487-hadoop-hdfs-3_2_0-tests_jar-_-any-218875712723310697.dir/webapp/,AVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/datanode}
   [junit4]   2> 577389 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.e.j.s.AbstractConnector Started ServerConnector@304200a2{HTTP/1.1, (http/1.1)}{localhost:42487}
   [junit4]   2> 577389 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.e.j.s.Server Started @577415ms
   [junit4]   2> 577639 WARN  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
   [junit4]   2> 577640 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 1.8.0_201-b09
   [junit4]   2> 577641 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 577641 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 577641 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 577642 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@295d63cd{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,AVAILABLE}
   [junit4]   2> 577738 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.w.WebAppContext@5f4222ad{datanode,/,file:///home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/jetty-localhost-45593-hadoop-hdfs-3_2_0-tests_jar-_-any-4476988760293553139.dir/webapp/,AVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/datanode}
   [junit4]   2> 577739 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.e.j.s.AbstractConnector Started ServerConnector@647073ae{HTTP/1.1, (http/1.1)}{localhost:45593}
   [junit4]   2> 577739 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.e.j.s.Server Started @577765ms
   [junit4]   2> 578086 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0x1f3bc7f6ce279616: Processing first storage report for DS-6692af67-e3ac-440d-941b-726548dcd1f1 from datanode 6d3f1028-7ab3-473d-9580-36325881f3b1
   [junit4]   2> 578097 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0x1f3bc7f6ce279616: from storage DS-6692af67-e3ac-440d-941b-726548dcd1f1 node DatanodeRegistration(127.0.0.1:36667, datanodeUuid=6d3f1028-7ab3-473d-9580-36325881f3b1, infoPort=33355, infoSecurePort=0, ipcPort=43245, storageInfo=lv=-57;cid=testClusterID;nsid=1281727678;c=1588232268060), blocks: 0, hasStaleStorage: true, processing time: 11 msecs, invalidatedBlocks: 0
   [junit4]   2> 578097 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0x80b4d3c1329973a4: Processing first storage report for DS-1fa2a9c1-270a-4f34-99de-eb13f6852dee from datanode 8edc898e-138c-447a-8560-e2a54eb67b52
   [junit4]   2> 578097 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0x80b4d3c1329973a4: from storage DS-1fa2a9c1-270a-4f34-99de-eb13f6852dee node DatanodeRegistration(127.0.0.1:44753, datanodeUuid=8edc898e-138c-447a-8560-e2a54eb67b52, infoPort=45623, infoSecurePort=0, ipcPort=34765, storageInfo=lv=-57;cid=testClusterID;nsid=1281727678;c=1588232268060), blocks: 0, hasStaleStorage: true, processing time: 0 msecs, invalidatedBlocks: 0
   [junit4]   2> 578097 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0x1f3bc7f6ce279616: Processing first storage report for DS-63a502d7-a3d5-4a24-825c-2886ede5f4c6 from datanode 6d3f1028-7ab3-473d-9580-36325881f3b1
   [junit4]   2> 578097 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0x1f3bc7f6ce279616: from storage DS-63a502d7-a3d5-4a24-825c-2886ede5f4c6 node DatanodeRegistration(127.0.0.1:36667, datanodeUuid=6d3f1028-7ab3-473d-9580-36325881f3b1, infoPort=33355, infoSecurePort=0, ipcPort=43245, storageInfo=lv=-57;cid=testClusterID;nsid=1281727678;c=1588232268060), blocks: 0, hasStaleStorage: false, processing time: 0 msecs, invalidatedBlocks: 0
   [junit4]   2> 578097 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0x80b4d3c1329973a4: Processing first storage report for DS-bc711e8d-be1e-47e5-bcdb-71521a6e0e0a from datanode 8edc898e-138c-447a-8560-e2a54eb67b52
   [junit4]   2> 578097 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0x80b4d3c1329973a4: from storage DS-bc711e8d-be1e-47e5-bcdb-71521a6e0e0a node DatanodeRegistration(127.0.0.1:44753, datanodeUuid=8edc898e-138c-447a-8560-e2a54eb67b52, infoPort=45623, infoSecurePort=0, ipcPort=34765, storageInfo=lv=-57;cid=testClusterID;nsid=1281727678;c=1588232268060), blocks: 0, hasStaleStorage: false, processing time: 0 msecs, invalidatedBlocks: 0
   [junit4]   2> 578290 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.ZkTestServer STARTING ZK TEST SERVER
   [junit4]   2> 578290 INFO  (ZkTestServer Run Thread) [     ] o.a.s.c.ZkTestServer client port:0.0.0.0/0.0.0.0:0
   [junit4]   2> 578290 INFO  (ZkTestServer Run Thread) [     ] o.a.s.c.ZkTestServer Starting server
   [junit4]   2> 578390 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.ZkTestServer start zk server on port:45667
   [junit4]   2> 578390 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.ZkTestServer waitForServerUp: 127.0.0.1:45667
   [junit4]   2> 578390 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.ZkTestServer parse host and port list: 127.0.0.1:45667
   [junit4]   2> 578390 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.ZkTestServer connecting to 127.0.0.1 45667
   [junit4]   2> 578391 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 578392 INFO  (zkConnectionManagerCallback-9416-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 578392 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 578394 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 578396 INFO  (zkConnectionManagerCallback-9418-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 578396 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 578398 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/solrconfig-tlog.xml to /configs/conf1/solrconfig.xml
   [junit4]   2> 578400 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/schema.xml to /configs/conf1/schema.xml
   [junit4]   2> 578401 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/solrconfig.snippet.randomindexconfig.xml to /configs/conf1/solrconfig.snippet.randomindexconfig.xml
   [junit4]   2> 578402 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/stopwords.txt to /configs/conf1/stopwords.txt
   [junit4]   2> 578403 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/protwords.txt to /configs/conf1/protwords.txt
   [junit4]   2> 578403 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/currency.xml to /configs/conf1/currency.xml
   [junit4]   2> 578409 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/enumsConfig.xml to /configs/conf1/enumsConfig.xml
   [junit4]   2> 578410 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/open-exchange-rates.json to /configs/conf1/open-exchange-rates.json
   [junit4]   2> 578410 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/mapping-ISOLatin1Accent.txt to /configs/conf1/mapping-ISOLatin1Accent.txt
   [junit4]   2> 578412 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/old_synonyms.txt to /configs/conf1/old_synonyms.txt
   [junit4]   2> 578413 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/synonyms.txt to /configs/conf1/synonyms.txt
   [junit4]   2> 578417 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 578417 INFO  (zkConnectionManagerCallback-9422-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 578417 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 578519 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Will use NRT replicas unless explicitly asked otherwise
   [junit4]   2> 578579 WARN  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 578579 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 578579 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ...
   [junit4]   2> 578579 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 1.8.0_201-b09
   [junit4]   2> 578580 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 578580 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 578580 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.e.j.s.session node0 Scavenging every 600000ms
   [junit4]   2> 578582 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@610b882f{/,null,AVAILABLE}
   [junit4]   2> 578586 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.e.j.s.AbstractConnector Started ServerConnector@98104d6{SSL, (ssl, http/1.1)}{127.0.0.1:37113}
   [junit4]   2> 578586 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.e.j.s.Server Started @578612ms
   [junit4]   2> 578586 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {solr.data.dir=hdfs://localhost.localdomain:39571/hdfs__localhost.localdomain_39571__home_jenkins_workspace_Lucene-Solr-8.x-Linux_solr_build_solr-core_test_J3_temp_solr.index.hdfs.CheckHdfsIndexTest_FF5368A94D977D5C-001_tempDir-002_control_data, hostContext=/, hostPort=37113, coreRootDirectory=/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_FF5368A94D977D5C-001/control-001/cores}
   [junit4]   2> 578586 ERROR (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 578586 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 578586 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 8.6.0
   [junit4]   2> 578586 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 578586 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr
   [junit4]   2> 578586 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-04-30T07:37:50.363Z
   [junit4]   2> 578587 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 578588 INFO  (zkConnectionManagerCallback-9424-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 578588 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 578689 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 578689 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.SolrXmlConfig Loading container configuration from /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_FF5368A94D977D5C-001/control-001/solr.xml
   [junit4]   2> 578691 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 578691 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 578692 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@735fe358, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 578949 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=false]
   [junit4]   2> 578949 WARN  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.s.i.Http2SolrClient Create Http2SolrClient with HTTP/1.1 transport since Java 8 and lower do not support SSL + HTTP/2
   [junit4]   2> 578950 WARN  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@5c110041[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 578950 WARN  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@5c110041[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 578951 WARN  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.s.i.Http2SolrClient Create Http2SolrClient with HTTP/1.1 transport since Java 8 and lower do not support SSL + HTTP/2
   [junit4]   2> 578957 WARN  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@5d486863[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 578957 WARN  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@5d486863[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 578957 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:45667/solr
   [junit4]   2> 578958 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 578959 INFO  (zkConnectionManagerCallback-9435-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 578959 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 579060 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [n:127.0.0.1:37113_     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 579061 INFO  (zkConnectionManagerCallback-9437-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 579061 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [n:127.0.0.1:37113_     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 579098 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [n:127.0.0.1:37113_     ] o.a.s.c.OverseerElectionContext I am going to be the leader 127.0.0.1:37113_
   [junit4]   2> 579098 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [n:127.0.0.1:37113_     ] o.a.s.c.Overseer Overseer (id=72186509859553285-127.0.0.1:37113_-n_0000000000) starting
   [junit4]   2> 579101 INFO  (OverseerStateUpdate-72186509859553285-127.0.0.1:37113_-n_0000000000) [n:127.0.0.1:37113_     ] o.a.s.c.Overseer Starting to work on the main queue : 127.0.0.1:37113_
   [junit4]   2> 579101 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [n:127.0.0.1:37113_     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:37113_
   [junit4]   2> 579103 INFO  (zkCallback-9436-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 579103 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [n:127.0.0.1:37113_     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 579104 WARN  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [n:127.0.0.1:37113_     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info
   [junit4]   2> 579116 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [n:127.0.0.1:37113_     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 579131 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [n:127.0.0.1:37113_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@735fe358
   [junit4]   2> 579138 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [n:127.0.0.1:37113_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@735fe358
   [junit4]   2> 579138 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [n:127.0.0.1:37113_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@735fe358
   [junit4]   2> 579139 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [n:127.0.0.1:37113_     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_FF5368A94D977D5C-001/control-001/cores
   [junit4]   2> 579153 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 579154 INFO  (zkConnectionManagerCallback-9454-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 579154 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 579155 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 579156 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:45667/solr ready
   [junit4]   2> 579173 INFO  (qtp1106830107-17676) [n:127.0.0.1:37113_     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params collection.configName=conf1&name=control_collection&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:37113_&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 579176 INFO  (OverseerThreadFactory-9444-thread-1-processing-n:127.0.0.1:37113_) [n:127.0.0.1:37113_     ] o.a.s.c.a.c.CreateCollectionCmd Create collection control_collection
   [junit4]   2> 579285 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 579287 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 579290 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_    x:control_collection_shard1_replica_n1 ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&collection=control_collection&shard=shard1&wt=javabin&version=2&replicaType=NRT
   [junit4]   2> 579290 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_    x:control_collection_shard1_replica_n1 ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 580314 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 8.6.0
   [junit4]   2> 580344 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.IndexSchema Schema name=test
   [junit4]   2> 580429 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
   [junit4]   2> 580454 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.CoreContainer Creating SolrCore 'control_collection_shard1_replica_n1' using configuration from configset conf1, trusted=true
   [junit4]   2> 580454 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.control_collection.shard1.replica_n1' (registry 'solr.core.control_collection.shard1.replica_n1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@735fe358
   [junit4]   2> 580459 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory solr.hdfs.home=hdfs://localhost.localdomain:39571/solr_hdfs_home
   [junit4]   2> 580459 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Solr Kerberos Authentication disabled
   [junit4]   2> 580460 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrCore [[control_collection_shard1_replica_n1] ] Opening new SolrCore at [/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_FF5368A94D977D5C-001/control-001/cores/control_collection_shard1_replica_n1], dataDir=[hdfs://localhost.localdomain:39571/solr_hdfs_home/control_collection/core_node2/data/]
   [junit4]   2> 580460 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:39571/solr_hdfs_home/control_collection/core_node2/data/snapshot_metadata
   [junit4]   2> 580475 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 580475 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [33554432] will allocate [1] slabs and use ~[33554432] bytes
   [junit4]   2> 580475 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Creating new global HDFS BlockCache
   [junit4]   2> 580582 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 580588 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:39571/solr_hdfs_home/control_collection/core_node2/data
   [junit4]   2> 580614 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:39571/solr_hdfs_home/control_collection/core_node2/data/index
   [junit4]   2> 580621 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 580621 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [33554432] will allocate [1] slabs and use ~[33554432] bytes
   [junit4]   2> 580631 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 580632 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=38, maxMergeAtOnceExplicit=49, maxMergedSegmentMB=81.123046875, floorSegmentMB=1.5166015625, forceMergeDeletesPctAllowed=19.46130018374416, segmentsPerTier=46.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=1.0, deletesPctAllowed=35.541572013965094]
   [junit4]   2> 580877 WARN  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 580922 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.HdfsUpdateLog
   [junit4]   2> 580922 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 580922 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.HdfsUpdateLog Initializing HdfsUpdateLog: tlogDfsReplication=2
   [junit4]   2> 580942 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 580942 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 580945 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogDocMergePolicy: [LogDocMergePolicy: minMergeSize=1000, mergeFactor=49, maxMergeSize=9223372036854775807, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=false, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.0]
   [junit4]   2> 581017 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.SolrIndexSearcher Opening [Searcher@765661a9[control_collection_shard1_replica_n1] main]
   [junit4]   2> 581018 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 581018 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 581021 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms
   [junit4]   2> 581022 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1665382243682484224
   [junit4]   2> 581024 INFO  (searcherExecutor-9456-thread-1-processing-n:127.0.0.1:37113_ x:control_collection_shard1_replica_n1 c:control_collection s:shard1) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrCore [control_collection_shard1_replica_n1] Registered new searcher Searcher@765661a9[control_collection_shard1_replica_n1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 581026 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ZkShardTerms Successful update of terms at /collections/control_collection/terms/shard1 to Terms{values={core_node2=0}, version=0}
   [junit4]   2> 581026 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/control_collection/leaders/shard1
   [junit4]   2> 581028 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 581028 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 581028 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync replicas to https://127.0.0.1:37113/control_collection_shard1_replica_n1/
   [junit4]   2> 581028 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 581028 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy https://127.0.0.1:37113/control_collection_shard1_replica_n1/ has no replicas
   [junit4]   2> 581028 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/control_collection/leaders/shard1/leader after winning as /collections/control_collection/leader_elect/shard1/election/72186509859553285-core_node2-n_0000000000
   [junit4]   2> 581029 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I am the new leader: https://127.0.0.1:37113/control_collection_shard1_replica_n1/ shard1
   [junit4]   2> 581130 INFO  (zkCallback-9436-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 581130 INFO  (zkCallback-9436-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 581134 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 581137 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf1&newCollection=true&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&collection=control_collection&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1847
   [junit4]   2> 581144 INFO  (qtp1106830107-17676) [n:127.0.0.1:37113_     ] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas
   [junit4]   2> 581177 INFO  (OverseerCollectionConfigSetProcessor-72186509859553285-127.0.0.1:37113_-n_0000000000) [n:127.0.0.1:37113_     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000000 doesn't exist. Requestor may have disconnected from ZooKeeper
   [junit4]   2> 581245 INFO  (zkCallback-9436-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 581245 INFO  (zkCallback-9436-thread-3) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 581245 INFO  (zkCallback-9436-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 581245 INFO  (qtp1106830107-17676) [n:127.0.0.1:37113_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=control_collection&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:37113_&wt=javabin&version=2} status=0 QTime=2072
   [junit4]   2> 581246 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Waiting to see 1 active replicas in collection: control_collection
   [junit4]   2> 581355 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 581358 INFO  (zkConnectionManagerCallback-9465-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 581358 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 581359 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 581359 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:45667/solr ready
   [junit4]   2> 581359 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.ChaosMonkey monkey: init - expire sessions:false cause connection loss:false
   [junit4]   2> 581361 INFO  (qtp1106830107-17676) [n:127.0.0.1:37113_     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params collection.configName=conf1&name=collection1&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=&stateFormat=2&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 581372 INFO  (OverseerThreadFactory-9444-thread-2-processing-n:127.0.0.1:37113_) [n:127.0.0.1:37113_     ] o.a.s.c.a.c.CreateCollectionCmd Create collection collection1
   [junit4]   2> 581574 WARN  (OverseerThreadFactory-9444-thread-2-processing-n:127.0.0.1:37113_) [n:127.0.0.1:37113_     ] o.a.s.c.a.c.CreateCollectionCmd It is unusual to create a collection (collection1) without cores.
   [junit4]   2> 581574 INFO  (qtp1106830107-17676) [n:127.0.0.1:37113_     ] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas
   [junit4]   2> 581580 INFO  (qtp1106830107-17676) [n:127.0.0.1:37113_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=collection1&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=&stateFormat=2&wt=javabin&version=2} status=0 QTime=218
   [junit4]   2> 581582 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 581582 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 581583 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 581583 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 581583 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 581583 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 581583 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Creating jetty instances pullReplicaCount=0 numOtherReplicas=1
   [junit4]   2> 581654 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 1 in directory /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_FF5368A94D977D5C-001/shard-1-001 of type NRT for shard1
   [junit4]   2> 581656 WARN  (closeThreadPool-9466-thread-1) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 581656 INFO  (closeThreadPool-9466-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 581656 INFO  (closeThreadPool-9466-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ...
   [junit4]   2> 581656 INFO  (closeThreadPool-9466-thread-1) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 1.8.0_201-b09
   [junit4]   2> 581662 INFO  (closeThreadPool-9466-thread-1) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 581662 INFO  (closeThreadPool-9466-thread-1) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 581662 INFO  (closeThreadPool-9466-thread-1) [     ] o.e.j.s.session node0 Scavenging every 600000ms
   [junit4]   2> 581662 INFO  (closeThreadPool-9466-thread-1) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@46b3d805{/,null,AVAILABLE}
   [junit4]   2> 581663 INFO  (closeThreadPool-9466-thread-1) [     ] o.e.j.s.AbstractConnector Started ServerConnector@5608a297{SSL, (ssl, http/1.1)}{127.0.0.1:34729}
   [junit4]   2> 581663 INFO  (closeThreadPool-9466-thread-1) [     ] o.e.j.s.Server Started @581689ms
   [junit4]   2> 581663 INFO  (closeThreadPool-9466-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {solr.data.dir=hdfs://localhost.localdomain:39571/hdfs__localhost.localdomain_39571__home_jenkins_workspace_Lucene-Solr-8.x-Linux_solr_build_solr-core_test_J3_temp_solr.index.hdfs.CheckHdfsIndexTest_FF5368A94D977D5C-001_tempDir-002_jetty1, replicaType=NRT, solrconfig=solrconfig.xml, hostContext=/, hostPort=34729, coreRootDirectory=/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_FF5368A94D977D5C-001/shard-1-001/cores}
   [junit4]   2> 581663 ERROR (closeThreadPool-9466-thread-1) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 581663 INFO  (closeThreadPool-9466-thread-1) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 581663 INFO  (closeThreadPool-9466-thread-1) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 8.6.0
   [junit4]   2> 581663 INFO  (closeThreadPool-9466-thread-1) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 581663 INFO  (closeThreadPool-9466-thread-1) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr
   [junit4]   2> 581663 INFO  (closeThreadPool-9466-thread-1) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-04-30T07:37:53.440Z
   [junit4]   2> 581666 INFO  (closeThreadPool-9466-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 581667 INFO  (zkConnectionManagerCallback-9468-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 581667 INFO  (closeThreadPool-9466-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 581768 INFO  (closeThreadPool-9466-thread-1) [     ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 581768 INFO  (closeThreadPool-9466-thread-1) [     ] o.a.s.c.SolrXmlConfig Loading container configuration from /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_FF5368A94D977D5C-001/shard-1-001/solr.xml
   [junit4]   2> 581770 INFO  (closeThreadPool-9466-thread-1) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 581770 INFO  (closeThreadPool-9466-thread-1) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 581771 INFO  (closeThreadPool-9466-thread-1) [     ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@735fe358, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 581865 INFO  (closeThreadPool-9466-thread-1) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=false]
   [junit4]   2> 581865 WARN  (closeThreadPool-9466-thread-1) [     ] o.a.s.c.s.i.Http2SolrClient Create Http2SolrClient with HTTP/1.1 transport since Java 8 or lower versions does not support SSL + HTTP/2
   [junit4]   2> 581866 WARN  (closeThreadPool-9466-thread-1) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@2c449fa5[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 581866 WARN  (closeThreadPool-9466-thread-1) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@2c449fa5[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 581871 WARN  (closeThreadPool-9466-thread-1) [     ] o.a.s.c.s.i.Http2SolrClient Create Http2SolrClient with HTTP/1.1 transport since Java 8 or lower versions does not support SSL + HTTP/2
   [junit4]   2> 581871 WARN  (closeThreadPool-9466-thread-1) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@7a98deff[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 581871 WARN  (closeThreadPool-9466-thread-1) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@7a98deff[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 581872 INFO  (closeThreadPool-9466-thread-1) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:45667/solr
   [junit4]   2> 581882 INFO  (closeThreadPool-9466-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 581882 INFO  (zkConnectionManagerCallback-9479-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 581882 INFO  (closeThreadPool-9466-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 581983 INFO  (closeThreadPool-9466-thread-1) [n:127.0.0.1:34729_     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 581984 INFO  (zkConnectionManagerCallback-9481-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 581984 INFO  (closeThreadPool-9466-thread-1) [n:127.0.0.1:34729_     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 581986 INFO  (closeThreadPool-9466-thread-1) [n:127.0.0.1:34729_     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 581988 INFO  (closeThreadPool-9466-thread-1) [n:127.0.0.1:34729_     ] o.a.s.c.ZkController Publish node=127.0.0.1:34729_ as DOWN
   [junit4]   2> 581988 INFO  (closeThreadPool-9466-thread-1) [n:127.0.0.1:34729_     ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 581988 INFO  (closeThreadPool-9466-thread-1) [n:127.0.0.1:34729_     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:34729_
   [junit4]   2> 581989 INFO  (zkCallback-9480-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 581989 INFO  (zkCallback-9436-thread-4) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 581990 INFO  (zkCallback-9464-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 581991 INFO  (closeThreadPool-9466-thread-1) [n:127.0.0.1:34729_     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 581991 WARN  (closeThreadPool-9466-thread-1) [n:127.0.0.1:34729_     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info
   [junit4]   2> 582001 INFO  (closeThreadPool-9466-thread-1) [n:127.0.0.1:34729_     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 582011 INFO  (closeThreadPool-9466-thread-1) [n:127.0.0.1:34729_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@735fe358
   [junit4]   2> 582016 INFO  (closeThreadPool-9466-thread-1) [n:127.0.0.1:34729_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@735fe358
   [junit4]   2> 582016 INFO  (closeThreadPool-9466-thread-1) [n:127.0.0.1:34729_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@735fe358
   [junit4]   2> 582017 INFO  (closeThreadPool-9466-thread-1) [n:127.0.0.1:34729_     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_FF5368A94D977D5C-001/shard-1-001/cores
   [junit4]   2> 582024 INFO  (closeThreadPool-9466-thread-1) [     ] o.a.s.c.AbstractFullDistribZkTestBase waitForLiveNode: 127.0.0.1:34729_
   [junit4]   2> 582034 INFO  (qtp389392556-17739) [n:127.0.0.1:34729_     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :addreplica with params node=127.0.0.1:34729_&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 582035 INFO  (OverseerCollectionConfigSetProcessor-72186509859553285-127.0.0.1:37113_-n_0000000000) [n:127.0.0.1:37113_     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000002 doesn't exist. Requestor may have disconnected from ZooKeeper
   [junit4]   2> 582040 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 582043 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={wt=javabin&version=2&key=solr.core.control_collection.shard1.replica_n1:INDEX.sizeInBytes} status=0 QTime=2
   [junit4]   2> 582044 INFO  (qtp1106830107-17676) [n:127.0.0.1:37113_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 582045 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 582048 INFO  (qtp1106830107-17679) [n:127.0.0.1:37113_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={wt=javabin&version=2&key=solr.core.control_collection.shard1.replica_n1:INDEX.sizeInBytes} status=0 QTime=2
   [junit4]   2> 582049 INFO  (qtp1106830107-17676) [n:127.0.0.1:37113_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 582049 INFO  (OverseerThreadFactory-9444-thread-3-processing-n:127.0.0.1:37113_) [n:127.0.0.1:37113_ c:collection1 s:shard1   ] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:34729_ for creating new replica of shard shard1 for collection collection1
   [junit4]   2> 582050 INFO  (OverseerThreadFactory-9444-thread-3-processing-n:127.0.0.1:37113_) [n:127.0.0.1:37113_ c:collection1 s:shard1   ] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command.
   [junit4]   2> 582067 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_    x:collection1_shard1_replica_n1 ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&name=collection1_shard1_replica_n1&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT
   [junit4]   2> 583077 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 8.6.0
   [junit4]   2> 583087 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.IndexSchema Schema name=test
   [junit4]   2> 583172 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
   [junit4]   2> 583181 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.CoreContainer Creating SolrCore 'collection1_shard1_replica_n1' using configuration from configset conf1, trusted=true
   [junit4]   2> 583182 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.collection1.shard1.replica_n1' (registry 'solr.core.collection1.shard1.replica_n1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@735fe358
   [junit4]   2> 583182 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory solr.hdfs.home=hdfs://localhost.localdomain:39571/solr_hdfs_home
   [junit4]   2> 583182 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Solr Kerberos Authentication disabled
   [junit4]   2> 583182 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrCore [[collection1_shard1_replica_n1] ] Opening new SolrCore at [/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_FF5368A94D977D5C-001/shard-1-001/cores/collection1_shard1_replica_n1], dataDir=[hdfs://localhost.localdomain:39571/solr_hdfs_home/collection1/core_node2/data/]
   [junit4]   2> 583183 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:39571/solr_hdfs_home/collection1/core_node2/data/snapshot_metadata
   [junit4]   2> 583189 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 583189 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [33554432] will allocate [1] slabs and use ~[33554432] bytes
   [junit4]   2> 583195 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 583196 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:39571/solr_hdfs_home/collection1/core_node2/data
   [junit4]   2> 583209 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:39571/solr_hdfs_home/collection1/core_node2/data/index
   [junit4]   2> 583214 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 583214 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [33554432] will allocate [1] slabs and use ~[33554432] bytes
   [junit4]   2> 583224 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 583224 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=38, maxMergeAtOnceExplicit=49, maxMergedSegmentMB=81.123046875, floorSegmentMB=1.5166015625, forceMergeDeletesPctAllowed=19.46130018374416, segmentsPerTier=46.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=1.0, deletesPctAllowed=35.541572013965094]
   [junit4]   2> 583249 WARN  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 583279 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.HdfsUpdateLog
   [junit4]   2> 583279 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 583279 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.HdfsUpdateLog Initializing HdfsUpdateLog: tlogDfsReplication=2
   [junit4]   2> 583293 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 583293 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 583295 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogDocMergePolicy: [LogDocMergePolicy: minMergeSize=1000, mergeFactor=49, maxMergeSize=9223372036854775807, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=false, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.0]
   [junit4]   2> 583308 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.SolrIndexSearcher Opening [Searcher@7c8b06f7[collection1_shard1_replica_n1] main]
   [junit4]   2> 583309 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 583309 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 583309 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms
   [junit4]   2> 583309 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1665382246080577536
   [junit4]   2> 583311 INFO  (searcherExecutor-9492-thread-1-processing-n:127.0.0.1:34729_ x:collection1_shard1_replica_n1 c:collection1 s:shard1) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrCore [collection1_shard1_replica_n1] Registered new searcher Searcher@7c8b06f7[collection1_shard1_replica_n1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 583312 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection1/terms/shard1 to Terms{values={core_node2=0}, version=0}
   [junit4]   2> 583312 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/collection1/leaders/shard1
   [junit4]   2> 583314 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 583314 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 583314 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync replicas to https://127.0.0.1:34729/collection1_shard1_replica_n1/
   [junit4]   2> 583314 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 583314 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SyncStrategy https://127.0.0.1:34729/collection1_shard1_replica_n1/ has no replicas
   [junit4]   2> 583314 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/collection1/leaders/shard1/leader after winning as /collections/collection1/leader_elect/shard1/election/72186509859553290-core_node2-n_0000000000
   [junit4]   2> 583315 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I am the new leader: https://127.0.0.1:34729/collection1_shard1_replica_n1/ shard1
   [junit4]   2> 583416 INFO  (zkCallback-9480-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json] for collection [collection1] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 583417 INFO  (zkCallback-9480-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json] for collection [collection1] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 583417 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 583419 INFO  (qtp389392556-17742) [n:127.0.0.1:34729_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf1&name=collection1_shard1_replica_n1&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1351
   [junit4]   2> 583425 INFO  (qtp389392556-17739) [n:127.0.0.1:34729_ c:collection1    ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:34729_&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2} status=0 QTime=1391
   [junit4]   2> 583426 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Waiting to see 1 active replicas in collection: collection1
   [junit4]   2> 583519 INFO  (zkCallback-9464-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json] for collection [collection1] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 583519 INFO  (zkCallback-9480-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json] for collection [collection1] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 583519 INFO  (zkCallback-9480-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json] for collection [collection1] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 583522 INFO  (zkCallback-9480-thread-3) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json] for collection [collection1] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 583523 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.SolrTestCaseJ4 ###Starting testChecksumsOnly
   [junit4]   2> 584036 INFO  (OverseerCollectionConfigSetProcessor-72186509859553285-127.0.0.1:37113_-n_0000000000) [n:127.0.0.1:37113_     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000004 doesn't exist. Requestor may have disconnected from ZooKeeper
   [junit4]   2> 586288 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.s.h.HdfsDirectory Closing hdfs directory hdfs://localhost.localdomain:39571/solr
   [junit4]   2> 586289 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnly-seed#[FF5368A94D977D5C]) [     ] o.a.s.SolrTestCaseJ4 ###Ending testChecksumsOnly
   [junit4]   2> 586401 INFO  (closeThreadPool-9499-thread-2) [     ] o.a.s.c.CoreContainer Shutting down CoreContainer instance=2030848349
   [junit4]   2> 586401 INFO  (closeThreadPool-9499-thread-2) [     ] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:37113_
   [junit4]   2> 586402 INFO  (closeThreadPool-9499-thread-2) [     ] o.a.s.c.ZkController Publish this node as DOWN...
   [junit4]   2> 586402 INFO  (closeThreadPool-9499-thread-2) [     ] o.a.s.c.ZkController Publish node=127.0.0.1:37113_ as DOWN
   [junit4]   2> 586402 INFO  (closeThreadPool-9499-thread-1) [     ] o.a.s.c.CoreContainer Shutting down CoreContainer instance=961415113
   [junit4]   2> 586402 INFO  (closeThreadPool-9499-thread-1) [     ] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:34729_
   [junit4]   2> 586402 INFO  (closeThreadPool-9499-thread-1) [     ] o.a.s.c.ZkController Publish this node as DOWN...
   [junit4]   2> 586402 INFO  (closeThreadPool-9499-thread-1) [     ] o.a.s.c.ZkController Publish node=127.0.0.1:34729_ as DOWN
   [junit4]   2> 586402 INFO  (coreCloseExecutor-9505-thread-1) [n:127.0.0.1:37113_     ] o.a.s.c.SolrCore [control_collection_shard1_replica_n1]  CLOSING SolrCore org.apache.solr.core.SolrCore@1b41a3b2
   [junit4]   2> 586402 INFO  (coreCloseExecutor-9505-thread-1) [n:127.0.0.1:37113_     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.control_collection.shard1.replica_n1 tag=SolrCore@1b41a3b2
   [junit4]   2> 586402 INFO  (coreCloseExecutor-9505-thread-1) [n:127.0.0.1:37113_     ] o.a.s.m.r.SolrJmxReporter Closing reporter [org.apache.solr.metrics.reporters.SolrJmxReporter@6da42ec8: rootName = null, domain = solr.core.control_collection.shard1.replica_n1, service url = null, agent id = null] for registry solr.core.control_collection.shard1.replica_n1/com.codahale.metrics.MetricRegistry@4e8416e7
   [junit4]   2> 586402 INFO  (zkCallback-9436-thread-4) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 586402 INFO  (zkCallback-9436-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating.

[...truncated too long message...]

ating to localhost.localdomain/127.0.0.1:39571) [     ] o.a.h.h.s.d.DataNode Ending block pool service for: Block pool BP-658467812-127.0.0.1-1588232268060 (Datanode Uuid 8edc898e-138c-447a-8560-e2a54eb67b52) service to localhost.localdomain/127.0.0.1:39571
   [junit4]   2> 612290 WARN  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.a.h.h.s.d.DirectoryScanner DirectoryScanner: shutdown has been called
   [junit4]   2> 612295 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.w.WebAppContext@3e27d41{datanode,/,null,UNAVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/datanode}
   [junit4]   2> 612295 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.e.j.s.AbstractConnector Stopped ServerConnector@304200a2{HTTP/1.1, (http/1.1)}{localhost:0}
   [junit4]   2> 612295 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.e.j.s.session node0 Stopped scavenging
   [junit4]   2> 612295 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@1ac6dc6c{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,UNAVAILABLE}
   [junit4]   2> 612297 WARN  (BP-658467812-127.0.0.1-1588232268060 heartbeating to localhost.localdomain/127.0.0.1:39571) [     ] o.a.h.h.s.d.IncrementalBlockReportManager IncrementalBlockReportManager interrupted
   [junit4]   2> 612297 WARN  (BP-658467812-127.0.0.1-1588232268060 heartbeating to localhost.localdomain/127.0.0.1:39571) [     ] o.a.h.h.s.d.DataNode Ending block pool service for: Block pool BP-658467812-127.0.0.1-1588232268060 (Datanode Uuid 6d3f1028-7ab3-473d-9580-36325881f3b1) service to localhost.localdomain/127.0.0.1:39571
   [junit4]   2> 612315 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.w.WebAppContext@c4e54ac{hdfs,/,null,UNAVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/hdfs}
   [junit4]   2> 612316 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.e.j.s.AbstractConnector Stopped ServerConnector@7b3f6488{HTTP/1.1, (http/1.1)}{localhost.localdomain:0}
   [junit4]   2> 612316 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.e.j.s.session node0 Stopped scavenging
   [junit4]   2> 612316 INFO  (SUITE-CheckHdfsIndexTest-seed#[FF5368A94D977D5C]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@4efed1ef{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,UNAVAILABLE}
   [junit4]   2> NOTE: leaving temporary files on disk at: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_FF5368A94D977D5C-001
   [junit4]   2> Apr 30, 2020 7:38:24 AM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks
   [junit4]   2> WARNING: Will linger awaiting termination of 129 leaked thread(s).
   [junit4]   2> NOTE: test params are: codec=Asserting(Lucene84): {date=PostingsFormat(name=MockRandom), rnd_b=FST50, field=PostingsFormat(name=MockRandom), docid=FST50, multiDefault=PostingsFormat(name=MockRandom), _root_=Lucene84, titleTokenized=PostingsFormat(name=Direct), id=FST50, body=PostingsFormat(name=MockRandom), title=Lucene84}, docValues:{docid_intDV=DocValuesFormat(name=Lucene80), range_facet_l_dv=DocValuesFormat(name=Asserting), _version_=DocValuesFormat(name=Direct), range_facet_i_dv=DocValuesFormat(name=Lucene80), intDvoDefault=DocValuesFormat(name=Asserting), titleDV=DocValuesFormat(name=Direct), timestamp=DocValuesFormat(name=Lucene80)}, maxPointsInLeafNode=868, maxMBSortInHeap=5.959772353728755, sim=Asserting(org.apache.lucene.search.similarities.AssertingSimilarity@208d23af), locale=ar-KW, timezone=Australia/North
   [junit4]   2> NOTE: Linux 5.3.0-46-generic amd64/Oracle Corporation 1.8.0_201 (64-bit)/cpus=16,threads=15,free=118380384,total=518979584
   [junit4]   2> NOTE: All tests run in this JVM: [DistributedFacetPivotWhiteBoxTest, MoveReplicaTest, TestStressLiveNodes, TestGraphMLResponseWriter, TestAddFieldRealTimeGet, TestScoreJoinQPNoScore, TestImpersonationWithHadoopAuth, TestEmbeddedSolrServerSchemaAPI, TestPhraseSuggestions, DistributedSuggestComponentTest, BitVectorTest, IndexSchemaTest, TestRetrieveFieldsOptimizer, TestReplicaProperties, TestRandomDVFaceting, PreAnalyzedFieldManagedSchemaCloudTest, TestSimExtremeIndexing, FullSolrCloudDistribCmdsTest, TestNamedUpdateProcessors, TestStressThreadBackup, RecoveryZkTest, TestCopyFieldCollectionResource, BlockJoinFacetDistribTest, TestSystemIdResolver, TestPostingsSolrHighlighter, TestFunctionQuery, TestMaxTokenLenTokenizer, RootFieldTest, JSONWriterTest, DirectoryFactoryTest, TestSurroundQueryParser, ShardRoutingTest, LeaderElectionTest, WordBreakSolrSpellCheckerTest, TestCustomSort, TestUniqueKeyFieldResource, DistributedExpandComponentTest, BadComponentTest, QueryElevationComponentTest, AdminHandlersProxyTest, DistribJoinFromCollectionTest, NodeMarkersRegistrationTest, DocValuesMissingTest, TestFieldCacheVsDocValues, LukeRequestHandlerTest, TestFieldTypeResource, SimpleMLTQParserTest, ZkSolrClientTest, SearchRateTriggerTest, TestManagedSchemaThreadSafety, VMParamsZkACLAndCredentialsProvidersTest, TestSkipOverseerOperations, TestFilteredDocIdSet, ChaosMonkeySafeLeaderTest, TestSweetSpotSimilarityFactory, CheckHdfsIndexTest]
   [junit4] Completed [454/907 (1!)] on J3 in 39.69s, 5 tests, 1 error, 1 skipped <<< FAILURES!

[...truncated 39641 lines...]
-ecj-javadoc-lint-src:
    [mkdir] Created dir: /tmp/ecj609921858
 [ecj-lint] Compiling 931 source files to /tmp/ecj609921858
 [ecj-lint] ----------
 [ecj-lint] 1. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/codecs/CodecUtil.java (at line 523)
 [ecj-lint] 	throw new CorruptIndexException("misplaced codec footer (file truncated?): length=" + in.length() + " but footerLength==" + footerLength(), input);
 [ecj-lint] 	^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 [ecj-lint] Resource leak: 'in' is not closed at this location
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 2. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingStoredFieldsReader.java (at line 166)
 [ecj-lint] 	FieldsIndexReader fieldsIndexReader = new FieldsIndexReader(d, si.name, segmentSuffix, INDEX_EXTENSION_PREFIX, INDEX_CODEC_NAME, si.getId());
 [ecj-lint] 	                  ^^^^^^^^^^^^^^^^^
 [ecj-lint] Resource leak: 'fieldsIndexReader' is never closed
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 3. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingTermVectorsReader.java (at line 148)
 [ecj-lint] 	FieldsIndexReader fieldsIndexReader = new FieldsIndexReader(d, si.name, segmentSuffix, VECTORS_INDEX_EXTENSION_PREFIX, VECTORS_INDEX_CODEC_NAME, si.getId());
 [ecj-lint] 	                  ^^^^^^^^^^^^^^^^^
 [ecj-lint] Resource leak: 'fieldsIndexReader' is never closed
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 4. ERROR in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/search/IndexSearcher.java (at line 50)
 [ecj-lint] 	import org.apache.lucene.util.automaton.ByteRunAutomaton;
 [ecj-lint] 	       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 [ecj-lint] The import org.apache.lucene.util.automaton.ByteRunAutomaton is never used
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 5. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/util/automaton/Operations.java (at line 742)
 [ecj-lint] 	Integer q = newstate.get(statesSet);
 [ecj-lint] 	                         ^^^^^^^^^
 [ecj-lint] Unlikely argument type SortedIntSet for get(Object) on a Map<SortedIntSet.FrozenIntSet,Integer>
 [ecj-lint] ----------
 [ecj-lint] 5 problems (1 error, 4 warnings)
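For context on warnings 1-3 above: ecj flags a Closeable value (an IndexInput, a FieldsIndexReader) that can escape a method without being closed on an exception path. A minimal sketch of the shape ecj complains about and the try-with-resources form that satisfies the analysis; TrackedResource below is a hypothetical stand-in, not a Lucene class:

```java
// Illustrative sketch of ecj's "Resource leak" warning, using a
// hypothetical TrackedResource in place of IndexInput/FieldsIndexReader.
public class ResourceLeakSketch {
    static class TrackedResource implements AutoCloseable {
        static int openCount = 0;
        TrackedResource() { openCount++; }
        void use() { /* work that may throw */ }
        @Override public void close() { openCount--; }
    }

    // Leak-prone shape: if use() throws, close() is never reached,
    // which is what ecj reports as "not closed at this location".
    static void leaky() {
        TrackedResource r = new TrackedResource();
        r.use();
        r.close();
    }

    // try-with-resources closes the resource on every path, normal
    // or exceptional, so the warning does not fire for this shape.
    static void safe() {
        try (TrackedResource r = new TrackedResource()) {
            r.use();
        }
    }

    public static void main(String[] args) {
        safe();
        System.out.println("open after safe(): " + TrackedResource.openCount);
    }
}
```

In real Lucene code the fix is less mechanical (the reader may legitimately be handed off to another owner, so the usual pattern is a success flag with IOUtils.closeWhileHandlingException in a finally block), which is why these remain WARNINGs rather than the error that actually failed the build (the unused import in IndexSearcher.java, item 4).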

BUILD FAILED
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/build.xml:634: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/build.xml:101: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build.xml:201: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/common-build.xml:2127: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/common-build.xml:2166: Compile failed; see the compiler error output for details.

Total time: 36 minutes 33 seconds
Build step 'Invoke Ant' marked build as failure
Archiving artifacts
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
[WARNINGS] Skipping publisher since build result is FAILURE
Recording test results
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2

[JENKINS] Lucene-Solr-8.x-Linux (64bit/jdk-13.0.2) - Build # 2917 - Still Failing!

Posted by Policeman Jenkins Server <je...@thetaphi.de>.
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-8.x-Linux/2917/
Java: 64bit/jdk-13.0.2 -XX:-UseCompressedOops -XX:+UseParallelGC

1 tests failed.
FAILED:  org.apache.solr.index.hdfs.CheckHdfsIndexTest.doTest

Error Message:
Error from server at http://127.0.0.1:41677/collection1: java.lang.NullPointerException (full trace below)

Stack Trace:
org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at http://127.0.0.1:41677/collection1: java.lang.NullPointerException
	at org.apache.solr.handler.admin.SystemInfoHandler.getSecurityInfo(SystemInfoHandler.java:326)
	at org.apache.solr.handler.admin.SystemInfoHandler.handleRequestBody(SystemInfoHandler.java:146)
	at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:211)
	at org.apache.solr.core.SolrCore.execute(SolrCore.java:2600)
	at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:803)
	at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:582)
	at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:432)
	at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:362)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)
	at org.apache.solr.client.solrj.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:166)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:545)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1610)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1300)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1580)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1215)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:322)
	at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:767)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:500)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:547)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:273)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
	at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938)
	at java.base/java.lang.Thread.run(Thread.java:830)

	at __randomizedtesting.SeedInfo.seed([3EFEB584EE0CBD8F:99BA0D2083B7AE36]:0)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:665)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:265)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:248)
	at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:211)
	at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:1003)
	at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:1018)
	at org.apache.solr.index.hdfs.CheckHdfsIndexTest.doTest(CheckHdfsIndexTest.java:120)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:567)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988)
	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1081)
	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:1053)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
	at java.base/java.lang.Thread.run(Thread.java:830)
Build Log:
[...truncated 16498 lines...]
   [junit4] Suite: org.apache.solr.index.hdfs.CheckHdfsIndexTest
   [junit4]   2> 937049 INFO  (SUITE-CheckHdfsIndexTest-seed#[3EFEB584EE0CBD8F]-worker) [     ] o.a.s.SolrTestCase Setting 'solr.default.confdir' system property to test-framework derived value of '/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/server/solr/configsets/_default/conf'
   [junit4]   2> 937049 INFO  (SUITE-CheckHdfsIndexTest-seed#[3EFEB584EE0CBD8F]-worker) [     ] o.a.s.SolrTestCaseJ4 SecureRandom sanity checks: test.solr.allowed.securerandom=null & java.security.egd=file:/dev/./urandom
   [junit4]   2> 937049 INFO  (SUITE-CheckHdfsIndexTest-seed#[3EFEB584EE0CBD8F]-worker) [     ] o.a.s.SolrTestCaseJ4 Created dataDir: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J0/temp/solr.index.hdfs.CheckHdfsIndexTest_3EFEB584EE0CBD8F-001/data-dir-102-001
   [junit4]   2> 937050 INFO  (SUITE-CheckHdfsIndexTest-seed#[3EFEB584EE0CBD8F]-worker) [     ] o.a.s.SolrTestCaseJ4 Using PointFields (NUMERIC_POINTS_SYSPROP=true) w/NUMERIC_DOCVALUES_SYSPROP=false
   [junit4]   2> 937050 INFO  (SUITE-CheckHdfsIndexTest-seed#[3EFEB584EE0CBD8F]-worker) [     ] o.a.s.SolrTestCaseJ4 Randomized ssl (false) and clientAuth (true) via: @org.apache.solr.util.RandomizeSSL(reason="", ssl=0.0/0.0, value=0.0/0.0, clientAuth=0.0/0.0)
   [junit4]   2> 937050 INFO  (SUITE-CheckHdfsIndexTest-seed#[3EFEB584EE0CBD8F]-worker) [     ] o.a.s.BaseDistributedSearchTestCase Setting hostContext system property: /
   [junit4]   1> Formatting using clusterid: testClusterID
   [junit4]   2> 937067 WARN  (SUITE-CheckHdfsIndexTest-seed#[3EFEB584EE0CBD8F]-worker) [     ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
   [junit4]   2> 937069 INFO  (SUITE-CheckHdfsIndexTest-seed#[3EFEB584EE0CBD8F]-worker) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 13.0.2+8
   [junit4]   2> 937073 INFO  (SUITE-CheckHdfsIndexTest-seed#[3EFEB584EE0CBD8F]-worker) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 937073 INFO  (SUITE-CheckHdfsIndexTest-seed#[3EFEB584EE0CBD8F]-worker) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 937073 INFO  (SUITE-CheckHdfsIndexTest-seed#[3EFEB584EE0CBD8F]-worker) [     ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 937073 INFO  (SUITE-CheckHdfsIndexTest-seed#[3EFEB584EE0CBD8F]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@31c852{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,AVAILABLE}
   [junit4]   2> 937183 INFO  (SUITE-CheckHdfsIndexTest-seed#[3EFEB584EE0CBD8F]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.w.WebAppContext@1aeb5e27{hdfs,/,file:///home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J0/temp/jetty-localhost_localdomain-44211-hadoop-hdfs-3_2_0-tests_jar-_-any-12870463844880495403.dir/webapp/,AVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/hdfs}
   [junit4]   2> 937184 INFO  (SUITE-CheckHdfsIndexTest-seed#[3EFEB584EE0CBD8F]-worker) [     ] o.e.j.s.AbstractConnector Started ServerConnector@3baf4231{HTTP/1.1, (http/1.1)}{localhost.localdomain:44211}
   [junit4]   2> 937184 INFO  (SUITE-CheckHdfsIndexTest-seed#[3EFEB584EE0CBD8F]-worker) [     ] o.e.j.s.Server Started @937201ms
   [junit4]   2> 937231 WARN  (SUITE-CheckHdfsIndexTest-seed#[3EFEB584EE0CBD8F]-worker) [     ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
   [junit4]   2> 937232 INFO  (SUITE-CheckHdfsIndexTest-seed#[3EFEB584EE0CBD8F]-worker) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 13.0.2+8
   [junit4]   2> 937232 INFO  (SUITE-CheckHdfsIndexTest-seed#[3EFEB584EE0CBD8F]-worker) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 937232 INFO  (SUITE-CheckHdfsIndexTest-seed#[3EFEB584EE0CBD8F]-worker) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 937233 INFO  (SUITE-CheckHdfsIndexTest-seed#[3EFEB584EE0CBD8F]-worker) [     ] o.e.j.s.session node0 Scavenging every 600000ms
   [junit4]   2> 937233 INFO  (SUITE-CheckHdfsIndexTest-seed#[3EFEB584EE0CBD8F]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@208fdcba{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,AVAILABLE}
   [junit4]   2> 937322 INFO  (SUITE-CheckHdfsIndexTest-seed#[3EFEB584EE0CBD8F]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.w.WebAppContext@6b1e2f9d{datanode,/,file:///home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J0/temp/jetty-localhost-43855-hadoop-hdfs-3_2_0-tests_jar-_-any-13731381025024408648.dir/webapp/,AVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/datanode}
   [junit4]   2> 937322 INFO  (SUITE-CheckHdfsIndexTest-seed#[3EFEB584EE0CBD8F]-worker) [     ] o.e.j.s.AbstractConnector Started ServerConnector@69be31c6{HTTP/1.1, (http/1.1)}{localhost:43855}
   [junit4]   2> 937322 INFO  (SUITE-CheckHdfsIndexTest-seed#[3EFEB584EE0CBD8F]-worker) [     ] o.e.j.s.Server Started @937339ms
   [junit4]   2> 937356 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0x87b57f9e7a51fcf8: Processing first storage report for DS-0464bf24-a1ba-46a8-9d01-b334633ef94d from datanode e9e72139-37c9-4d21-9747-5ffd727a7dd7
   [junit4]   2> 937356 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0x87b57f9e7a51fcf8: from storage DS-0464bf24-a1ba-46a8-9d01-b334633ef94d node DatanodeRegistration(127.0.0.1:33779, datanodeUuid=e9e72139-37c9-4d21-9747-5ffd727a7dd7, infoPort=44045, infoSecurePort=0, ipcPort=45407, storageInfo=lv=-57;cid=testClusterID;nsid=636739946;c=1588225500131), blocks: 0, hasStaleStorage: true, processing time: 0 msecs, invalidatedBlocks: 0
   [junit4]   2> 937356 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0x87b57f9e7a51fcf8: Processing first storage report for DS-89e9c041-25ff-40e0-b95c-2b2063a4286a from datanode e9e72139-37c9-4d21-9747-5ffd727a7dd7
   [junit4]   2> 937356 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0x87b57f9e7a51fcf8: from storage DS-89e9c041-25ff-40e0-b95c-2b2063a4286a node DatanodeRegistration(127.0.0.1:33779, datanodeUuid=e9e72139-37c9-4d21-9747-5ffd727a7dd7, infoPort=44045, infoSecurePort=0, ipcPort=45407, storageInfo=lv=-57;cid=testClusterID;nsid=636739946;c=1588225500131), blocks: 0, hasStaleStorage: false, processing time: 0 msecs, invalidatedBlocks: 0
   [junit4]   2> 937452 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.ZkTestServer STARTING ZK TEST SERVER
   [junit4]   2> 937452 INFO  (ZkTestServer Run Thread) [     ] o.a.s.c.ZkTestServer client port:0.0.0.0/0.0.0.0:0
   [junit4]   2> 937452 INFO  (ZkTestServer Run Thread) [     ] o.a.s.c.ZkTestServer Starting server
   [junit4]   2> 937553 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.ZkTestServer start zk server on port:36365
   [junit4]   2> 937553 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.ZkTestServer waitForServerUp: 127.0.0.1:36365
   [junit4]   2> 937553 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.ZkTestServer parse host and port list: 127.0.0.1:36365
   [junit4]   2> 937553 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.ZkTestServer connecting to 127.0.0.1 36365
   [junit4]   2> 937554 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 937556 INFO  (zkConnectionManagerCallback-14243-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 937556 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 937558 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 937561 INFO  (zkConnectionManagerCallback-14245-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 937561 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 937565 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/solrconfig-tlog.xml to /configs/conf1/solrconfig.xml
   [junit4]   2> 937566 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/schema.xml to /configs/conf1/schema.xml
   [junit4]   2> 937569 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/solrconfig.snippet.randomindexconfig.xml to /configs/conf1/solrconfig.snippet.randomindexconfig.xml
   [junit4]   2> 937573 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/stopwords.txt to /configs/conf1/stopwords.txt
   [junit4]   2> 937573 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/protwords.txt to /configs/conf1/protwords.txt
   [junit4]   2> 937574 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/currency.xml to /configs/conf1/currency.xml
   [junit4]   2> 937574 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/enumsConfig.xml to /configs/conf1/enumsConfig.xml
   [junit4]   2> 937575 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/open-exchange-rates.json to /configs/conf1/open-exchange-rates.json
   [junit4]   2> 937575 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/mapping-ISOLatin1Accent.txt to /configs/conf1/mapping-ISOLatin1Accent.txt
   [junit4]   2> 937576 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/old_synonyms.txt to /configs/conf1/old_synonyms.txt
   [junit4]   2> 937576 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/synonyms.txt to /configs/conf1/synonyms.txt
   [junit4]   2> 937577 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Will use NRT replicas unless explicitly asked otherwise
   [junit4]   2> 937668 WARN  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 937668 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 937668 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ...
   [junit4]   2> 937668 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 13.0.2+8
   [junit4]   2> 937669 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 937669 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 937669 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 937669 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@328a046a{/,null,AVAILABLE}
   [junit4]   2> 937669 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.e.j.s.AbstractConnector Started ServerConnector@7a974961{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:41907}
   [junit4]   2> 937669 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.e.j.s.Server Started @937687ms
   [junit4]   2> 937669 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/, solr.data.dir=hdfs://localhost.localdomain:42279/hdfs__localhost.localdomain_42279__home_jenkins_workspace_Lucene-Solr-8.x-Linux_solr_build_solr-core_test_J0_temp_solr.index.hdfs.CheckHdfsIndexTest_3EFEB584EE0CBD8F-001_tempDir-002_control_data, hostPort=41907, coreRootDirectory=/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J0/temp/solr.index.hdfs.CheckHdfsIndexTest_3EFEB584EE0CBD8F-001/control-001/cores, replicaType=NRT}
   [junit4]   2> 937670 ERROR (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 937670 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 937670 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 8.6.0
   [junit4]   2> 937670 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 937670 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr
   [junit4]   2> 937670 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-04-30T05:45:00.741102Z
   [junit4]   2> 937670 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 937671 INFO  (zkConnectionManagerCallback-14247-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 937671 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 937771 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 937771 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.SolrXmlConfig Loading container configuration from /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J0/temp/solr.index.hdfs.CheckHdfsIndexTest_3EFEB584EE0CBD8F-001/control-001/solr.xml
   [junit4]   2> 937773 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 937773 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 937774 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@52b8757f, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 937836 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=false]
   [junit4]   2> 937837 WARN  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@16db97b8[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 937837 WARN  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@16db97b8[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 937839 WARN  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@326753bf[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 937839 WARN  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@326753bf[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 937840 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:36365/solr
   [junit4]   2> 937841 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 937842 INFO  (zkConnectionManagerCallback-14258-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 937842 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 937943 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [n:127.0.0.1:41907_     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 937943 INFO  (zkConnectionManagerCallback-14260-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 937943 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [n:127.0.0.1:41907_     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 937970 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [n:127.0.0.1:41907_     ] o.a.s.c.OverseerElectionContext I am going to be the leader 127.0.0.1:41907_
   [junit4]   2> 937970 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [n:127.0.0.1:41907_     ] o.a.s.c.Overseer Overseer (id=72186066210717700-127.0.0.1:41907_-n_0000000000) starting
   [junit4]   2> 937972 INFO  (OverseerStateUpdate-72186066210717700-127.0.0.1:41907_-n_0000000000) [n:127.0.0.1:41907_     ] o.a.s.c.Overseer Starting to work on the main queue : 127.0.0.1:41907_
   [junit4]   2> 937972 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [n:127.0.0.1:41907_     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:41907_
   [junit4]   2> 937972 INFO  (OverseerStateUpdate-72186066210717700-127.0.0.1:41907_-n_0000000000) [n:127.0.0.1:41907_     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 937973 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [n:127.0.0.1:41907_     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 937973 WARN  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [n:127.0.0.1:41907_     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info.  Solr is only as secure as you make it.
   [junit4]   2> 937981 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [n:127.0.0.1:41907_     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 937991 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [n:127.0.0.1:41907_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@52b8757f
   [junit4]   2> 937996 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [n:127.0.0.1:41907_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@52b8757f
   [junit4]   2> 937996 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [n:127.0.0.1:41907_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@52b8757f
   [junit4]   2> 937997 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [n:127.0.0.1:41907_     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J0/temp/solr.index.hdfs.CheckHdfsIndexTest_3EFEB584EE0CBD8F-001/control-001/cores
   [junit4]   2> 938004 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 938005 INFO  (zkConnectionManagerCallback-14277-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 938005 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 938005 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 938005 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:36365/solr ready
   [junit4]   2> 938006 INFO  (qtp970966082-27251) [n:127.0.0.1:41907_     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params collection.configName=conf1&name=control_collection&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:41907_&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 938008 INFO  (OverseerThreadFactory-14267-thread-1-processing-n:127.0.0.1:41907_) [n:127.0.0.1:41907_     ] o.a.s.c.a.c.CreateCollectionCmd Create collection control_collection
   [junit4]   2> 938111 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 938112 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 938114 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_    x:control_collection_shard1_replica_n1 ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&collection=control_collection&shard=shard1&wt=javabin&version=2&replicaType=NRT
   [junit4]   2> 938114 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_    x:control_collection_shard1_replica_n1 ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 939119 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 8.6.0
   [junit4]   2> 939133 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.IndexSchema Schema name=test
   [junit4]   2> 939193 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
   [junit4]   2> 939199 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.CoreContainer Creating SolrCore 'control_collection_shard1_replica_n1' using configuration from configset conf1, trusted=true
   [junit4]   2> 939199 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.control_collection.shard1.replica_n1' (registry 'solr.core.control_collection.shard1.replica_n1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@52b8757f
   [junit4]   2> 939199 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory solr.hdfs.home=hdfs://localhost.localdomain:42279/solr_hdfs_home
   [junit4]   2> 939199 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Solr Kerberos Authentication disabled
   [junit4]   2> 939199 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrCore [[control_collection_shard1_replica_n1] ] Opening new SolrCore at [/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J0/temp/solr.index.hdfs.CheckHdfsIndexTest_3EFEB584EE0CBD8F-001/control-001/cores/control_collection_shard1_replica_n1], dataDir=[hdfs://localhost.localdomain:42279/solr_hdfs_home/control_collection/core_node2/data/]
   [junit4]   2> 939200 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:42279/solr_hdfs_home/control_collection/core_node2/data/snapshot_metadata
   [junit4]   2> 939204 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 939204 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [4194304] will allocate [1] slabs and use ~[4194304] bytes
   [junit4]   2> 939204 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Creating new single instance HDFS BlockCache
   [junit4]   2> 939207 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 939207 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:42279/solr_hdfs_home/control_collection/core_node2/data
   [junit4]   2> 939219 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:42279/solr_hdfs_home/control_collection/core_node2/data/index
   [junit4]   2> 939223 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 939223 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [4194304] will allocate [1] slabs and use ~[4194304] bytes
   [junit4]   2> 939223 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Creating new single instance HDFS BlockCache
   [junit4]   2> 939226 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 939226 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogDocMergePolicy: [LogDocMergePolicy: minMergeSize=1000, mergeFactor=33, maxMergeSize=9223372036854775807, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=false, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=1.0]
   [junit4]   2> 939636 WARN  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 939671 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.HdfsUpdateLog
   [junit4]   2> 939671 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 939671 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.HdfsUpdateLog Initializing HdfsUpdateLog: tlogDfsReplication=2
   [junit4]   2> 939677 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 939677 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 939678 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogDocMergePolicy: [LogDocMergePolicy: minMergeSize=1000, mergeFactor=38, maxMergeSize=9223372036854775807, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=true, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.0]
   [junit4]   2> 939681 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.SolrIndexSearcher Opening [Searcher@695988b[control_collection_shard1_replica_n1] main]
   [junit4]   2> 939682 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 939682 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 939682 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms
   [junit4]   2> 939682 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1665375144774729728
   [junit4]   2> 939684 INFO  (searcherExecutor-14279-thread-1-processing-n:127.0.0.1:41907_ x:control_collection_shard1_replica_n1 c:control_collection s:shard1) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrCore [control_collection_shard1_replica_n1] Registered new searcher Searcher@695988b[control_collection_shard1_replica_n1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 939684 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ZkShardTerms Successful update of terms at /collections/control_collection/terms/shard1 to Terms{values={core_node2=0}, version=0}
   [junit4]   2> 939684 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/control_collection/leaders/shard1
   [junit4]   2> 939685 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 939685 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 939685 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:41907/control_collection_shard1_replica_n1/
   [junit4]   2> 939685 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 939685 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy http://127.0.0.1:41907/control_collection_shard1_replica_n1/ has no replicas
   [junit4]   2> 939685 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/control_collection/leaders/shard1/leader after winning as /collections/control_collection/leader_elect/shard1/election/72186066210717700-core_node2-n_0000000000
   [junit4]   2> 939686 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:41907/control_collection_shard1_replica_n1/ shard1
   [junit4]   2> 939787 INFO  (zkCallback-14259-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 939787 INFO  (zkCallback-14259-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 939787 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 939788 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf1&newCollection=true&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&collection=control_collection&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1674
   [junit4]   2> 939789 INFO  (qtp970966082-27251) [n:127.0.0.1:41907_     ] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas
   [junit4]   2> 939888 INFO  (zkCallback-14259-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 939888 INFO  (zkCallback-14259-thread-3) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 939888 INFO  (zkCallback-14259-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 939889 INFO  (qtp970966082-27251) [n:127.0.0.1:41907_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=control_collection&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:41907_&wt=javabin&version=2} status=0 QTime=1882
   [junit4]   2> 939889 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Waiting to see 1 active replicas in collection: control_collection
   [junit4]   2> 940009 INFO  (OverseerCollectionConfigSetProcessor-72186066210717700-127.0.0.1:41907_-n_0000000000) [n:127.0.0.1:41907_     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000000 doesn't exist. Requestor may have disconnected from ZooKeeper
   [junit4]   2> 940016 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 940018 INFO  (zkConnectionManagerCallback-14288-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 940018 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 940018 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 940019 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:36365/solr ready
   [junit4]   2> 940019 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.ChaosMonkey monkey: init - expire sessions:false cause connection loss:false
   [junit4]   2> 940020 INFO  (qtp970966082-27251) [n:127.0.0.1:41907_     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params collection.configName=conf1&name=collection1&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=&stateFormat=1&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 940022 INFO  (OverseerThreadFactory-14267-thread-2-processing-n:127.0.0.1:41907_) [n:127.0.0.1:41907_     ] o.a.s.c.a.c.CreateCollectionCmd Create collection collection1
   [junit4]   2> 940239 WARN  (OverseerThreadFactory-14267-thread-2-processing-n:127.0.0.1:41907_) [n:127.0.0.1:41907_     ] o.a.s.c.a.c.CreateCollectionCmd It is unusual to create a collection (collection1) without cores.
   [junit4]   2> 940240 INFO  (qtp970966082-27251) [n:127.0.0.1:41907_     ] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas
   [junit4]   2> 940240 INFO  (qtp970966082-27251) [n:127.0.0.1:41907_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=collection1&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=&stateFormat=1&wt=javabin&version=2} status=0 QTime=220
   [junit4]   2> 940243 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 940243 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 940243 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 940243 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 940243 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 940243 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 940243 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Creating jetty instances pullReplicaCount=0 numOtherReplicas=1
   [junit4]   2> 940341 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 1 in directory /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J0/temp/solr.index.hdfs.CheckHdfsIndexTest_3EFEB584EE0CBD8F-001/shard-1-001 of type NRT for shard1
   [junit4]   2> 940342 WARN  (closeThreadPool-14289-thread-1) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 940342 INFO  (closeThreadPool-14289-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 940342 INFO  (closeThreadPool-14289-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ...
   [junit4]   2> 940342 INFO  (closeThreadPool-14289-thread-1) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 13.0.2+8
   [junit4]   2> 940352 INFO  (closeThreadPool-14289-thread-1) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 940352 INFO  (closeThreadPool-14289-thread-1) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 940353 INFO  (closeThreadPool-14289-thread-1) [     ] o.e.j.s.session node0 Scavenging every 600000ms
   [junit4]   2> 940353 INFO  (closeThreadPool-14289-thread-1) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@7d0c0ae4{/,null,AVAILABLE}
   [junit4]   2> 940465 INFO  (closeThreadPool-14289-thread-1) [     ] o.e.j.s.AbstractConnector Started ServerConnector@77853b79{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:40585}
   [junit4]   2> 940465 INFO  (closeThreadPool-14289-thread-1) [     ] o.e.j.s.Server Started @940482ms
   [junit4]   2> 940465 INFO  (closeThreadPool-14289-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/, solrconfig=solrconfig.xml, solr.data.dir=hdfs://localhost.localdomain:42279/hdfs__localhost.localdomain_42279__home_jenkins_workspace_Lucene-Solr-8.x-Linux_solr_build_solr-core_test_J0_temp_solr.index.hdfs.CheckHdfsIndexTest_3EFEB584EE0CBD8F-001_tempDir-002_jetty1, hostPort=40585, coreRootDirectory=/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J0/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J0/temp/solr.index.hdfs.CheckHdfsIndexTest_3EFEB584EE0CBD8F-001/shard-1-001/cores}
   [junit4]   2> 940465 ERROR (closeThreadPool-14289-thread-1) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 940465 INFO  (closeThreadPool-14289-thread-1) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 940465 INFO  (closeThreadPool-14289-thread-1) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 8.6.0
   [junit4]   2> 940465 INFO  (closeThreadPool-14289-thread-1) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 940465 INFO  (closeThreadPool-14289-thread-1) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr
   [junit4]   2> 940465 INFO  (closeThreadPool-14289-thread-1) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-04-30T05:45:03.536550Z
   [junit4]   2> 940473 INFO  (closeThreadPool-14289-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 940489 INFO  (zkConnectionManagerCallback-14291-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 940489 INFO  (closeThreadPool-14289-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 940594 INFO  (closeThreadPool-14289-thread-1) [     ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 940594 INFO  (closeThreadPool-14289-thread-1) [     ] o.a.s.c.SolrXmlConfig Loading container configuration from /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J0/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J0/temp/solr.index.hdfs.CheckHdfsIndexTest_3EFEB584EE0CBD8F-001/shard-1-001/solr.xml
   [junit4]   2> 940598 INFO  (closeThreadPool-14289-thread-1) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 940598 INFO  (closeThreadPool-14289-thread-1) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 940600 INFO  (closeThreadPool-14289-thread-1) [     ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@52b8757f, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 940674 INFO  (closeThreadPool-14289-thread-1) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=false]
   [junit4]   2> 940674 WARN  (closeThreadPool-14289-thread-1) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@2950690f[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 940674 WARN  (closeThreadPool-14289-thread-1) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@2950690f[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 940694 WARN  (closeThreadPool-14289-thread-1) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@19149df7[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 940694 WARN  (closeThreadPool-14289-thread-1) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@19149df7[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 940695 INFO  (closeThreadPool-14289-thread-1) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:36365/solr
   [junit4]   2> 940696 INFO  (closeThreadPool-14289-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 940701 INFO  (zkConnectionManagerCallback-14302-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 940701 INFO  (closeThreadPool-14289-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 940823 INFO  (closeThreadPool-14289-thread-1) [n:127.0.0.1:40585_     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 940825 INFO  (zkConnectionManagerCallback-14304-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 940825 INFO  (closeThreadPool-14289-thread-1) [n:127.0.0.1:40585_     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 940830 INFO  (closeThreadPool-14289-thread-1) [n:127.0.0.1:40585_     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 940831 INFO  (closeThreadPool-14289-thread-1) [n:127.0.0.1:40585_     ] o.a.s.c.ZkController Publish node=127.0.0.1:40585_ as DOWN
   [junit4]   2> 940832 INFO  (closeThreadPool-14289-thread-1) [n:127.0.0.1:40585_     ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 940832 INFO  (closeThreadPool-14289-thread-1) [n:127.0.0.1:40585_     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:40585_
   [junit4]   2> 940834 INFO  (closeThreadPool-14289-thread-1) [n:127.0.0.1:40585_     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 940834 WARN  (closeThreadPool-14289-thread-1) [n:127.0.0.1:40585_     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Consider configuring authentication/authorization before exposing Solr to users internal or external.  Solr is only as secure as you make it. See https://s.apache.org/solrsecurity for more info
   [junit4]   2> 940837 INFO  (zkCallback-14259-thread-3) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 940837 INFO  (zkCallback-14287-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 940847 INFO  (zkCallback-14303-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 940849 INFO  (closeThreadPool-14289-thread-1) [n:127.0.0.1:40585_     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 940866 INFO  (closeThreadPool-14289-thread-1) [n:127.0.0.1:40585_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@52b8757f
   [junit4]   2> 940873 INFO  (closeThreadPool-14289-thread-1) [n:127.0.0.1:40585_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@52b8757f
   [junit4]   2> 940873 INFO  (closeThreadPool-14289-thread-1) [n:127.0.0.1:40585_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@52b8757f
   [junit4]   2> 940874 INFO  (closeThreadPool-14289-thread-1) [n:127.0.0.1:40585_     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J0/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J0/temp/solr.index.hdfs.CheckHdfsIndexTest_3EFEB584EE0CBD8F-001/shard-1-001/cores
   [junit4]   2> 940884 INFO  (closeThreadPool-14289-thread-1) [     ] o.a.s.c.AbstractFullDistribZkTestBase waitForLiveNode: 127.0.0.1:40585_
   [junit4]   2> 940903 INFO  (qtp1941615062-27310) [n:127.0.0.1:40585_     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :addreplica with params node=127.0.0.1:40585_&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 940913 INFO  (OverseerCollectionConfigSetProcessor-72186066210717700-127.0.0.1:41907_-n_0000000000) [n:127.0.0.1:41907_     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000002 doesn't exist. Requestor may have disconnected from ZooKeeper
   [junit4]   2> 940921 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 940923 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={wt=javabin&version=2&key=solr.core.control_collection.shard1.replica_n1:INDEX.sizeInBytes} status=0 QTime=1
   [junit4]   2> 940925 INFO  (qtp970966082-27251) [n:127.0.0.1:41907_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 940926 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 940927 INFO  (qtp970966082-27254) [n:127.0.0.1:41907_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={wt=javabin&version=2&key=solr.core.control_collection.shard1.replica_n1:INDEX.sizeInBytes} status=0 QTime=1
   [junit4]   2> 940928 INFO  (qtp970966082-27251) [n:127.0.0.1:41907_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 940929 INFO  (OverseerThreadFactory-14267-thread-3-processing-n:127.0.0.1:41907_) [n:127.0.0.1:41907_ c:collection1 s:shard1   ] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:40585_ for creating new replica of shard shard1 for collection collection1
   [junit4]   2> 940929 INFO  (OverseerThreadFactory-14267-thread-3-processing-n:127.0.0.1:41907_) [n:127.0.0.1:41907_ c:collection1 s:shard1   ] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command.
   [junit4]   2> 940932 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_    x:collection1_shard1_replica_n1 ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&name=collection1_shard1_replica_n1&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT
   [junit4]   2> 941944 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 8.6.0
   [junit4]   2> 941977 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.IndexSchema Schema name=test
   [junit4]   2> 942105 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
   [junit4]   2> 942122 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.CoreContainer Creating SolrCore 'collection1_shard1_replica_n1' using configuration from configset conf1, trusted=true
   [junit4]   2> 942123 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.collection1.shard1.replica_n1' (registry 'solr.core.collection1.shard1.replica_n1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@52b8757f
   [junit4]   2> 942123 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory solr.hdfs.home=hdfs://localhost.localdomain:42279/solr_hdfs_home
   [junit4]   2> 942123 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Solr Kerberos Authentication disabled
   [junit4]   2> 942123 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrCore [[collection1_shard1_replica_n1] ] Opening new SolrCore at [/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J0/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J0/temp/solr.index.hdfs.CheckHdfsIndexTest_3EFEB584EE0CBD8F-001/shard-1-001/cores/collection1_shard1_replica_n1], dataDir=[hdfs://localhost.localdomain:42279/solr_hdfs_home/collection1/core_node2/data/]
   [junit4]   2> 942124 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:42279/solr_hdfs_home/collection1/core_node2/data/snapshot_metadata
   [junit4]   2> 942131 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 942131 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [4194304] will allocate [1] slabs and use ~[4194304] bytes
   [junit4]   2> 942131 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Creating new single instance HDFS BlockCache
   [junit4]   2> 942134 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 942135 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:42279/solr_hdfs_home/collection1/core_node2/data
   [junit4]   2> 942151 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:42279/solr_hdfs_home/collection1/core_node2/data/index
   [junit4]   2> 942156 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 942156 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [4194304] will allocate [1] slabs and use ~[4194304] bytes
   [junit4]   2> 942156 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Creating new single instance HDFS BlockCache
   [junit4]   2> 942160 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 942160 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogDocMergePolicy: [LogDocMergePolicy: minMergeSize=1000, mergeFactor=33, maxMergeSize=9223372036854775807, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=false, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=1.0]
   [junit4]   2> 942574 WARN  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 942597 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.HdfsUpdateLog
   [junit4]   2> 942597 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 942597 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.HdfsUpdateLog Initializing HdfsUpdateLog: tlogDfsReplication=2
   [junit4]   2> 942609 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 942609 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 942609 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogDocMergePolicy: [LogDocMergePolicy: minMergeSize=1000, mergeFactor=38, maxMergeSize=9223372036854775807, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=true, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.0]
   [junit4]   2> 942612 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.SolrIndexSearcher Opening [Searcher@2cfb6b13[collection1_shard1_replica_n1] main]
   [junit4]   2> 942612 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 942612 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 942613 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms
   [junit4]   2> 942613 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1665375147848105984
   [junit4]   2> 942614 INFO  (searcherExecutor-14315-thread-1-processing-n:127.0.0.1:40585_ x:collection1_shard1_replica_n1 c:collection1 s:shard1) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrCore [collection1_shard1_replica_n1] Registered new searcher Searcher@2cfb6b13[collection1_shard1_replica_n1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 942615 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection1/terms/shard1 to Terms{values={core_node2=0}, version=0}
   [junit4]   2> 942615 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/collection1/leaders/shard1
   [junit4]   2> 942616 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 942616 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 942616 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:40585/collection1_shard1_replica_n1/
   [junit4]   2> 942616 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 942616 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SyncStrategy http://127.0.0.1:40585/collection1_shard1_replica_n1/ has no replicas
   [junit4]   2> 942616 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/collection1/leaders/shard1/leader after winning as /collections/collection1/leader_elect/shard1/election/72186066210717705-core_node2-n_0000000000
   [junit4]   2> 942617 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:40585/collection1_shard1_replica_n1/ shard1
   [junit4]   2> 942718 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 942719 INFO  (qtp1941615062-27312) [n:127.0.0.1:40585_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf1&name=collection1_shard1_replica_n1&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1787
   [junit4]   2> 942721 INFO  (qtp1941615062-27310) [n:127.0.0.1:40585_ c:collection1    ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:40585_&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2} status=0 QTime=1817
   [junit4]   2> 942721 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Waiting to see 1 active replicas in collection: collection1
   [junit4]   2> 942821 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.SolrTestCaseJ4 ###Starting testChecksumsOnlyVerbose
   [junit4]   2> 942914 INFO  (OverseerCollectionConfigSetProcessor-72186066210717700-127.0.0.1:41907_-n_0000000000) [n:127.0.0.1:41907_     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000004 doesn't exist. Requestor may have disconnected from ZooKeeper
   [junit4]   2> 957655 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.s.h.HdfsDirectory Closing hdfs directory hdfs://localhost.localdomain:42279/solr
   [junit4]   2> 957656 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[3EFEB584EE0CBD8F]) [     ] o.a.s.SolrTestCaseJ4 ###Ending testChecksumsOnlyVerbose
   [junit4]   2> 957759 INFO  (closeThreadPool-14322-thread-2) [     ] o.a.s.c.CoreContainer Shutting down CoreContainer instance=1262992374
   [junit4]   2> 957759 INFO  (closeThreadPool-14322-thread-2) [     ] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:41907_
   [junit4]   2> 957759 INFO  (closeThreadPool-14322-thread-2) [     ] o.a.s.c.ZkController Publish this node as DOWN...
   [junit4]   2> 957759 INFO  (closeThreadPool-14322-thread-2) [     ] o.a.s.c.ZkController Publish node=127.0.0.1:41907_ as DOWN
   [junit4]   2> 957759 INFO  (closeThreadPool-14322-thread-1) [     ] o.a.s.c.CoreContainer Shutting down CoreContainer instance=703422287
   [junit4]   2> 957759 INFO  (closeThreadPool-14322-thread-1) [     ] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:40585_
   [junit4]   2> 957760 INFO  (closeThreadPool-14322-thread-1) [     ] o.a.s.c.ZkController Publish this node as DOWN...
   [junit4]   2> 957760 INFO  (closeThreadPool-14322-thread-1) [     ] o.a.s.c.ZkController Publish node=127.0.0.1:40585_ as DOWN
   [junit4]   2> 957760 INFO  (coreCloseExecutor-14329-thread-1) [n:127.0.0.1:41907_     ] o.a.s.c.SolrCore [control_collection_shard1_replica_n1]  CLOSING SolrCore org.apache.solr.core.SolrCore@56375f11
   [junit4]   2> 957760 INFO  (coreCloseExecutor-14329-thread-1) [n:127.0.0.1:41907_     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.control_collection.shard1.replica_n1 tag=SolrCore@56375f11
   [junit4]   2> 957760 INFO  (coreCloseExecutor-14329-thread-1) [n:127.0.0.1:41907_     ] o.a.s.m.r.SolrJmxReporter Closing reporter [org.apache.solr.metrics.reporters.SolrJmxReporter@5abbb3cf: rootName = null, domain = solr.core.control_collection.shard1.replica_n1, service url = null, agent id = null] for registry solr.core.control_collection.shard1.replica_n1/com.codahale.metrics.MetricRegistry@6969d59
   [junit4]   2> 957760 INFO  (zkCallback-14259-thread-3) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 957760 INFO  (zkCallback-14259-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 957760 INFO  (zkCallback-14259-thread-4) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 957760 INFO  (coreCloseExecutor-14330-thread-1) [n:127.0.0.1:40585_     ] o.a.s.c.SolrCore [collection1_shard1_replica_n1]  CLOSING SolrCore org.apache.solr.core.SolrCore@5de10a42
   [junit4]   2> 957760 INFO  (coreCloseExecutor-14330-thread-1) [n:127.0.0.1:40585_     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.collection1.shard1.replica_n1 tag=SolrCore@5de10a42
   [junit4]   2> 957760 INFO  (coreCloseExecutor-14330-thread-1) [n:127.0.0.1:40585_     ] o.a.s.m.r.SolrJmxReporter Closing reporter [org.apache.solr.metrics.reporters.SolrJmxReporter@62bbd19f: rootName = null, domain = solr.core.collection1.shard1.replica_n1, service url = null, agent id = null] for registry solr.core.collection1.shard1.replica_n1/com.codahale.metrics.MetricRegistry@44d390ae
   [junit4]   2> 957766 INFO  (coreCloseExecutor-14329-thread-1) [n:127.0.0.1:41907_     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.control_collection.shard1.leader tag=SolrCore@56375f11
   [junit4]   2> 957766 INFO  (coreCloseExecutor-14330-thread-1) [n:127.0.0.1:40585_     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.collection1.shard1.leader tag=SolrCore@5de10a42
   [junit4]   2> 957766 INFO  (coreCloseExecutor-14329-thread-1) [n:127.0.0.1:41907_     ] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close()  ... SKIPPED (unnecessary).
   [junit4]   2> 957767 INFO  (coreCloseExecutor-14330-thread-1) [n:127.0.0.1:40585_     ] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close()  ... SKIPPED (unnecessary).
   [junit4]   2> 957770 INFO  (coreCloseExecutor-14330-thread-1) [n:127.0.0.1:40585_     ] o.a.s.s.h.HdfsDirectory Closing hdfs directory hdfs://localhost.localdomain:42279/solr_hdfs_home/collection1/core_node2/data/snapshot_metadata
   [junit4]   2> 957770 INFO  (coreCloseExecutor-14329-thread-1) [n:127.0.0.1:41907_     ] o.a.s.s.h.HdfsDirectory Closing hdfs directory hdfs://localhost.localdomain:42279/solr_hdfs_home/control_collection/core_node2/data/index
   [junit4]   2> 957770 INFO  (coreCloseExecutor-14329-thread-1) [n:127.0.0.1:41907_     ] o.a.s.s.h.HdfsDirectory Closing hdfs directory hdfs://localhost.localdomain:42279/solr_hdfs_home/control_collection/core_node2/data/snapshot_metadata
   [junit4]   2> 957770 INFO  (coreCloseExecutor-14329-thread-1) [n:127.0.0.1:41907_     ] o.a.s.s.h.HdfsDirectory Closing hdfs directory hdfs://localhost.localdomain:42279/solr_hdfs_home/control_collection/core_node2/data
   [junit4]   2> 957770 INFO  (coreCloseExecutor-14330-thread-1) [n:127.0.0.1:40585_     ] o.a.s.s.h.HdfsDirectory Closing hdfs directory hdfs://localhost.localdomain:42279/solr_hdfs_home/collection1/core_node2/data/index
   [junit4]   2> 957770 INFO  (coreCloseExecutor-14330-thread-1) [n:127.0.0.1:40585_     ] o.a.s.s.h.HdfsDirectory Closing hdfs directory hdfs://localhost.localdomain:42279/solr_hdfs_home/collection1/core_node2/data
   [junit4]   2> 957770 ERROR (OldIndexDirectoryCleanupThreadForCore-control_collection_shard1_replica_n1) [     ] o.a.s.c.HdfsDirectoryFactory Error checking if path hdfs://localhost.localdomain:42279/solr_hdfs_home/control_collection/core_node2/data/snapshot_metadata is an old index directory, caused by: java.io.IOException: Filesystem closed
   [junit4]   2> 957770 ERROR (OldIndexDirectoryCleanupThreadForCore-collection1_shard1_replica_n1) [     ] o.a.s.c.HdfsDirectoryFactory Error checking if path hdfs://localhost.localdomain:42279/solr_hdfs_home/collection1/core_node2/data/tlog is an old index directory, caused by: java.io.IOException: Filesystem closed
   [junit4]   2> 957770 ERROR (OldIndexDirectoryCleanupThreadForCore-control_collection_shard1_replica_n1) [     ] o.a.s.c.HdfsDirectoryFactory Error checking if path hdfs://localhost.localdomain:42279/solr_hdfs_home/control_collection/core_node2/data/tlog is an old index directory, caused by: java.io.IOException: Filesystem closed
   [junit4]   2> 957770 INFO  (closeThreadPool-14322-thread-2) [     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node tag=null
   [junit4]   2> 957770 INFO  (closeThreadPool-14322-thread-2) [     ] o.a.s.m.r.SolrJmxReporter Closing reporter [org.apache.solr.metrics.reporters.SolrJmxReporter@7a7207c: rootName = null, domain = solr.node, service url = null, agent id = null] for registry solr.node/com.codahale.metrics.MetricRegistry@5663cff
   [junit4]   2> 957771 INFO  (closeThreadPool-14322-thread-1) [     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node tag=null
   [junit4]   2> 957771 INFO  (closeThreadPool-14322-thread-1) [     ] o.a.s.m.r.SolrJmxReporter Closing reporter [org.apache.solr.metrics.reporters.SolrJmxReporter@72d30dec: rootName = null, domain = solr.node, service url = null, agent id = null] for registry solr.node/com.codahale.metrics.MetricRegistry@16b9194c
   [junit4]   2> 957773 INFO  (closeThreadPool-14322-thread-2) [     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jvm tag=null
   [junit4]   2> 957773 INFO  (closeThreadPool-14322-thread-2) [     ] o.a.s.m.r.SolrJmxReporter Closing reporter [org.apache.solr.metrics.reporters.SolrJmxReporter@74f87f85: rootName = null, domain = solr.jvm, service url = null, agent id = null] for registry solr.jvm/com.codahale.metrics.MetricRegistry@311289b7
   [junit4]   2> 957774 INFO  (closeThreadPool-14322-thread-2) [     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.jetty tag=null
   [junit4]   2> 957774 INFO  (closeThreadPool-14322-thread-2) [     ] o.a.s.m.r.SolrJmxReporter Closing reporter [org.apache.solr.metrics.reporters.SolrJmxReporter@783add68: rootName = null, domain = solr.jetty, service url = null, agent id = null] for registry solr.jetty/com.codahale.metrics.MetricRegistry@62fbe063
   [junit4]   2> 957774 INFO  (closeThreadPool-14322-thread-2) [     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.cluster tag=null
   [junit4]   2> 957775 INFO  (closeThreadPool-14333-thread-2) [     ] o.a.s.c.Overseer Overseer (id=72186066210717700-127.0.0.1:41907_-n_0000000000) closing
   [junit4]   2> 957775 INFO  (OverseerStateUpdate-72186066210717700-127.0.0.1:41907_-n_0000000000) [n:127.0

[...truncated too long message...]

errupted
   [junit4]   2> 1097210 WARN  (BP-228111201-127.0.0.1-1588225500131 heartbeating to localhost.localdomain/127.0.0.1:42279) [     ] o.a.h.h.s.d.DataNode Ending block pool service for: Block pool BP-228111201-127.0.0.1-1588225500131 (Datanode Uuid e9e72139-37c9-4d21-9747-5ffd727a7dd7) service to localhost.localdomain/127.0.0.1:42279
   [junit4]   2> 1097222 INFO  (SUITE-CheckHdfsIndexTest-seed#[3EFEB584EE0CBD8F]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.w.WebAppContext@1aeb5e27{hdfs,/,null,UNAVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/hdfs}
   [junit4]   2> 1097223 INFO  (SUITE-CheckHdfsIndexTest-seed#[3EFEB584EE0CBD8F]-worker) [     ] o.e.j.s.AbstractConnector Stopped ServerConnector@3baf4231{HTTP/1.1, (http/1.1)}{localhost.localdomain:0}
   [junit4]   2> 1097223 INFO  (SUITE-CheckHdfsIndexTest-seed#[3EFEB584EE0CBD8F]-worker) [     ] o.e.j.s.session node0 Stopped scavenging
   [junit4]   2> 1097223 INFO  (SUITE-CheckHdfsIndexTest-seed#[3EFEB584EE0CBD8F]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@31c852{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,UNAVAILABLE}
   [junit4]   2> NOTE: leaving temporary files on disk at: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J0/temp/solr.index.hdfs.CheckHdfsIndexTest_3EFEB584EE0CBD8F-001
   [junit4]   2> Apr 30, 2020 5:47:40 AM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks
   [junit4]   2> WARNING: Will linger awaiting termination of 65 leaked thread(s).
   [junit4]   2> NOTE: test params are: codec=Asserting(Lucene84): {date=BlockTreeOrds(blocksize=128), rnd_b=Lucene84, field=BlockTreeOrds(blocksize=128), docid=Lucene84, multiDefault=PostingsFormat(name=MockRandom), _root_=BlockTreeOrds(blocksize=128), titleTokenized=PostingsFormat(name=LuceneVarGapFixedInterval), id=Lucene84, body=BlockTreeOrds(blocksize=128), title=PostingsFormat(name=MockRandom)}, docValues:{docid_intDV=DocValuesFormat(name=Asserting), range_facet_l_dv=DocValuesFormat(name=Direct), _version_=DocValuesFormat(name=Lucene80), range_facet_i_dv=DocValuesFormat(name=Lucene80), intDvoDefault=DocValuesFormat(name=Asserting), titleDV=DocValuesFormat(name=Lucene80), timestamp=DocValuesFormat(name=Lucene80)}, maxPointsInLeafNode=858, maxMBSortInHeap=5.708371703442165, sim=Asserting(org.apache.lucene.search.similarities.AssertingSimilarity@78d88018), locale=nl-BE, timezone=Pacific/Kosrae
   [junit4]   2> NOTE: Linux 5.3.0-46-generic amd64/AdoptOpenJDK 13.0.2 (64-bit)/cpus=16,threads=5,free=118845952,total=526385152
   [junit4]   2> NOTE: All tests run in this JVM: [TestNumericTerms32, TestCSVResponseWriter, TestCloudPseudoReturnFields, HttpPartitionWithTlogReplicasTest, TestRecoveryHdfs, TestDistribIDF, InfixSuggestersTest, TestFieldCollectionResource, TestSlowCompositeReaderWrapper, TestRuleBasedAuthorizationPlugin, LeaderTragicEventTest, CollectionStateFormat2Test, PackageManagerCLITest, TestSolrCoreProperties, BasicAuthIntegrationTest, TestValueSourceCache, TestCloudJSONFacetJoinDomain, AuthToolTest, TestSchemaSimilarityResource, DataDrivenBlockJoinTest, CoreAdminHandlerTest, SolrCLIZkUtilsTest, SearchHandlerTest, V2StandaloneTest, CollectionReloadTest, CreateRoutedAliasTest, SignificantTermsQParserPluginTest, CurrencyFieldTypeTest, TriggerCooldownIntegrationTest, HdfsSyncSliceTest, TestSafeXMLParsing, TestNonDefinedSimilarityFactory, TlogReplayBufferedWhileIndexingTest, TestQueryingOnDownCollection, HLLSerializationTest, TestGroupingSearch, ActionThrottleTest, FileUtilsTest, DistributedFacetPivotLongTailTest, TestConfigOverlay, TestJsonFacetsWithNestedObjects, TestFieldCacheReopen, SolrPluginUtilsTest, TestStressReorder, HdfsChaosMonkeyNothingIsSafeTest, ReplaceNodeNoTargetTest, ZkFailoverTest, DocValuesNotIndexedTest, TolerantUpdateProcessorTest, DistributedQueryComponentOptimizationTest, OverseerRolesTest, TestSolrJ, TestBinaryField, HighlighterMaxOffsetTest, TestPivotHelperCode, SolrShardReporterTest, JWTVerificationkeyResolverTest, SolrCoreCheckLockOnStartupTest, BJQFilterAccessibleTest, ClassificationUpdateProcessorTest, TestUseDocValuesAsStored2, TestChildDocTransformerHierarchy, EchoParamsTest, TestNRTOpen, TestFacetMethods, FieldMutatingUpdateProcessorTest, SolrIndexConfigTest, TestManagedResourceStorage, TestCorePropertiesReload, CachingDirectoryFactoryTest, TestSnapshotCloudManager, NestedShardedAtomicUpdateTest, TestLegacyNumericUtils, MetricsHistoryHandlerTest, CloudExitableDirectoryReaderTest, CollectionsAPISolrJTest, RemoteQueryErrorTest, 
RestartWhileUpdatingTest, RollingRestartTest, SSLMigrationTest, SaslZkACLProviderTest, ShardRoutingCustomTest, TestConfigSetsAPIExclusivity, TestDownShardTolerantSearch, TestRandomRequestDistribution, TestShortCircuitedRequests, TestStressInPlaceUpdates, TestZkChroot, CollectionsAPIDistributedZkTest, AutoAddReplicasPlanActionTest, IndexSizeTriggerMixedBoundsTest, TestDynamicURP, TestImplicitCoreProperties, TestSolrConfigHandler, TestRestoreCore, TestSQLHandlerNonCloud, TestStressThreadBackup, TestSystemCollAutoCreate, CoreAdminRequestStatusTest, HealthCheckHandlerTest, PropertiesRequestHandlerTest, SegmentsInfoRequestHandlerTest, SplitHandlerTest, SystemInfoHandlerTest, TestCollectionAPIs, TestCoreAdminApis, ZookeeperStatusHandlerTest, BadComponentTest, CustomTermsComponentTest, DistributedSpellCheckComponentTest, TermVectorComponentDistributedTest, TestHttpShardHandlerFactory, TaggerTest, XmlInterpolationTest, HighlighterConfigTest, WrapperMergePolicyFactoryTest, CheckHdfsIndexTest]
   [junit4] Completed [905/907 (1!)] on J0 in 163.37s, 5 tests, 1 error, 1 skipped <<< FAILURES!

[...truncated 38093 lines...]
-ecj-javadoc-lint-src:
    [mkdir] Created dir: /tmp/ecj345313962
 [ecj-lint] Compiling 931 source files to /tmp/ecj345313962
 [ecj-lint] ----------
 [ecj-lint] 1. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/codecs/CodecUtil.java (at line 523)
 [ecj-lint] 	throw new CorruptIndexException("misplaced codec footer (file truncated?): length=" + in.length() + " but footerLength==" + footerLength(), input);
 [ecj-lint] 	^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 [ecj-lint] Resource leak: 'in' is not closed at this location
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 2. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingStoredFieldsReader.java (at line 166)
 [ecj-lint] 	FieldsIndexReader fieldsIndexReader = new FieldsIndexReader(d, si.name, segmentSuffix, INDEX_EXTENSION_PREFIX, INDEX_CODEC_NAME, si.getId());
 [ecj-lint] 	                  ^^^^^^^^^^^^^^^^^
 [ecj-lint] Resource leak: 'fieldsIndexReader' is never closed
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 3. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingTermVectorsReader.java (at line 148)
 [ecj-lint] 	FieldsIndexReader fieldsIndexReader = new FieldsIndexReader(d, si.name, segmentSuffix, VECTORS_INDEX_EXTENSION_PREFIX, VECTORS_INDEX_CODEC_NAME, si.getId());
 [ecj-lint] 	                  ^^^^^^^^^^^^^^^^^
 [ecj-lint] Resource leak: 'fieldsIndexReader' is never closed
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 4. ERROR in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/search/IndexSearcher.java (at line 50)
 [ecj-lint] 	import org.apache.lucene.util.automaton.ByteRunAutomaton;
 [ecj-lint] 	       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 [ecj-lint] The import org.apache.lucene.util.automaton.ByteRunAutomaton is never used
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 5. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/util/automaton/Operations.java (at line 742)
 [ecj-lint] 	Integer q = newstate.get(statesSet);
 [ecj-lint] 	                         ^^^^^^^^^
 [ecj-lint] Unlikely argument type SortedIntSet for get(Object) on a Map<SortedIntSet.FrozenIntSet,Integer>
 [ecj-lint] ----------
 [ecj-lint] 5 problems (1 error, 4 warnings)
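The three "Resource leak" warnings above all follow the same shape: an AutoCloseable (an IndexInput or a FieldsIndexReader) is opened and an exception path can exit the method before it is closed. The sketch below, with hypothetical names rather than the actual Lucene code, shows the pattern ECJ flags and the try-with-resources form that satisfies the checker.

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;

public class LeakSketch {
    // Flagged shape: 'in' is never closed on the exception path,
    // which is what ECJ reports as "Resource leak: 'in' is not closed".
    static int readFirstLeaky(byte[] data) throws Exception {
        InputStream in = new ByteArrayInputStream(data);
        if (data.length == 0) {
            throw new IllegalStateException("empty"); // 'in' leaks here
        }
        return in.read();
    }

    // Clean shape: try-with-resources closes the stream on every path,
    // including the throwing one, so the warning goes away.
    static int readFirstSafe(byte[] data) throws Exception {
        try (InputStream in = new ByteArrayInputStream(data)) {
            if (data.length == 0) {
                throw new IllegalStateException("empty");
            }
            return in.read();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(readFirstSafe(new byte[] {42}));
    }
}
```

Note that in Lucene the leaky form is sometimes deliberate (ownership of the input is handed to the thrown CorruptIndexException's caller), which is why these surface as warnings rather than errors; only the unused import in IndexSearcher.java is the error that fails the lint target.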

BUILD FAILED
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/build.xml:634: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/build.xml:101: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build.xml:201: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/common-build.xml:2127: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/common-build.xml:2166: Compile failed; see the compiler error output for details.

Total time: 36 minutes 38 seconds
Build step 'Invoke Ant' marked build as failure
Archiving artifacts
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
[WARNINGS] Skipping publisher since build result is FAILURE
Recording test results
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2

[JENKINS] Lucene-Solr-8.x-Linux (64bit/jdk-12.0.2) - Build # 2916 - Failure!

Posted by Policeman Jenkins Server <je...@thetaphi.de>.
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-8.x-Linux/2916/
Java: 64bit/jdk-12.0.2 -XX:+UseCompressedOops -XX:+UseG1GC

1 tests failed.
FAILED:  org.apache.solr.index.hdfs.CheckHdfsIndexTest.doTest

Error Message:
Error from server at https://127.0.0.1:32991/collection1: java.lang.NullPointerException  at org.apache.solr.handler.admin.SystemInfoHandler.getSecurityInfo(SystemInfoHandler.java:326)  at org.apache.solr.handler.admin.SystemInfoHandler.handleRequestBody(SystemInfoHandler.java:146)  at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:211)  at org.apache.solr.core.SolrCore.execute(SolrCore.java:2600)  at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:803)  at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:582)  at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:432)  at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:362)  at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)  at org.apache.solr.client.solrj.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:166)  at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)  at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:545)  at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)  at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1610)  at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)  at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1300)  at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)  at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485)  at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1580)  at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)  at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1215)  at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)  at 
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)  at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:322)  at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:767)  at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)  at org.eclipse.jetty.server.Server.handle(Server.java:500)  at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383)  at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:547)  at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375)  at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:273)  at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)  at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)  at org.eclipse.jetty.io.ssl.SslConnection$1.run(SslConnection.java:146)  at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806)  at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938)  at java.base/java.lang.Thread.run(Thread.java:835) 

Stack Trace:
org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at https://127.0.0.1:32991/collection1: java.lang.NullPointerException
	at org.apache.solr.handler.admin.SystemInfoHandler.getSecurityInfo(SystemInfoHandler.java:326)
	at org.apache.solr.handler.admin.SystemInfoHandler.handleRequestBody(SystemInfoHandler.java:146)
	at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:211)
	at org.apache.solr.core.SolrCore.execute(SolrCore.java:2600)
	at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:803)
	at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:582)
	at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:432)
	at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:362)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)
	at org.apache.solr.client.solrj.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:166)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:545)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1610)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1300)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1580)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1215)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:322)
	at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:767)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:500)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:547)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:273)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
	at org.eclipse.jetty.io.ssl.SslConnection$1.run(SslConnection.java:146)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938)
	at java.base/java.lang.Thread.run(Thread.java:835)

	at __randomizedtesting.SeedInfo.seed([65A7948048252E96:C2E32C24259E3D2F]:0)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:665)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:265)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:248)
	at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:211)
	at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:1003)
	at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:1018)
	at org.apache.solr.index.hdfs.CheckHdfsIndexTest.doTest(CheckHdfsIndexTest.java:120)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:567)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988)
	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1081)
	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:1053)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
	at java.base/java.lang.Thread.run(Thread.java:835)




Build Log:
[...truncated 14607 lines...]
   [junit4] Suite: org.apache.solr.index.hdfs.CheckHdfsIndexTest
   [junit4]   2> 521068 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.a.s.SolrTestCase Setting 'solr.default.confdir' system property to test-framework derived value of '/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/server/solr/configsets/_default/conf'
   [junit4]   2> 521068 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.a.s.SolrTestCaseJ4 Created dataDir: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_65A7948048252E96-001/data-dir-63-001
   [junit4]   2> 521068 WARN  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.a.s.SolrTestCaseJ4 startTrackingSearchers: numOpens=19 numCloses=19
   [junit4]   2> 521068 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.a.s.SolrTestCaseJ4 Using PointFields (NUMERIC_POINTS_SYSPROP=true) w/NUMERIC_DOCVALUES_SYSPROP=false
   [junit4]   2> 521069 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.a.s.SolrTestCaseJ4 Randomized ssl (true) and clientAuth (false) via: @org.apache.solr.util.RandomizeSSL(reason="", value=0.0/0.0, ssl=0.0/0.0, clientAuth=0.0/0.0)
   [junit4]   2> 521069 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.a.s.SolrTestCaseJ4 SecureRandom sanity checks: test.solr.allowed.securerandom=null & java.security.egd=file:/dev/./urandom
   [junit4]   2> 521069 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.a.s.BaseDistributedSearchTestCase Setting hostContext system property: /
   [junit4]   1> Formatting using clusterid: testClusterID
   [junit4]   2> 521098 WARN  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
   [junit4]   2> 521100 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 12.0.2+10
   [junit4]   2> 521113 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 521113 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 521113 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.e.j.s.session node0 Scavenging every 600000ms
   [junit4]   2> 521124 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@2bc5b07b{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,AVAILABLE}
   [junit4]   2> 521231 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.w.WebAppContext@710ef0a7{hdfs,/,file:///home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/jetty-localhost_localdomain-33367-hadoop-hdfs-3_2_0-tests_jar-_-any-3981994019931771010.dir/webapp/,AVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/hdfs}
   [junit4]   2> 521239 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.e.j.s.AbstractConnector Started ServerConnector@1c0450d5{HTTP/1.1, (http/1.1)}{localhost.localdomain:33367}
   [junit4]   2> 521240 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.e.j.s.Server Started @521256ms
   [junit4]   2> 521320 WARN  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
   [junit4]   2> 521321 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 12.0.2+10
   [junit4]   2> 521331 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 521331 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 521331 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.e.j.s.session node0 Scavenging every 600000ms
   [junit4]   2> 521336 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@df3c027{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,AVAILABLE}
   [junit4]   2> 521448 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.w.WebAppContext@f6c1877{datanode,/,file:///home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/jetty-localhost-44659-hadoop-hdfs-3_2_0-tests_jar-_-any-13939478604313969489.dir/webapp/,AVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/datanode}
   [junit4]   2> 521448 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.e.j.s.AbstractConnector Started ServerConnector@b1dea15{HTTP/1.1, (http/1.1)}{localhost:44659}
   [junit4]   2> 521448 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.e.j.s.Server Started @521464ms
   [junit4]   2> 521492 WARN  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
   [junit4]   2> 521493 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 12.0.2+10
   [junit4]   2> 521494 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 521494 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 521494 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 521494 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@3fdb9448{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,AVAILABLE}
   [junit4]   2> 521554 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0xb3cd75267a7da53c: Processing first storage report for DS-025a1804-3564-4274-8df0-9424d870af4d from datanode 31472468-e6d0-4b27-b838-96069df90346
   [junit4]   2> 521554 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0xb3cd75267a7da53c: from storage DS-025a1804-3564-4274-8df0-9424d870af4d node DatanodeRegistration(127.0.0.1:43451, datanodeUuid=31472468-e6d0-4b27-b838-96069df90346, infoPort=39295, infoSecurePort=0, ipcPort=43063, storageInfo=lv=-57;cid=testClusterID;nsid=1516725826;c=1588217213343), blocks: 0, hasStaleStorage: true, processing time: 0 msecs, invalidatedBlocks: 0
   [junit4]   2> 521554 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0xb3cd75267a7da53c: Processing first storage report for DS-c5918e7b-2364-42a4-a87e-b44583fce1cd from datanode 31472468-e6d0-4b27-b838-96069df90346
   [junit4]   2> 521554 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0xb3cd75267a7da53c: from storage DS-c5918e7b-2364-42a4-a87e-b44583fce1cd node DatanodeRegistration(127.0.0.1:43451, datanodeUuid=31472468-e6d0-4b27-b838-96069df90346, infoPort=39295, infoSecurePort=0, ipcPort=43063, storageInfo=lv=-57;cid=testClusterID;nsid=1516725826;c=1588217213343), blocks: 0, hasStaleStorage: false, processing time: 0 msecs, invalidatedBlocks: 0
   [junit4]   2> 521595 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.w.WebAppContext@3995225d{datanode,/,file:///home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/jetty-localhost-34425-hadoop-hdfs-3_2_0-tests_jar-_-any-18195628476186436189.dir/webapp/,AVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/datanode}
   [junit4]   2> 521596 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.e.j.s.AbstractConnector Started ServerConnector@adf0911{HTTP/1.1, (http/1.1)}{localhost:34425}
   [junit4]   2> 521596 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.e.j.s.Server Started @521612ms
   [junit4]   2> 521681 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0xcced72f3fd895a42: Processing first storage report for DS-807d12ac-96a4-4db5-87b7-d0203f45706a from datanode 5c92dee9-ed0f-448b-9d6f-8edd3cb4a093
   [junit4]   2> 521681 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0xcced72f3fd895a42: from storage DS-807d12ac-96a4-4db5-87b7-d0203f45706a node DatanodeRegistration(127.0.0.1:39073, datanodeUuid=5c92dee9-ed0f-448b-9d6f-8edd3cb4a093, infoPort=43387, infoSecurePort=0, ipcPort=39445, storageInfo=lv=-57;cid=testClusterID;nsid=1516725826;c=1588217213343), blocks: 0, hasStaleStorage: true, processing time: 0 msecs, invalidatedBlocks: 0
   [junit4]   2> 521681 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0xcced72f3fd895a42: Processing first storage report for DS-fa80186a-bfe7-425c-9de4-1418ffd0da95 from datanode 5c92dee9-ed0f-448b-9d6f-8edd3cb4a093
   [junit4]   2> 521681 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0xcced72f3fd895a42: from storage DS-fa80186a-bfe7-425c-9de4-1418ffd0da95 node DatanodeRegistration(127.0.0.1:39073, datanodeUuid=5c92dee9-ed0f-448b-9d6f-8edd3cb4a093, infoPort=43387, infoSecurePort=0, ipcPort=39445, storageInfo=lv=-57;cid=testClusterID;nsid=1516725826;c=1588217213343), blocks: 0, hasStaleStorage: false, processing time: 0 msecs, invalidatedBlocks: 0
   [junit4]   2> 521750 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.ZkTestServer STARTING ZK TEST SERVER
   [junit4]   2> 521750 INFO  (ZkTestServer Run Thread) [     ] o.a.s.c.ZkTestServer client port:0.0.0.0/0.0.0.0:0
   [junit4]   2> 521751 INFO  (ZkTestServer Run Thread) [     ] o.a.s.c.ZkTestServer Starting server
   [junit4]   2> 521850 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.ZkTestServer start zk server on port:38025
   [junit4]   2> 521851 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.ZkTestServer waitForServerUp: 127.0.0.1:38025
   [junit4]   2> 521851 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.ZkTestServer parse host and port list: 127.0.0.1:38025
   [junit4]   2> 521851 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.ZkTestServer connecting to 127.0.0.1 38025
   [junit4]   2> 521852 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 521854 INFO  (zkConnectionManagerCallback-4632-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 521854 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 521855 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 521857 INFO  (zkConnectionManagerCallback-4634-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 521857 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 521858 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/solrconfig-tlog.xml to /configs/conf1/solrconfig.xml
   [junit4]   2> 521859 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/schema.xml to /configs/conf1/schema.xml
   [junit4]   2> 521861 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/solrconfig.snippet.randomindexconfig.xml to /configs/conf1/solrconfig.snippet.randomindexconfig.xml
   [junit4]   2> 521862 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/stopwords.txt to /configs/conf1/stopwords.txt
   [junit4]   2> 521863 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/protwords.txt to /configs/conf1/protwords.txt
   [junit4]   2> 521863 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/currency.xml to /configs/conf1/currency.xml
   [junit4]   2> 521864 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/enumsConfig.xml to /configs/conf1/enumsConfig.xml
   [junit4]   2> 521864 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/open-exchange-rates.json to /configs/conf1/open-exchange-rates.json
   [junit4]   2> 521865 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/mapping-ISOLatin1Accent.txt to /configs/conf1/mapping-ISOLatin1Accent.txt
   [junit4]   2> 521865 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/old_synonyms.txt to /configs/conf1/old_synonyms.txt
   [junit4]   2> 521866 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/synonyms.txt to /configs/conf1/synonyms.txt
   [junit4]   2> 521867 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 521868 INFO  (zkConnectionManagerCallback-4638-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 521868 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 521969 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Will use NRT replicas unless explicitly asked otherwise
   [junit4]   2> 522064 WARN  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 522064 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 522064 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ...
   [junit4]   2> 522064 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 12.0.2+10
   [junit4]   2> 522072 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 522072 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 522072 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 522072 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@4b3f5bc2{/,null,AVAILABLE}
   [junit4]   2> 522083 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.e.j.s.AbstractConnector Started ServerConnector@5dbc9e8b{ssl, (ssl, alpn, http/1.1, h2)}{127.0.0.1:46099}
   [junit4]   2> 522084 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.e.j.s.Server Started @522100ms
   [junit4]   2> 522084 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/, solr.data.dir=hdfs://localhost.localdomain:37601/hdfs__localhost.localdomain_37601__home_jenkins_workspace_Lucene-Solr-8.x-Linux_solr_build_solr-core_test_J3_temp_solr.index.hdfs.CheckHdfsIndexTest_65A7948048252E96-001_tempDir-002_control_data, hostPort=46099, coreRootDirectory=/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_65A7948048252E96-001/control-001/cores}
   [junit4]   2> 522084 ERROR (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 522084 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 522084 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 8.6.0
   [junit4]   2> 522084 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 522084 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr
   [junit4]   2> 522084 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-04-30T03:26:54.344591Z
   [junit4]   2> 522090 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 522090 INFO  (zkConnectionManagerCallback-4640-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 522090 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 522191 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 522191 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.SolrXmlConfig Loading container configuration from /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_65A7948048252E96-001/control-001/solr.xml
   [junit4]   2> 522194 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 522194 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 522195 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@1bc7ff73, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 522236 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=false]
   [junit4]   2> 522237 WARN  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@6aef27ec[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 522237 WARN  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@6aef27ec[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 522239 WARN  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@4a6ea826[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 522239 WARN  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@4a6ea826[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 522240 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:38025/solr
   [junit4]   2> 522241 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 522242 INFO  (zkConnectionManagerCallback-4651-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 522242 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 522343 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [n:127.0.0.1:46099_     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 522344 INFO  (zkConnectionManagerCallback-4653-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 522344 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [n:127.0.0.1:46099_     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 522389 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [n:127.0.0.1:46099_     ] o.a.s.c.OverseerElectionContext I am going to be the leader 127.0.0.1:46099_
   [junit4]   2> 522389 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [n:127.0.0.1:46099_     ] o.a.s.c.Overseer Overseer (id=72185523145736197-127.0.0.1:46099_-n_0000000000) starting
   [junit4]   2> 522391 INFO  (OverseerStateUpdate-72185523145736197-127.0.0.1:46099_-n_0000000000) [n:127.0.0.1:46099_     ] o.a.s.c.Overseer Starting to work on the main queue : 127.0.0.1:46099_
   [junit4]   2> 522391 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [n:127.0.0.1:46099_     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:46099_
   [junit4]   2> 522392 INFO  (OverseerStateUpdate-72185523145736197-127.0.0.1:46099_-n_0000000000) [n:127.0.0.1:46099_     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 522393 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [n:127.0.0.1:46099_     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 522393 WARN  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [n:127.0.0.1:46099_     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled  authorization=disabled.  Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info.
   [junit4]   2> 522404 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [n:127.0.0.1:46099_     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 522416 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [n:127.0.0.1:46099_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@1bc7ff73
   [junit4]   2> 522422 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [n:127.0.0.1:46099_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@1bc7ff73
   [junit4]   2> 522422 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [n:127.0.0.1:46099_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@1bc7ff73
   [junit4]   2> 522423 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [n:127.0.0.1:46099_     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_65A7948048252E96-001/control-001/cores
   [junit4]   2> 522428 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 522428 INFO  (zkConnectionManagerCallback-4670-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 522428 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 522429 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 522429 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:38025/solr ready
   [junit4]   2> 522439 INFO  (qtp265242249-8809) [n:127.0.0.1:46099_     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params collection.configName=conf1&name=control_collection&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:46099_&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 522444 INFO  (OverseerThreadFactory-4660-thread-1-processing-n:127.0.0.1:46099_) [n:127.0.0.1:46099_     ] o.a.s.c.a.c.CreateCollectionCmd Create collection control_collection
   [junit4]   2> 522557 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 522558 INFO  (qtp265242249-8813) [n:127.0.0.1:46099_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 522561 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_    x:control_collection_shard1_replica_n1 ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&collection=control_collection&shard=shard1&wt=javabin&version=2&replicaType=NRT
   [junit4]   2> 522562 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_    x:control_collection_shard1_replica_n1 ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 523568 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 8.6.0
   [junit4]   2> 523579 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.IndexSchema Schema name=test
   [junit4]   2> 523651 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
   [junit4]   2> 523661 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.CoreContainer Creating SolrCore 'control_collection_shard1_replica_n1' using configuration from configset conf1, trusted=true
   [junit4]   2> 523661 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.control_collection.shard1.replica_n1' (registry 'solr.core.control_collection.shard1.replica_n1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@1bc7ff73
   [junit4]   2> 523663 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory solr.hdfs.home=hdfs://localhost.localdomain:37601/solr_hdfs_home
   [junit4]   2> 523664 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Solr Kerberos Authentication disabled
   [junit4]   2> 523664 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrCore [[control_collection_shard1_replica_n1] ] Opening new SolrCore at [/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_65A7948048252E96-001/control-001/cores/control_collection_shard1_replica_n1], dataDir=[hdfs://localhost.localdomain:37601/solr_hdfs_home/control_collection/core_node2/data/]
   [junit4]   2> 523664 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:37601/solr_hdfs_home/control_collection/core_node2/data/snapshot_metadata
   [junit4]   2> 523674 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 523674 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [33554432] will allocate [1] slabs and use ~[33554432] bytes
   [junit4]   2> 523674 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Creating new global HDFS BlockCache
   [junit4]   2> 523705 WARN  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.h.HdfsDirectory The NameNode is in SafeMode - Solr will wait 5 seconds and try again.
   [junit4]   2> 528706 WARN  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.h.HdfsDirectory The NameNode is in SafeMode - Solr will wait 5 seconds and try again.
   [junit4]   2> 533713 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 533717 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:37601/solr_hdfs_home/control_collection/core_node2/data
   [junit4]   2> 533737 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:37601/solr_hdfs_home/control_collection/core_node2/data/index
   [junit4]   2> 533743 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 533743 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [33554432] will allocate [1] slabs and use ~[33554432] bytes
   [junit4]   2> 533747 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 533747 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=38, maxMergeAtOnceExplicit=49, maxMergedSegmentMB=75.2119140625, floorSegmentMB=0.6337890625, forceMergeDeletesPctAllowed=2.093891609271097, segmentsPerTier=44.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.839941937737286, deletesPctAllowed=22.17795614484704
   [junit4]   2> 533779 WARN  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 533842 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.HdfsUpdateLog
   [junit4]   2> 533842 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 533842 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.HdfsUpdateLog Initializing HdfsUpdateLog: tlogDfsReplication=2
   [junit4]   2> 533858 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 533858 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 533860 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=41, maxMergeAtOnceExplicit=45, maxMergedSegmentMB=42.8203125, floorSegmentMB=1.080078125, forceMergeDeletesPctAllowed=12.737292583859347, segmentsPerTier=20.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.0, deletesPctAllowed=31.37838405826533
   [junit4]   2> 533925 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.SolrIndexSearcher Opening [Searcher@790bfc8d[control_collection_shard1_replica_n1] main]
   [junit4]   2> 533926 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 533926 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 533929 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms
   [junit4]   2> 533930 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1665366466169405440
   [junit4]   2> 533932 INFO  (searcherExecutor-4672-thread-1-processing-n:127.0.0.1:46099_ x:control_collection_shard1_replica_n1 c:control_collection s:shard1) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrCore [control_collection_shard1_replica_n1] Registered new searcher Searcher@790bfc8d[control_collection_shard1_replica_n1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 533938 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ZkShardTerms Successful update of terms at /collections/control_collection/terms/shard1 to Terms{values={core_node2=0}, version=0}
   [junit4]   2> 533938 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/control_collection/leaders/shard1
   [junit4]   2> 533939 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 533939 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 533939 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync replicas to https://127.0.0.1:46099/control_collection_shard1_replica_n1/
   [junit4]   2> 533939 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 533940 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy https://127.0.0.1:46099/control_collection_shard1_replica_n1/ has no replicas
   [junit4]   2> 533940 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/control_collection/leaders/shard1/leader after winning as /collections/control_collection/leader_elect/shard1/election/72185523145736197-core_node2-n_0000000000
   [junit4]   2> 533940 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I am the new leader: https://127.0.0.1:46099/control_collection_shard1_replica_n1/ shard1
   [junit4]   2> 533941 INFO  (zkCallback-4652-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 533941 INFO  (zkCallback-4652-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 533941 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 533942 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf1&newCollection=true&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&collection=control_collection&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=11381
   [junit4]   2> 533944 INFO  (qtp265242249-8809) [n:127.0.0.1:46099_     ] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas
   [junit4]   2> 534042 INFO  (zkCallback-4652-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 534042 INFO  (zkCallback-4652-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 534042 INFO  (zkCallback-4652-thread-3) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 534043 INFO  (qtp265242249-8809) [n:127.0.0.1:46099_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=control_collection&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:46099_&wt=javabin&version=2} status=0 QTime=11603
   [junit4]   2> 534043 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Waiting to see 1 active replicas in collection: control_collection
   [junit4]   2> 534159 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 534161 INFO  (zkConnectionManagerCallback-4681-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 534162 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 534169 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 534170 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:38025/solr ready
   [junit4]   2> 534170 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.ChaosMonkey monkey: init - expire sessions:false cause connection loss:false
   [junit4]   2> 534182 INFO  (qtp265242249-8809) [n:127.0.0.1:46099_     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params collection.configName=conf1&name=collection1&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=&stateFormat=2&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 534188 INFO  (OverseerThreadFactory-4660-thread-2-processing-n:127.0.0.1:46099_) [n:127.0.0.1:46099_     ] o.a.s.c.a.c.CreateCollectionCmd Create collection collection1
   [junit4]   2> 534188 INFO  (OverseerCollectionConfigSetProcessor-72185523145736197-127.0.0.1:46099_-n_0000000000) [n:127.0.0.1:46099_     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000000 doesn't exist. Requestor may have disconnected from ZooKeeper
   [junit4]   2> 534391 WARN  (OverseerThreadFactory-4660-thread-2-processing-n:127.0.0.1:46099_) [n:127.0.0.1:46099_     ] o.a.s.c.a.c.CreateCollectionCmd It is unusual to create a collection (collection1) without cores.
   [junit4]   2> 534394 INFO  (qtp265242249-8809) [n:127.0.0.1:46099_     ] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas
   [junit4]   2> 534395 INFO  (qtp265242249-8809) [n:127.0.0.1:46099_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=collection1&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=&stateFormat=2&wt=javabin&version=2} status=0 QTime=213
   [junit4]   2> 534396 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 534396 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 534396 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 534396 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 534396 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 534396 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 534396 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Creating jetty instances pullReplicaCount=0 numOtherReplicas=1
   [junit4]   2> 534482 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 1 in directory /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_65A7948048252E96-001/shard-1-001 of type NRT for shard1
   [junit4]   2> 534484 WARN  (closeThreadPool-4682-thread-1) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 534484 INFO  (closeThreadPool-4682-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 534484 INFO  (closeThreadPool-4682-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ...
   [junit4]   2> 534484 INFO  (closeThreadPool-4682-thread-1) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 12.0.2+10
   [junit4]   2> 534496 INFO  (closeThreadPool-4682-thread-1) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 534496 INFO  (closeThreadPool-4682-thread-1) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 534496 INFO  (closeThreadPool-4682-thread-1) [     ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 534496 INFO  (closeThreadPool-4682-thread-1) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@584fc037{/,null,AVAILABLE}
   [junit4]   2> 534503 INFO  (closeThreadPool-4682-thread-1) [     ] o.e.j.s.AbstractConnector Started ServerConnector@54c4829{ssl, (ssl, alpn, http/1.1, h2)}{127.0.0.1:39291}
   [junit4]   2> 534504 INFO  (closeThreadPool-4682-thread-1) [     ] o.e.j.s.Server Started @534520ms
   [junit4]   2> 534504 INFO  (closeThreadPool-4682-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/, solrconfig=solrconfig.xml, solr.data.dir=hdfs://localhost.localdomain:37601/hdfs__localhost.localdomain_37601__home_jenkins_workspace_Lucene-Solr-8.x-Linux_solr_build_solr-core_test_J3_temp_solr.index.hdfs.CheckHdfsIndexTest_65A7948048252E96-001_tempDir-002_jetty1, hostPort=39291, coreRootDirectory=/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_65A7948048252E96-001/shard-1-001/cores, replicaType=NRT}
   [junit4]   2> 534504 ERROR (closeThreadPool-4682-thread-1) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 534504 INFO  (closeThreadPool-4682-thread-1) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 534504 INFO  (closeThreadPool-4682-thread-1) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 8.6.0
   [junit4]   2> 534504 INFO  (closeThreadPool-4682-thread-1) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 534504 INFO  (closeThreadPool-4682-thread-1) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr
   [junit4]   2> 534504 INFO  (closeThreadPool-4682-thread-1) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-04-30T03:27:06.764442Z
   [junit4]   2> 534507 INFO  (closeThreadPool-4682-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 534511 INFO  (zkConnectionManagerCallback-4684-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 534511 INFO  (closeThreadPool-4682-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 534612 INFO  (closeThreadPool-4682-thread-1) [     ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 534612 INFO  (closeThreadPool-4682-thread-1) [     ] o.a.s.c.SolrXmlConfig Loading container configuration from /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_65A7948048252E96-001/shard-1-001/solr.xml
   [junit4]   2> 534615 INFO  (closeThreadPool-4682-thread-1) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 534615 INFO  (closeThreadPool-4682-thread-1) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 534616 INFO  (closeThreadPool-4682-thread-1) [     ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@1bc7ff73, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 534838 INFO  (closeThreadPool-4682-thread-1) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=false]
   [junit4]   2> 534839 WARN  (closeThreadPool-4682-thread-1) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@4d7a8cca[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 534839 WARN  (closeThreadPool-4682-thread-1) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@4d7a8cca[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 534848 WARN  (closeThreadPool-4682-thread-1) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@72d96df5[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 534848 WARN  (closeThreadPool-4682-thread-1) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@72d96df5[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 534849 INFO  (closeThreadPool-4682-thread-1) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:38025/solr
   [junit4]   2> 534852 INFO  (closeThreadPool-4682-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 534855 INFO  (zkConnectionManagerCallback-4695-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 534856 INFO  (closeThreadPool-4682-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 534957 INFO  (closeThreadPool-4682-thread-1) [n:127.0.0.1:39291_     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 534958 INFO  (zkConnectionManagerCallback-4697-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 534958 INFO  (closeThreadPool-4682-thread-1) [n:127.0.0.1:39291_     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 534961 INFO  (closeThreadPool-4682-thread-1) [n:127.0.0.1:39291_     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 534963 INFO  (closeThreadPool-4682-thread-1) [n:127.0.0.1:39291_     ] o.a.s.c.ZkController Publish node=127.0.0.1:39291_ as DOWN
   [junit4]   2> 534963 INFO  (closeThreadPool-4682-thread-1) [n:127.0.0.1:39291_     ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 534963 INFO  (closeThreadPool-4682-thread-1) [n:127.0.0.1:39291_     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:39291_
   [junit4]   2> 534964 INFO  (zkCallback-4652-thread-3) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 534964 INFO  (zkCallback-4696-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 534965 INFO  (zkCallback-4680-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 534966 INFO  (closeThreadPool-4682-thread-1) [n:127.0.0.1:39291_     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 534967 WARN  (closeThreadPool-4682-thread-1) [n:127.0.0.1:39291_     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info.  Solr is only as secure as you make it.
   [junit4]   2> 534978 INFO  (closeThreadPool-4682-thread-1) [n:127.0.0.1:39291_     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 534995 INFO  (closeThreadPool-4682-thread-1) [n:127.0.0.1:39291_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@1bc7ff73
   [junit4]   2> 535003 INFO  (closeThreadPool-4682-thread-1) [n:127.0.0.1:39291_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@1bc7ff73
   [junit4]   2> 535003 INFO  (closeThreadPool-4682-thread-1) [n:127.0.0.1:39291_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@1bc7ff73
   [junit4]   2> 535004 INFO  (closeThreadPool-4682-thread-1) [n:127.0.0.1:39291_     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_65A7948048252E96-001/shard-1-001/cores
   [junit4]   2> 535010 INFO  (closeThreadPool-4682-thread-1) [     ] o.a.s.c.AbstractFullDistribZkTestBase waitForLiveNode: 127.0.0.1:39291_
   [junit4]   2> 535028 INFO  (qtp1148273170-8874) [n:127.0.0.1:39291_     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :addreplica with params node=127.0.0.1:39291_&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 535036 INFO  (OverseerCollectionConfigSetProcessor-72185523145736197-127.0.0.1:46099_-n_0000000000) [n:127.0.0.1:46099_     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000002 doesn't exist. Requestor may have disconnected from ZooKeeper
   [junit4]   2> 535048 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 535052 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={wt=javabin&version=2&key=solr.core.control_collection.shard1.replica_n1:INDEX.sizeInBytes} status=0 QTime=2
   [junit4]   2> 535053 INFO  (qtp265242249-8813) [n:127.0.0.1:46099_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 535055 INFO  (qtp1148273170-8878) [n:127.0.0.1:39291_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 535058 INFO  (qtp265242249-8809) [n:127.0.0.1:46099_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={wt=javabin&version=2&key=solr.core.control_collection.shard1.replica_n1:INDEX.sizeInBytes} status=0 QTime=3
   [junit4]   2> 535059 INFO  (qtp265242249-8811) [n:127.0.0.1:46099_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 535060 INFO  (OverseerThreadFactory-4660-thread-3-processing-n:127.0.0.1:46099_) [n:127.0.0.1:46099_ c:collection1 s:shard1   ] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:39291_ for creating new replica of shard shard1 for collection collection1
   [junit4]   2> 535061 INFO  (OverseerThreadFactory-4660-thread-3-processing-n:127.0.0.1:46099_) [n:127.0.0.1:46099_ c:collection1 s:shard1   ] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command.
   [junit4]   2> 535083 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_    x:collection1_shard1_replica_n1 ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&name=collection1_shard1_replica_n1&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT
   [junit4]   2> 536094 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 8.6.0
   [junit4]   2> 536107 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.IndexSchema Schema name=test
   [junit4]   2> 536187 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
   [junit4]   2> 536202 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.CoreContainer Creating SolrCore 'collection1_shard1_replica_n1' using configuration from configset conf1, trusted=true
   [junit4]   2> 536203 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.collection1.shard1.replica_n1' (registry 'solr.core.collection1.shard1.replica_n1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@1bc7ff73
   [junit4]   2> 536203 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory solr.hdfs.home=hdfs://localhost.localdomain:37601/solr_hdfs_home
   [junit4]   2> 536203 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Solr Kerberos Authentication disabled
   [junit4]   2> 536203 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrCore [[collection1_shard1_replica_n1] ] Opening new SolrCore at [/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_65A7948048252E96-001/shard-1-001/cores/collection1_shard1_replica_n1], dataDir=[hdfs://localhost.localdomain:37601/solr_hdfs_home/collection1/core_node2/data/]
   [junit4]   2> 536204 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:37601/solr_hdfs_home/collection1/core_node2/data/snapshot_metadata
   [junit4]   2> 536212 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 536212 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [33554432] will allocate [1] slabs and use ~[33554432] bytes
   [junit4]   2> 536225 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 536226 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:37601/solr_hdfs_home/collection1/core_node2/data
   [junit4]   2> 536252 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:37601/solr_hdfs_home/collection1/core_node2/data/index
   [junit4]   2> 536258 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 536258 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [33554432] will allocate [1] slabs and use ~[33554432] bytes
   [junit4]   2> 536263 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 536264 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=38, maxMergeAtOnceExplicit=49, maxMergedSegmentMB=75.2119140625, floorSegmentMB=0.6337890625, forceMergeDeletesPctAllowed=2.093891609271097, segmentsPerTier=44.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.839941937737286, deletesPctAllowed=22.17795614484704
   [junit4]   2> 536316 WARN  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 536375 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.HdfsUpdateLog
   [junit4]   2> 536375 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 536375 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.HdfsUpdateLog Initializing HdfsUpdateLog: tlogDfsReplication=2
   [junit4]   2> 536388 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 536388 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 536393 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=41, maxMergeAtOnceExplicit=45, maxMergedSegmentMB=42.8203125, floorSegmentMB=1.080078125, forceMergeDeletesPctAllowed=12.737292583859347, segmentsPerTier=20.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.0, deletesPctAllowed=31.37838405826533
   [junit4]   2> 536403 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.SolrIndexSearcher Opening [Searcher@489f310b[collection1_shard1_replica_n1] main]
   [junit4]   2> 536405 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 536406 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 536406 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms
   [junit4]   2> 536407 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1665366468766728192
   [junit4]   2> 536409 INFO  (searcherExecutor-4708-thread-1-processing-n:127.0.0.1:39291_ x:collection1_shard1_replica_n1 c:collection1 s:shard1) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrCore [collection1_shard1_replica_n1] Registered new searcher Searcher@489f310b[collection1_shard1_replica_n1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 536409 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection1/terms/shard1 to Terms{values={core_node2=0}, version=0}
   [junit4]   2> 536409 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/collection1/leaders/shard1
   [junit4]   2> 536410 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 536410 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 536410 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync replicas to https://127.0.0.1:39291/collection1_shard1_replica_n1/
   [junit4]   2> 536410 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 536410 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SyncStrategy https://127.0.0.1:39291/collection1_shard1_replica_n1/ has no replicas
   [junit4]   2> 536411 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/collection1/leaders/shard1/leader after winning as /collections/collection1/leader_elect/shard1/election/72185523145736202-core_node2-n_0000000000
   [junit4]   2> 536411 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I am the new leader: https://127.0.0.1:39291/collection1_shard1_replica_n1/ shard1
   [junit4]   2> 536513 INFO  (zkCallback-4696-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json] for collection [collection1] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 536514 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 536515 INFO  (qtp1148273170-8876) [n:127.0.0.1:39291_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf1&name=collection1_shard1_replica_n1&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1431
   [junit4]   2> 536524 INFO  (zkCallback-4696-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json] for collection [collection1] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 536525 INFO  (qtp1148273170-8874) [n:127.0.0.1:39291_ c:collection1    ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:39291_&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2} status=0 QTime=1496
   [junit4]   2> 536525 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Waiting to see 1 active replicas in collection: collection1
   [junit4]   2> 536615 INFO  (zkCallback-4680-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json] for collection [collection1] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 536615 INFO  (zkCallback-4696-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json] for collection [collection1] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 536615 INFO  (zkCallback-4696-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json] for collection [collection1] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 536616 INFO  (zkCallback-4696-thread-3) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json] for collection [collection1] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 536617 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.SolrTestCaseJ4 ###Starting testDeletedDocs
   [junit4]   2> 537037 INFO  (OverseerCollectionConfigSetProcessor-72185523145736197-127.0.0.1:46099_-n_0000000000) [n:127.0.0.1:46099_     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000004 doesn't exist. Requestor may have disconnected from ZooKeeper
   [junit4]   2> 539243 WARN  (DataStreamer for file /solr/_1_Lucene85TermVectorsIndexfile_pointers_7.tmp block BP-606256027-127.0.0.1-1588217213343:blk_1073741855_1031) [     ] o.a.h.h.DataStreamer Caught exception
   [junit4]   2>           => java.lang.InterruptedException
   [junit4]   2> 	at java.base/java.lang.Object.wait(Native Method)
   [junit4]   2> java.lang.InterruptedException: null
   [junit4]   2> 	at java.lang.Object.wait(Native Method) ~[?:?]
   [junit4]   2> 	at java.lang.Thread.join(Thread.java:1308) ~[?:?]
   [junit4]   2> 	at java.lang.Thread.join(Thread.java:1375) [?:?]
   [junit4]   2> 	at org.apache.hadoop.hdfs.DataStreamer.closeResponder(DataStreamer.java:986) ~[hadoop-hdfs-client-3.2.0.jar:?]
   [junit4]   2> 	at org.apache.hadoop.hdfs.DataStreamer.closeInternal(DataStreamer.java:847) ~[hadoop-hdfs-client-3.2.0.jar:?]
   [junit4]   2> 	at org.apache.hadoop.hdfs.DataStreamer.run(DataStreamer.java:843) ~[hadoop-hdfs-client-3.2.0.jar:?]
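(Editor's note: the DataStreamer warning above is the usual shutdown pattern rather than a real fault: `Thread.join` parks in `Object.wait`, and interrupting the joining thread surfaces as an `InterruptedException` exactly as in this trace. A minimal standalone sketch of that mechanism, not Hadoop code; all names here are illustrative:)

```java
public class JoinInterruptDemo {
    public static void main(String[] args) throws Exception {
        // Stands in for the long-running responder thread being joined.
        Thread sleeper = new Thread(() -> {
            try { Thread.sleep(10_000); } catch (InterruptedException ignored) {}
        });
        sleeper.start();

        // Stands in for DataStreamer.closeResponder(), which joins the responder.
        Thread joiner = new Thread(() -> {
            try {
                sleeper.join();  // blocks inside Object.wait, as in the trace above
                System.out.println("joined");
            } catch (InterruptedException e) {
                // This is the path the WARN line reports during shutdown.
                System.out.println("interrupted while joining");
                sleeper.interrupt();  // let the demo exit promptly
            }
        });
        joiner.start();
        Thread.sleep(100);
        joiner.interrupt();  // what closing the streamer does to the joining thread
        joiner.join();
    }
}
```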
   [junit4]   2> 555657 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.s.h.HdfsDirectory Closing hdfs directory hdfs://localhost.localdomain:37601/solr
   [junit4]   2> 555658 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[65A7948048252E96]) [     ] o.a.s.SolrTestCaseJ4 ###Ending testDeletedDocs
   [junit4]   2> 555768 INFO  (closeThreadPool-4715-thread-2) [     ] o.a.s.c.CoreContainer Shutting down CoreContainer instance=839846426
   [junit4]   2> 555768 INFO  (closeThreadPool-4715-thread-2) [     ] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:46099_
   [junit4]   2> 555768 INFO  (closeThreadPool-4715-thread-2) [     ] o.a.s.c.ZkController Publish this node as DOWN...
   [junit4]   2> 555768 INFO  (closeThreadPool-4715-thread-2) [     ] o.a.s.c.ZkController Publish node=127.0.0.1:46099_ as DOWN
   [junit4]   2> 555768 INFO  (closeThreadPool-4715-thread-1) [     ] o.a.s.c.CoreContainer Shutting down CoreContainer instance=1854616293
   [junit4]   2> 555768 INFO  (closeThreadPool-4715-thread-1) [     ] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:39291_
   [junit4]   2> 555769 INFO  (closeThreadPool-4715-thread-1) [     ] o.a.s.c.ZkController Publish this node as DOWN...
   [junit4]   2> 555769 INFO  (closeThreadPool-4715-thread-1) [     ] o.a.s.c.ZkController Publish node=127.0.0.1:39291_ as DOWN
   [junit4]   2> 555770 INFO  (zkCallback-4652-thread-3) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 555770 INFO  (zkCallback-4652-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 555770 INFO  (zkCallback-4652-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 555770 INFO  (coreCloseExecutor-4722-thread-1) [n:127.0.0.1:46099_     ] o.a.s.c.SolrCore [control_collection_shard1_replica_n1]  CLOSING SolrCore org.apache.solr.core.SolrCore@294cbb39
   [junit4]   2> 555770 INFO  (coreCloseExecutor-4723-thread-1) [n:127.0.0.1:39291_     ] o.a.s.c.SolrCore [collection1_shard1_replica_n1]  CLOSING SolrCore org.apache.solr.core.SolrCore@65b1b84e
   [junit4]   2> 555770 INFO  (coreCloseExecutor-4723-thread-1) [n:127.0.0.1:39291_     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.collection1.shard1.replica_n1 tag=SolrCore@65b1b84e
   [junit4]   2> 555770 INFO  (coreCloseExecutor-4722-thread-1) [n:127.0.0.1:46099_     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.control_collection.shard1.replica_n1 tag=SolrCore@294cbb39
   [junit4]   2> 555770 INFO  (coreCloseExecutor-4723-thread-1) [n:127.0.0.1:39291_     ] o.a.s.m.r.SolrJmxReporter Closing reporter [org.apache.solr.metrics.reporters.SolrJmxReporter@510d0ab1: rootName = null, domain = solr.core.collection1.shard1.replica_n1, service url = null, agent id = null] for registry solr.core.collection1.shard1.replica_n1/com.codahale.metrics.MetricRegistry@5050e67d
   [junit4]   2> 555770 INFO  (coreCloseExecutor-4722-thread-1) [n:127.0.0.1:46099_     ] o.a.s.m.r.SolrJmxReporter Closing reporter [org.apache.solr.metrics.reporters.SolrJmxReporter@15a9c103: rootName = null, domain = solr.core.control_collection.shard1.replica_n1, service url = null, agent id = null] for registry solr.core.control_collection.shard1.replica_n1/com.codahale.metrics.MetricRegistry@5d19fc05
   [junit4]   2> 555780 INFO  (coreCloseExecutor-4723-thread-1) [n:127.0.0.1:39291

[...truncated too long message...]

dd3cb4a093) service to localhost.localdomain/127.0.0.1:37601
   [junit4]   2> 580886 WARN  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.a.h.h.s.d.DirectoryScanner DirectoryScanner: shutdown has been called
   [junit4]   2> 580941 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.w.WebAppContext@f6c1877{datanode,/,null,UNAVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/datanode}
   [junit4]   2> 580941 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.e.j.s.AbstractConnector Stopped ServerConnector@b1dea15{HTTP/1.1, (http/1.1)}{localhost:0}
   [junit4]   2> 580941 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.e.j.s.session node0 Stopped scavenging
   [junit4]   2> 580941 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@df3c027{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,UNAVAILABLE}
   [junit4]   2> 580944 WARN  (BP-606256027-127.0.0.1-1588217213343 heartbeating to localhost.localdomain/127.0.0.1:37601) [     ] o.a.h.h.s.d.IncrementalBlockReportManager IncrementalBlockReportManager interrupted
   [junit4]   2> 580944 WARN  (BP-606256027-127.0.0.1-1588217213343 heartbeating to localhost.localdomain/127.0.0.1:37601) [     ] o.a.h.h.s.d.DataNode Ending block pool service for: Block pool BP-606256027-127.0.0.1-1588217213343 (Datanode Uuid 31472468-e6d0-4b27-b838-96069df90346) service to localhost.localdomain/127.0.0.1:37601
   [junit4]   2> 580966 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.w.WebAppContext@710ef0a7{hdfs,/,null,UNAVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/hdfs}
   [junit4]   2> 580966 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.e.j.s.AbstractConnector Stopped ServerConnector@1c0450d5{HTTP/1.1, (http/1.1)}{localhost.localdomain:0}
   [junit4]   2> 580966 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.e.j.s.session node0 Stopped scavenging
   [junit4]   2> 580966 INFO  (SUITE-CheckHdfsIndexTest-seed#[65A7948048252E96]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@2bc5b07b{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,UNAVAILABLE}
   [junit4]   2> NOTE: leaving temporary files on disk at: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_65A7948048252E96-001
   [junit4]   2> Apr 30, 2020 3:27:53 AM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks
   [junit4]   2> WARNING: Will linger awaiting termination of 129 leaked thread(s).
   [junit4]   2> NOTE: test params are: codec=Asserting(Lucene84): {date=PostingsFormat(name=LuceneVarGapFixedInterval), rnd_b=BlockTreeOrds(blocksize=128), field=PostingsFormat(name=LuceneVarGapFixedInterval), docid=BlockTreeOrds(blocksize=128), multiDefault=PostingsFormat(name=LuceneVarGapFixedInterval), _root_=PostingsFormat(name=LuceneFixedGap), titleTokenized=PostingsFormat(name=MockRandom), id=BlockTreeOrds(blocksize=128), body=PostingsFormat(name=LuceneVarGapFixedInterval), title=PostingsFormat(name=LuceneFixedGap)}, docValues:{docid_intDV=DocValuesFormat(name=Asserting), range_facet_l_dv=DocValuesFormat(name=Direct), _version_=DocValuesFormat(name=Lucene80), range_facet_i_dv=DocValuesFormat(name=Lucene80), intDvoDefault=DocValuesFormat(name=Direct), titleDV=DocValuesFormat(name=Lucene80), timestamp=DocValuesFormat(name=Lucene80)}, maxPointsInLeafNode=1120, maxMBSortInHeap=7.774782266514823, sim=Asserting(org.apache.lucene.search.similarities.AssertingSimilarity@462fe092), locale=naq-NA, timezone=Etc/GMT-2
   [junit4]   2> NOTE: Linux 5.3.0-46-generic amd64/AdoptOpenJDK 12.0.2 (64-bit)/cpus=16,threads=7,free=89774264,total=444596224
   [junit4]   2> NOTE: All tests run in this JVM: [TestSimExtremeIndexing, TestFieldCacheVsDocValues, BitVectorTest, MoveReplicaTest, TestInitQParser, SparseHLLTest, RegexBytesRefFilterTest, TestInPlaceUpdatesStandalone, TestBadConfig, TestJmxIntegration, TestTrie, BlockJoinFacetDistribTest, MetricTriggerIntegrationTest, SplitShardTest, TestGraphMLResponseWriter, TestDeleteCollectionOnDownNodes, DateMathParserTest, AssignBackwardCompatibilityTest, LeaderElectionTest, UpdateParamsTest, HdfsRecoverLeaseTest, CursorPagingTest, ConfigSetsAPITest, TestCloudSearcherWarming, MetricsHistoryIntegrationTest, TestFilteredDocIdSet, DocValuesMissingTest, IndexBasedSpellCheckerTest, QueryParsingTest, TestNumericRangeQuery32, DimensionalRoutedAliasUpdateProcessorTest, TestAtomicUpdateErrorCases, IndexSchemaTest, JSONWriterTest, TestRebalanceLeaders, TestReRankQParserPlugin, SimpleCollectionCreateDeleteTest, XmlInterpolationTest, HdfsUnloadDistributedZkTest, VMParamsZkACLAndCredentialsProvidersTest, TriggerSetPropertiesIntegrationTest, TestReplicaProperties, TestManagedSchemaThreadSafety, TestReloadAndDeleteDocs, TestReversedWildcardFilterFactory, SolrJmxReporterTest, TestSortableTextField, AnalyticsQueryTest, PrimUtilsTest, ShardRoutingTest, TestWaitForStateWithJettyShutdowns, TestCloudInspectUtil, TestJsonFacetRefinement, TestHashPartitioner, TestFunctionQuery, SearchRateTriggerTest, OverseerStatusTest, TestAnalyzedSuggestions, TestPostingsSolrHighlighter, CheckHdfsIndexTest]
   [junit4] Completed [369/907 (1!)] on J3 in 62.99s, 5 tests, 1 error, 1 skipped <<< FAILURES!

[...truncated 39949 lines...]
-ecj-javadoc-lint-src:
    [mkdir] Created dir: /tmp/ecj848977538
 [ecj-lint] Compiling 931 source files to /tmp/ecj848977538
 [ecj-lint] ----------
 [ecj-lint] 1. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/codecs/CodecUtil.java (at line 523)
 [ecj-lint] 	throw new CorruptIndexException("misplaced codec footer (file truncated?): length=" + in.length() + " but footerLength==" + footerLength(), input);
 [ecj-lint] 	^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 [ecj-lint] Resource leak: 'in' is not closed at this location
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 2. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingStoredFieldsReader.java (at line 166)
 [ecj-lint] 	FieldsIndexReader fieldsIndexReader = new FieldsIndexReader(d, si.name, segmentSuffix, INDEX_EXTENSION_PREFIX, INDEX_CODEC_NAME, si.getId());
 [ecj-lint] 	                  ^^^^^^^^^^^^^^^^^
 [ecj-lint] Resource leak: 'fieldsIndexReader' is never closed
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 3. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingTermVectorsReader.java (at line 148)
 [ecj-lint] 	FieldsIndexReader fieldsIndexReader = new FieldsIndexReader(d, si.name, segmentSuffix, VECTORS_INDEX_EXTENSION_PREFIX, VECTORS_INDEX_CODEC_NAME, si.getId());
 [ecj-lint] 	                  ^^^^^^^^^^^^^^^^^
 [ecj-lint] Resource leak: 'fieldsIndexReader' is never closed
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 4. ERROR in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/search/IndexSearcher.java (at line 50)
 [ecj-lint] 	import org.apache.lucene.util.automaton.ByteRunAutomaton;
 [ecj-lint] 	       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 [ecj-lint] The import org.apache.lucene.util.automaton.ByteRunAutomaton is never used
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 5. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/util/automaton/Operations.java (at line 742)
 [ecj-lint] 	Integer q = newstate.get(statesSet);
 [ecj-lint] 	                         ^^^^^^^^^
 [ecj-lint] Unlikely argument type SortedIntSet for get(Object) on a Map<SortedIntSet.FrozenIntSet,Integer>
 [ecj-lint] ----------
 [ecj-lint] 5 problems (1 error, 4 warnings)
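(Editor's note: warnings 1-3 above all follow the same shape: a Closeable is created and at least one path, normal or exceptional, can exit without closing it. A minimal self-contained sketch of the pattern ecj flags and the try-with-resources form that satisfies the analysis; this uses plain java.io, not the actual Lucene types:)

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.concurrent.atomic.AtomicBoolean;

public class ResourceLeakDemo {
    static final AtomicBoolean closed = new AtomicBoolean(false);

    // A stream that records whether close() was ever called.
    static InputStream tracked() {
        return new ByteArrayInputStream(new byte[] {1, 2, 3}) {
            @Override
            public void close() throws IOException {
                closed.set(true);
                super.close();
            }
        };
    }

    // The shape ecj warns about: if read() throws, 'in' is never closed.
    static int leaky() throws IOException {
        InputStream in = tracked();
        int b = in.read();  // an exception on this line would leak 'in'
        in.close();
        return b;
    }

    // try-with-resources closes 'in' on every path, so the warning goes away.
    static int safe() throws IOException {
        try (InputStream in = tracked()) {
            return in.read();
        }
    }

    public static void main(String[] args) throws IOException {
        closed.set(false);
        int b = safe();
        // close() ran even though the method exited via return
        System.out.println(b + " " + closed.get());
    }
}
```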

BUILD FAILED
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/build.xml:634: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/build.xml:101: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build.xml:201: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/common-build.xml:2127: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/common-build.xml:2166: Compile failed; see the compiler error output for details.

Total time: 40 minutes 45 seconds
Build step 'Invoke Ant' marked build as failure
Archiving artifacts
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
[WARNINGS] Skipping publisher since build result is FAILURE
Recording test results
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any

[JENKINS] Lucene-Solr-8.x-Linux (64bit/jdk-14) - Build # 2915 - Unstable!

Posted by Policeman Jenkins Server <je...@thetaphi.de>.
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-8.x-Linux/2915/
Java: 64bit/jdk-14 -XX:+UseCompressedOops -XX:+UseG1GC

2 tests failed.
FAILED:  org.apache.solr.index.hdfs.CheckHdfsIndexTest.doTest

Error Message:
Error from server at http://127.0.0.1:33415/collection1: java.lang.NullPointerException  at org.apache.solr.handler.admin.SystemInfoHandler.getSecurityInfo(SystemInfoHandler.java:326)  at org.apache.solr.handler.admin.SystemInfoHandler.handleRequestBody(SystemInfoHandler.java:146)  at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:211)  at org.apache.solr.core.SolrCore.execute(SolrCore.java:2600)  at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:803)  at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:582)  at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:432)  at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:362)  at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)  at org.apache.solr.client.solrj.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:166)  at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)  at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:545)  at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)  at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1610)  at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)  at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1300)  at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)  at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485)  at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1580)  at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)  at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1215)  at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)  at 
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)  at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:322)  at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:767)  at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)  at org.eclipse.jetty.server.Server.handle(Server.java:500)  at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383)  at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:547)  at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375)  at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:273)  at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)  at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)  at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)  at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806)  at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938)  at java.base/java.lang.Thread.run(Thread.java:832) 

Stack Trace:
org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at http://127.0.0.1:33415/collection1: java.lang.NullPointerException
	at org.apache.solr.handler.admin.SystemInfoHandler.getSecurityInfo(SystemInfoHandler.java:326)
	at org.apache.solr.handler.admin.SystemInfoHandler.handleRequestBody(SystemInfoHandler.java:146)
	at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:211)
	at org.apache.solr.core.SolrCore.execute(SolrCore.java:2600)
	at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:803)
	at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:582)
	at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:432)
	at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:362)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)
	at org.apache.solr.client.solrj.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:166)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:545)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1610)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1300)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1580)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1215)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:322)
	at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:767)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:500)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:547)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:273)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
	at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938)
	at java.base/java.lang.Thread.run(Thread.java:832)

	at __randomizedtesting.SeedInfo.seed([A34733627ED746C4:4038BC6136C557D]:0)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:665)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:265)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:248)
	at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:211)
	at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:1003)
	at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:1018)
	at org.apache.solr.index.hdfs.CheckHdfsIndexTest.doTest(CheckHdfsIndexTest.java:120)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:564)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988)
	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1081)
	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:1053)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
	at java.base/java.lang.Thread.run(Thread.java:832)


FAILED:  org.apache.solr.index.hdfs.CheckHdfsIndexTest.doTest

Error Message:
Error from server at http://127.0.0.1:38735/collection1: java.lang.NullPointerException  at org.apache.solr.handler.admin.SystemInfoHandler.getSecurityInfo(SystemInfoHandler.java:326)  at org.apache.solr.handler.admin.SystemInfoHandler.handleRequestBody(SystemInfoHandler.java:146)  at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:211)  at org.apache.solr.core.SolrCore.execute(SolrCore.java:2600)  at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:803)  at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:582)  at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:432)  at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:362)  at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)  at org.apache.solr.client.solrj.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:166)  at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)  at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:545)  at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)  at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1610)  at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)  at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1300)  at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)  at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485)  at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1580)  at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)  at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1215)  at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)  at 
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)  at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:322)  at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:767)  at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)  at org.eclipse.jetty.server.Server.handle(Server.java:500)  at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383)  at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:547)  at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375)  at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:273)  at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)  at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)  at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)  at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806)  at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938)  at java.base/java.lang.Thread.run(Thread.java:832) 

Stack Trace:
org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at http://127.0.0.1:38735/collection1: java.lang.NullPointerException
	at org.apache.solr.handler.admin.SystemInfoHandler.getSecurityInfo(SystemInfoHandler.java:326)
	at org.apache.solr.handler.admin.SystemInfoHandler.handleRequestBody(SystemInfoHandler.java:146)
	at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:211)
	at org.apache.solr.core.SolrCore.execute(SolrCore.java:2600)
	at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:803)
	at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:582)
	at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:432)
	at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:362)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)
	at org.apache.solr.client.solrj.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:166)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:545)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1610)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1300)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1580)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1215)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:322)
	at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:767)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:500)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:547)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:273)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
	at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938)
	at java.base/java.lang.Thread.run(Thread.java:832)

	at __randomizedtesting.SeedInfo.seed([A34733627ED746C4:4038BC6136C557D]:0)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:665)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:265)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:248)
	at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:211)
	at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:1003)
	at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:1018)
	at org.apache.solr.index.hdfs.CheckHdfsIndexTest.doTest(CheckHdfsIndexTest.java:120)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:564)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988)
	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1081)
	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:1053)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
	at java.base/java.lang.Thread.run(Thread.java:832)




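Note: both failures above share the same root cause — a NullPointerException thrown from SystemInfoHandler.getSecurityInfo (SystemInfoHandler.java:326) while serving the admin system-info request, which then surfaces to the client as a RemoteSolrException. The sketch below is purely illustrative (it is NOT Solr's actual code; the class name, field access, and fallback string are invented for this example) — it only demonstrates the defensive-null pattern that avoids this class of NPE when a request object may legitimately be absent:

```java
// Hypothetical sketch: guard a possibly-null request object before
// dereferencing it, instead of letting an NPE escape to the caller.
public class SecurityInfoSketch {

    // Stand-in for a request-derived value that may be null
    // (e.g. no HTTP request is available in an embedded context).
    static String getSecurityInfo(Object httpRequest) {
        if (httpRequest == null) {
            // Return a harmless placeholder rather than throwing.
            return "(no HTTP request available)";
        }
        return "principal=" + httpRequest.toString();
    }

    public static void main(String[] args) {
        // With the guard, a null request yields a placeholder, not an NPE.
        System.out.println(getSecurityInfo(null));
    }
}
```

Whether the real fix belongs in getSecurityInfo itself or in the test's request setup is a separate question; the point is only that line 326 dereferences something that can be null on this code path.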
Build Log:
[...truncated 16372 lines...]
   [junit4] Suite: org.apache.solr.index.hdfs.CheckHdfsIndexTest
   [junit4]   2> 1023244 INFO  (SUITE-CheckHdfsIndexTest-seed#[A34733627ED746C4]-worker) [     ] o.a.s.SolrTestCase Setting 'solr.default.confdir' system property to test-framework derived value of '/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/server/solr/configsets/_default/conf'
   [junit4]   2> 1023245 INFO  (SUITE-CheckHdfsIndexTest-seed#[A34733627ED746C4]-worker) [     ] o.a.s.SolrTestCaseJ4 Created dataDir: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J5/temp/solr.index.hdfs.CheckHdfsIndexTest_A34733627ED746C4-001/data-dir-138-001
   [junit4]   2> 1023245 WARN  (SUITE-CheckHdfsIndexTest-seed#[A34733627ED746C4]-worker) [     ] o.a.s.SolrTestCaseJ4 startTrackingSearchers: numOpens=38 numCloses=38
   [junit4]   2> 1023245 INFO  (SUITE-CheckHdfsIndexTest-seed#[A34733627ED746C4]-worker) [     ] o.a.s.SolrTestCaseJ4 Using PointFields (NUMERIC_POINTS_SYSPROP=true) w/NUMERIC_DOCVALUES_SYSPROP=true
   [junit4]   2> 1023245 INFO  (SUITE-CheckHdfsIndexTest-seed#[A34733627ED746C4]-worker) [     ] o.a.s.SolrTestCaseJ4 Randomized ssl (false) and clientAuth (true) via: @org.apache.solr.util.RandomizeSSL(reason="", value=0.0/0.0, ssl=0.0/0.0, clientAuth=0.0/0.0)
   [junit4]   2> 1023245 INFO  (SUITE-CheckHdfsIndexTest-seed#[A34733627ED746C4]-worker) [     ] o.a.s.SolrTestCaseJ4 SecureRandom sanity checks: test.solr.allowed.securerandom=null & java.security.egd=file:/dev/./urandom
   [junit4]   2> 1023245 INFO  (SUITE-CheckHdfsIndexTest-seed#[A34733627ED746C4]-worker) [     ] o.a.s.BaseDistributedSearchTestCase Setting hostContext system property: /
   [junit4]   1> Formatting using clusterid: testClusterID
   [junit4]   2> 1023265 WARN  (SUITE-CheckHdfsIndexTest-seed#[A34733627ED746C4]-worker) [     ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
   [junit4]   2> 1023267 INFO  (SUITE-CheckHdfsIndexTest-seed#[A34733627ED746C4]-worker) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 14+36
   [junit4]   2> 1023272 INFO  (SUITE-CheckHdfsIndexTest-seed#[A34733627ED746C4]-worker) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 1023272 INFO  (SUITE-CheckHdfsIndexTest-seed#[A34733627ED746C4]-worker) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 1023272 INFO  (SUITE-CheckHdfsIndexTest-seed#[A34733627ED746C4]-worker) [     ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 1023289 INFO  (SUITE-CheckHdfsIndexTest-seed#[A34733627ED746C4]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@66dcace1{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,AVAILABLE}
   [junit4]   2> 1023380 INFO  (SUITE-CheckHdfsIndexTest-seed#[A34733627ED746C4]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.w.WebAppContext@4ac53507{hdfs,/,file:///home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J5/temp/jetty-localhost_localdomain-33169-hadoop-hdfs-3_2_0-tests_jar-_-any-13496652660396838948.dir/webapp/,AVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/hdfs}
   [junit4]   2> 1023381 INFO  (SUITE-CheckHdfsIndexTest-seed#[A34733627ED746C4]-worker) [     ] o.e.j.s.AbstractConnector Started ServerConnector@5450f000{HTTP/1.1, (http/1.1)}{localhost.localdomain:33169}
   [junit4]   2> 1023381 INFO  (SUITE-CheckHdfsIndexTest-seed#[A34733627ED746C4]-worker) [     ] o.e.j.s.Server Started @1023391ms
   [junit4]   2> 1023425 WARN  (SUITE-CheckHdfsIndexTest-seed#[A34733627ED746C4]-worker) [     ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
   [junit4]   2> 1023426 INFO  (SUITE-CheckHdfsIndexTest-seed#[A34733627ED746C4]-worker) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 14+36
   [junit4]   2> 1023427 INFO  (SUITE-CheckHdfsIndexTest-seed#[A34733627ED746C4]-worker) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 1023427 INFO  (SUITE-CheckHdfsIndexTest-seed#[A34733627ED746C4]-worker) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 1023427 INFO  (SUITE-CheckHdfsIndexTest-seed#[A34733627ED746C4]-worker) [     ] o.e.j.s.session node0 Scavenging every 600000ms
   [junit4]   2> 1023428 INFO  (SUITE-CheckHdfsIndexTest-seed#[A34733627ED746C4]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@50e70df{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,AVAILABLE}
   [junit4]   2> 1023525 INFO  (SUITE-CheckHdfsIndexTest-seed#[A34733627ED746C4]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.w.WebAppContext@4d16ebc7{datanode,/,file:///home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J5/temp/jetty-localhost-38075-hadoop-hdfs-3_2_0-tests_jar-_-any-2286865835582356784.dir/webapp/,AVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/datanode}
   [junit4]   2> 1023526 INFO  (SUITE-CheckHdfsIndexTest-seed#[A34733627ED746C4]-worker) [     ] o.e.j.s.AbstractConnector Started ServerConnector@2acb2f90{HTTP/1.1, (http/1.1)}{localhost:38075}
   [junit4]   2> 1023526 INFO  (SUITE-CheckHdfsIndexTest-seed#[A34733627ED746C4]-worker) [     ] o.e.j.s.Server Started @1023536ms
   [junit4]   2> 1023565 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0xb87c0ebd6a745c99: Processing first storage report for DS-8efbe20a-bb55-4457-9e08-0c3ba492b406 from datanode 1ea5645a-954d-48ee-9edf-55bbf5fa7e9a
   [junit4]   2> 1023565 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0xb87c0ebd6a745c99: from storage DS-8efbe20a-bb55-4457-9e08-0c3ba492b406 node DatanodeRegistration(127.0.0.1:41471, datanodeUuid=1ea5645a-954d-48ee-9edf-55bbf5fa7e9a, infoPort=38645, infoSecurePort=0, ipcPort=38365, storageInfo=lv=-57;cid=testClusterID;nsid=1261959287;c=1588210993835), blocks: 0, hasStaleStorage: true, processing time: 0 msecs, invalidatedBlocks: 0
   [junit4]   2> 1023565 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0xb87c0ebd6a745c99: Processing first storage report for DS-7232b16d-ec18-4d14-b28f-b9388f486d92 from datanode 1ea5645a-954d-48ee-9edf-55bbf5fa7e9a
   [junit4]   2> 1023565 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0xb87c0ebd6a745c99: from storage DS-7232b16d-ec18-4d14-b28f-b9388f486d92 node DatanodeRegistration(127.0.0.1:41471, datanodeUuid=1ea5645a-954d-48ee-9edf-55bbf5fa7e9a, infoPort=38645, infoSecurePort=0, ipcPort=38365, storageInfo=lv=-57;cid=testClusterID;nsid=1261959287;c=1588210993835), blocks: 0, hasStaleStorage: false, processing time: 0 msecs, invalidatedBlocks: 0
   [junit4]   2> 1023657 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.ZkTestServer STARTING ZK TEST SERVER
   [junit4]   2> 1023657 INFO  (ZkTestServer Run Thread) [     ] o.a.s.c.ZkTestServer client port:0.0.0.0/0.0.0.0:0
   [junit4]   2> 1023657 INFO  (ZkTestServer Run Thread) [     ] o.a.s.c.ZkTestServer Starting server
   [junit4]   2> 1023757 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.ZkTestServer start zk server on port:43717
   [junit4]   2> 1023757 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.ZkTestServer waitForServerUp: 127.0.0.1:43717
   [junit4]   2> 1023757 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.ZkTestServer parse host and port list: 127.0.0.1:43717
   [junit4]   2> 1023757 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.ZkTestServer connecting to 127.0.0.1 43717
   [junit4]   2> 1023759 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1023760 INFO  (zkConnectionManagerCallback-16089-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1023760 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1023761 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1023762 INFO  (zkConnectionManagerCallback-16091-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1023762 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1023763 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/solrconfig-tlog.xml to /configs/conf1/solrconfig.xml
   [junit4]   2> 1023770 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/schema.xml to /configs/conf1/schema.xml
   [junit4]   2> 1023776 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/solrconfig.snippet.randomindexconfig.xml to /configs/conf1/solrconfig.snippet.randomindexconfig.xml
   [junit4]   2> 1023778 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/stopwords.txt to /configs/conf1/stopwords.txt
   [junit4]   2> 1023779 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/protwords.txt to /configs/conf1/protwords.txt
   [junit4]   2> 1023779 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/currency.xml to /configs/conf1/currency.xml
   [junit4]   2> 1023779 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/enumsConfig.xml to /configs/conf1/enumsConfig.xml
   [junit4]   2> 1023780 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/open-exchange-rates.json to /configs/conf1/open-exchange-rates.json
   [junit4]   2> 1023780 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/mapping-ISOLatin1Accent.txt to /configs/conf1/mapping-ISOLatin1Accent.txt
   [junit4]   2> 1023781 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/old_synonyms.txt to /configs/conf1/old_synonyms.txt
   [junit4]   2> 1023781 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/synonyms.txt to /configs/conf1/synonyms.txt
   [junit4]   2> 1023782 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Will use NRT replicas unless explicitly asked otherwise
   [junit4]   2> 1023839 WARN  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 1023839 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 1023839 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ...
   [junit4]   2> 1023839 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 14+36
   [junit4]   2> 1023841 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 1023841 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 1023841 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.e.j.s.session node0 Scavenging every 600000ms
   [junit4]   2> 1023842 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@77a10bee{/,null,AVAILABLE}
   [junit4]   2> 1023846 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.e.j.s.AbstractConnector Started ServerConnector@7acfe7cf{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:37709}
   [junit4]   2> 1023846 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.e.j.s.Server Started @1023857ms
   [junit4]   2> 1023846 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/, solr.data.dir=hdfs://localhost.localdomain:44793/hdfs__localhost.localdomain_44793__home_jenkins_workspace_Lucene-Solr-8.x-Linux_solr_build_solr-core_test_J5_temp_solr.index.hdfs.CheckHdfsIndexTest_A34733627ED746C4-001_tempDir-002_control_data, hostPort=37709, coreRootDirectory=/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J5/temp/solr.index.hdfs.CheckHdfsIndexTest_A34733627ED746C4-001/control-001/cores}
   [junit4]   2> 1023847 ERROR (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 1023847 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 1023847 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 8.6.0
   [junit4]   2> 1023847 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 1023847 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr
   [junit4]   2> 1023847 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-04-30T01:43:14.425154Z
   [junit4]   2> 1023847 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1023848 INFO  (zkConnectionManagerCallback-16093-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1023848 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1023949 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 1023949 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.SolrXmlConfig Loading container configuration from /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J5/temp/solr.index.hdfs.CheckHdfsIndexTest_A34733627ED746C4-001/control-001/solr.xml
   [junit4]   2> 1023951 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 1023951 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 1023951 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@e24a5d7, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 1024038 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=false]
   [junit4]   2> 1024039 WARN  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@4bd89514[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1024039 WARN  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@4bd89514[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1024040 WARN  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@7cb583ed[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1024040 WARN  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@7cb583ed[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1024041 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:43717/solr
   [junit4]   2> 1024042 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1024042 INFO  (zkConnectionManagerCallback-16104-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1024042 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1024143 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [n:127.0.0.1:37709_     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1024144 INFO  (zkConnectionManagerCallback-16106-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1024144 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [n:127.0.0.1:37709_     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1024176 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [n:127.0.0.1:37709_     ] o.a.s.c.OverseerElectionContext I am going to be the leader 127.0.0.1:37709_
   [junit4]   2> 1024176 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [n:127.0.0.1:37709_     ] o.a.s.c.Overseer Overseer (id=72185115526496260-127.0.0.1:37709_-n_0000000000) starting
   [junit4]   2> 1024178 INFO  (OverseerStateUpdate-72185115526496260-127.0.0.1:37709_-n_0000000000) [n:127.0.0.1:37709_     ] o.a.s.c.Overseer Starting to work on the main queue : 127.0.0.1:37709_
   [junit4]   2> 1024178 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [n:127.0.0.1:37709_     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:37709_
   [junit4]   2> 1024179 INFO  (zkCallback-16105-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 1024179 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [n:127.0.0.1:37709_     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 1024179 WARN  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [n:127.0.0.1:37709_     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info
   [junit4]   2> 1024187 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [n:127.0.0.1:37709_     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 1024204 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [n:127.0.0.1:37709_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@e24a5d7
   [junit4]   2> 1024213 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [n:127.0.0.1:37709_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@e24a5d7
   [junit4]   2> 1024213 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [n:127.0.0.1:37709_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@e24a5d7
   [junit4]   2> 1024214 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [n:127.0.0.1:37709_     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J5/temp/solr.index.hdfs.CheckHdfsIndexTest_A34733627ED746C4-001/control-001/cores
   [junit4]   2> 1024222 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1024230 INFO  (zkConnectionManagerCallback-16123-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1024230 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1024231 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 1024232 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:43717/solr ready
   [junit4]   2> 1024243 INFO  (qtp2129820679-24959) [n:127.0.0.1:37709_     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params collection.configName=conf1&name=control_collection&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:37709_&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 1024246 INFO  (OverseerThreadFactory-16113-thread-1-processing-n:127.0.0.1:37709_) [n:127.0.0.1:37709_     ] o.a.s.c.a.c.CreateCollectionCmd Create collection control_collection
   [junit4]   2> 1024350 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 1024351 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 1024353 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_    x:control_collection_shard1_replica_n1 ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&collection=control_collection&shard=shard1&wt=javabin&version=2&replicaType=NRT
   [junit4]   2> 1024353 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_    x:control_collection_shard1_replica_n1 ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 1025358 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 8.6.0
   [junit4]   2> 1025370 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.IndexSchema Schema name=test
   [junit4]   2> 1025424 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
   [junit4]   2> 1025431 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.CoreContainer Creating SolrCore 'control_collection_shard1_replica_n1' using configuration from configset conf1, trusted=true
   [junit4]   2> 1025432 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.control_collection.shard1.replica_n1' (registry 'solr.core.control_collection.shard1.replica_n1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@e24a5d7
   [junit4]   2> 1025432 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory solr.hdfs.home=hdfs://localhost.localdomain:44793/solr_hdfs_home
   [junit4]   2> 1025432 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Solr Kerberos Authentication disabled
   [junit4]   2> 1025432 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrCore [[control_collection_shard1_replica_n1] ] Opening new SolrCore at [/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J5/temp/solr.index.hdfs.CheckHdfsIndexTest_A34733627ED746C4-001/control-001/cores/control_collection_shard1_replica_n1], dataDir=[hdfs://localhost.localdomain:44793/solr_hdfs_home/control_collection/core_node2/data/]
   [junit4]   2> 1025432 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:44793/solr_hdfs_home/control_collection/core_node2/data/snapshot_metadata
   [junit4]   2> 1025448 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 1025448 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [4194304] will allocate [1] slabs and use ~[4194304] bytes
   [junit4]   2> 1025448 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Creating new single instance HDFS BlockCache
   [junit4]   2> 1025452 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 1025452 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:44793/solr_hdfs_home/control_collection/core_node2/data
   [junit4]   2> 1025466 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:44793/solr_hdfs_home/control_collection/core_node2/data/index
   [junit4]   2> 1025471 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 1025471 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [4194304] will allocate [1] slabs and use ~[4194304] bytes
   [junit4]   2> 1025471 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Creating new single instance HDFS BlockCache
   [junit4]   2> 1025474 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 1025475 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogDocMergePolicy: [LogDocMergePolicy: minMergeSize=1000, mergeFactor=21, maxMergeSize=9223372036854775807, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=true, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.47385221018568835]
   [junit4]   2> 1025895 WARN  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 1025940 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.HdfsUpdateLog
   [junit4]   2> 1025940 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 1025940 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.HdfsUpdateLog Initializing HdfsUpdateLog: tlogDfsReplication=2
   [junit4]   2> 1025951 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 1025951 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 1025952 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=24, maxMergeAtOnceExplicit=45, maxMergedSegmentMB=20.4189453125, floorSegmentMB=0.625, forceMergeDeletesPctAllowed=0.7556694266661701, segmentsPerTier=23.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=1.0, deletesPctAllowed=46.17491984845125]
   [junit4]   2> 1025957 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.SolrIndexSearcher Opening [Searcher@7016bef5[control_collection_shard1_replica_n1] main]
   [junit4]   2> 1025958 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 1025958 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 1025960 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms
   [junit4]   2> 1025961 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1665359933906878464
   [junit4]   2> 1025963 INFO  (searcherExecutor-16125-thread-1-processing-n:127.0.0.1:37709_ x:control_collection_shard1_replica_n1 c:control_collection s:shard1) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrCore [control_collection_shard1_replica_n1] Registered new searcher Searcher@7016bef5[control_collection_shard1_replica_n1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 1025966 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ZkShardTerms Successful update of terms at /collections/control_collection/terms/shard1 to Terms{values={core_node2=0}, version=0}
   [junit4]   2> 1025966 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/control_collection/leaders/shard1
   [junit4]   2> 1025967 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 1025967 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 1025967 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:37709/control_collection_shard1_replica_n1/
   [junit4]   2> 1025967 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 1025967 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy http://127.0.0.1:37709/control_collection_shard1_replica_n1/ has no replicas
   [junit4]   2> 1025967 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/control_collection/leaders/shard1/leader after winning as /collections/control_collection/leader_elect/shard1/election/72185115526496260-core_node2-n_0000000000
   [junit4]   2> 1025968 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:37709/control_collection_shard1_replica_n1/ shard1
   [junit4]   2> 1026069 INFO  (zkCallback-16105-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 1026069 INFO  (zkCallback-16105-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 1026069 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 1026070 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf1&newCollection=true&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&collection=control_collection&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1717
   [junit4]   2> 1026071 INFO  (qtp2129820679-24959) [n:127.0.0.1:37709_     ] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas
   [junit4]   2> 1026170 INFO  (zkCallback-16105-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 1026170 INFO  (zkCallback-16105-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 1026170 INFO  (zkCallback-16105-thread-3) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 1026173 INFO  (qtp2129820679-24959) [n:127.0.0.1:37709_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=control_collection&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:37709_&wt=javabin&version=2} status=0 QTime=1929
   [junit4]   2> 1026173 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Waiting to see 1 active replicas in collection: control_collection
   [junit4]   2> 1026247 INFO  (OverseerCollectionConfigSetProcessor-72185115526496260-127.0.0.1:37709_-n_0000000000) [n:127.0.0.1:37709_     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000000 doesn't exist. Requestor may have disconnected from ZooKeeper
   [junit4]   2> 1026281 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1026282 INFO  (zkConnectionManagerCallback-16134-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1026282 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1026283 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 1026283 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:43717/solr ready
   [junit4]   2> 1026283 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.ChaosMonkey monkey: init - expire sessions:false cause connection loss:false
   [junit4]   2> 1026284 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params collection.configName=conf1&name=collection1&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=&stateFormat=2&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 1026286 INFO  (OverseerThreadFactory-16113-thread-2-processing-n:127.0.0.1:37709_) [n:127.0.0.1:37709_     ] o.a.s.c.a.c.CreateCollectionCmd Create collection collection1
   [junit4]   2> 1026488 WARN  (OverseerThreadFactory-16113-thread-2-processing-n:127.0.0.1:37709_) [n:127.0.0.1:37709_     ] o.a.s.c.a.c.CreateCollectionCmd It is unusual to create a collection (collection1) without cores.
   [junit4]   2> 1026488 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_     ] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas
   [junit4]   2> 1026488 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=collection1&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=&stateFormat=2&wt=javabin&version=2} status=0 QTime=204
   [junit4]   2> 1026489 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 1026489 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 1026489 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 1026489 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 1026489 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 1026489 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 1026489 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Creating jetty instances pullReplicaCount=0 numOtherReplicas=1
   [junit4]   2> 1026547 INFO  (TEST-CheckHdfsIndexTest.doTest-seed#[A34733627ED746C4]) [     ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 1 in directory /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J5/temp/solr.index.hdfs.CheckHdfsIndexTest_A34733627ED746C4-001/shard-1-001 of type NRT for shard1
   [junit4]   2> 1026548 WARN  (closeThreadPool-16135-thread-1) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 1026548 INFO  (closeThreadPool-16135-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 1026548 INFO  (closeThreadPool-16135-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ...
   [junit4]   2> 1026548 INFO  (closeThreadPool-16135-thread-1) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 14+36
   [junit4]   2> 1026549 INFO  (closeThreadPool-16135-thread-1) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 1026549 INFO  (closeThreadPool-16135-thread-1) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 1026549 INFO  (closeThreadPool-16135-thread-1) [     ] o.e.j.s.session node0 Scavenging every 600000ms
   [junit4]   2> 1026550 INFO  (closeThreadPool-16135-thread-1) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@1003280d{/,null,AVAILABLE}
   [junit4]   2> 1026550 INFO  (closeThreadPool-16135-thread-1) [     ] o.e.j.s.AbstractConnector Started ServerConnector@3f777a37{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:38735}
   [junit4]   2> 1026550 INFO  (closeThreadPool-16135-thread-1) [     ] o.e.j.s.Server Started @1026560ms
   [junit4]   2> 1026550 INFO  (closeThreadPool-16135-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/, solrconfig=solrconfig.xml, solr.data.dir=hdfs://localhost.localdomain:44793/hdfs__localhost.localdomain_44793__home_jenkins_workspace_Lucene-Solr-8.x-Linux_solr_build_solr-core_test_J5_temp_solr.index.hdfs.CheckHdfsIndexTest_A34733627ED746C4-001_tempDir-002_jetty1, hostPort=38735, coreRootDirectory=/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J5/temp/solr.index.hdfs.CheckHdfsIndexTest_A34733627ED746C4-001/shard-1-001/cores}
   [junit4]   2> 1026550 ERROR (closeThreadPool-16135-thread-1) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 1026550 INFO  (closeThreadPool-16135-thread-1) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 1026550 INFO  (closeThreadPool-16135-thread-1) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 8.6.0
   [junit4]   2> 1026550 INFO  (closeThreadPool-16135-thread-1) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 1026550 INFO  (closeThreadPool-16135-thread-1) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr
   [junit4]   2> 1026550 INFO  (closeThreadPool-16135-thread-1) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-04-30T01:43:17.128819Z
   [junit4]   2> 1026554 INFO  (closeThreadPool-16135-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1026554 INFO  (zkConnectionManagerCallback-16137-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1026554 INFO  (closeThreadPool-16135-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1026655 INFO  (closeThreadPool-16135-thread-1) [     ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 1026655 INFO  (closeThreadPool-16135-thread-1) [     ] o.a.s.c.SolrXmlConfig Loading container configuration from /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J5/temp/solr.index.hdfs.CheckHdfsIndexTest_A34733627ED746C4-001/shard-1-001/solr.xml
   [junit4]   2> 1026657 INFO  (closeThreadPool-16135-thread-1) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 1026657 INFO  (closeThreadPool-16135-thread-1) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 1026658 INFO  (closeThreadPool-16135-thread-1) [     ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@e24a5d7, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 1026704 INFO  (closeThreadPool-16135-thread-1) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=false]
   [junit4]   2> 1026705 WARN  (closeThreadPool-16135-thread-1) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@3c4875cc[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1026705 WARN  (closeThreadPool-16135-thread-1) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@3c4875cc[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1026706 WARN  (closeThreadPool-16135-thread-1) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@2200da99[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1026706 WARN  (closeThreadPool-16135-thread-1) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@2200da99[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 1026707 INFO  (closeThreadPool-16135-thread-1) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:43717/solr
   [junit4]   2> 1026707 INFO  (closeThreadPool-16135-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1026708 INFO  (zkConnectionManagerCallback-16148-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1026708 INFO  (closeThreadPool-16135-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1026809 INFO  (closeThreadPool-16135-thread-1) [n:127.0.0.1:38735_     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 1026810 INFO  (zkConnectionManagerCallback-16150-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 1026810 INFO  (closeThreadPool-16135-thread-1) [n:127.0.0.1:38735_     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 1026812 INFO  (closeThreadPool-16135-thread-1) [n:127.0.0.1:38735_     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 1026813 INFO  (closeThreadPool-16135-thread-1) [n:127.0.0.1:38735_     ] o.a.s.c.ZkController Publish node=127.0.0.1:38735_ as DOWN
   [junit4]   2> 1026814 INFO  (closeThreadPool-16135-thread-1) [n:127.0.0.1:38735_     ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 1026814 INFO  (closeThreadPool-16135-thread-1) [n:127.0.0.1:38735_     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:38735_
   [junit4]   2> 1026814 INFO  (zkCallback-16105-thread-3) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 1026814 INFO  (zkCallback-16149-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 1026814 INFO  (zkCallback-16133-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 1026815 INFO  (closeThreadPool-16135-thread-1) [n:127.0.0.1:38735_     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 1026815 WARN  (closeThreadPool-16135-thread-1) [n:127.0.0.1:38735_     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info.
   [junit4]   2> 1026823 INFO  (closeThreadPool-16135-thread-1) [n:127.0.0.1:38735_     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 1026833 INFO  (closeThreadPool-16135-thread-1) [n:127.0.0.1:38735_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@e24a5d7
   [junit4]   2> 1026842 INFO  (closeThreadPool-16135-thread-1) [n:127.0.0.1:38735_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@e24a5d7
   [junit4]   2> 1026842 INFO  (closeThreadPool-16135-thread-1) [n:127.0.0.1:38735_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@e24a5d7
   [junit4]   2> 1026843 INFO  (closeThreadPool-16135-thread-1) [n:127.0.0.1:38735_     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J5/temp/solr.index.hdfs.CheckHdfsIndexTest_A34733627ED746C4-001/shard-1-001/cores
   [junit4]   2> 1026847 INFO  (closeThreadPool-16135-thread-1) [     ] o.a.s.c.AbstractFullDistribZkTestBase waitForLiveNode: 127.0.0.1:38735_
   [junit4]   2> 1026848 INFO  (qtp2129820679-24959) [n:127.0.0.1:37709_     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :addreplica with params node=127.0.0.1:38735_&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 1026849 INFO  (OverseerCollectionConfigSetProcessor-72185115526496260-127.0.0.1:37709_-n_0000000000) [n:127.0.0.1:37709_     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000002 doesn't exist. Requestor may have disconnected from ZooKeeper
   [junit4]   2> 1026852 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={wt=javabin&version=2&key=solr.core.control_collection.shard1.replica_n1:INDEX.sizeInBytes} status=0 QTime=1
   [junit4]   2> 1026853 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 1026854 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 1026856 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={wt=javabin&version=2&key=solr.core.control_collection.shard1.replica_n1:INDEX.sizeInBytes} status=0 QTime=1
   [junit4]   2> 1026857 INFO  (qtp2129820679-24961) [n:127.0.0.1:37709_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 1026857 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 1026858 INFO  (OverseerThreadFactory-16113-thread-3-processing-n:127.0.0.1:37709_) [n:127.0.0.1:37709_ c:collection1 s:shard1   ] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:38735_ for creating new replica of shard shard1 for collection collection1
   [junit4]   2> 1026859 INFO  (OverseerThreadFactory-16113-thread-3-processing-n:127.0.0.1:37709_) [n:127.0.0.1:37709_ c:collection1 s:shard1   ] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command.
   [junit4]   2> 1026860 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_    x:collection1_shard1_replica_n1 ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&name=collection1_shard1_replica_n1&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT
   [junit4]   2> 1027867 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 8.6.0
   [junit4]   2> 1027875 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.IndexSchema Schema name=test
   [junit4]   2> 1027934 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
   [junit4]   2> 1027942 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.CoreContainer Creating SolrCore 'collection1_shard1_replica_n1' using configuration from configset conf1, trusted=true
   [junit4]   2> 1027942 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.collection1.shard1.replica_n1' (registry 'solr.core.collection1.shard1.replica_n1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@e24a5d7
   [junit4]   2> 1027942 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory solr.hdfs.home=hdfs://localhost.localdomain:44793/solr_hdfs_home
   [junit4]   2> 1027942 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Solr Kerberos Authentication disabled
   [junit4]   2> 1027942 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrCore [[collection1_shard1_replica_n1] ] Opening new SolrCore at [/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J5/temp/solr.index.hdfs.CheckHdfsIndexTest_A34733627ED746C4-001/shard-1-001/cores/collection1_shard1_replica_n1], dataDir=[hdfs://localhost.localdomain:44793/solr_hdfs_home/collection1/core_node2/data/]
   [junit4]   2> 1027943 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:44793/solr_hdfs_home/collection1/core_node2/data/snapshot_metadata
   [junit4]   2> 1027948 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 1027948 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [4194304] will allocate [1] slabs and use ~[4194304] bytes
   [junit4]   2> 1027948 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Creating new single instance HDFS BlockCache
   [junit4]   2> 1027951 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 1027952 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:44793/solr_hdfs_home/collection1/core_node2/data
   [junit4]   2> 1027962 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:44793/solr_hdfs_home/collection1/core_node2/data/index
   [junit4]   2> 1027966 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 1027966 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [4194304] will allocate [1] slabs and use ~[4194304] bytes
   [junit4]   2> 1027966 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Creating new single instance HDFS BlockCache
   [junit4]   2> 1027968 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 1027969 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogDocMergePolicy: [LogDocMergePolicy: minMergeSize=1000, mergeFactor=21, maxMergeSize=9223372036854775807, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=true, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.47385221018568835]
   [junit4]   2> 1028382 WARN  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 1028407 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.HdfsUpdateLog
   [junit4]   2> 1028407 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 1028407 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.HdfsUpdateLog Initializing HdfsUpdateLog: tlogDfsReplication=2
   [junit4]   2> 1028414 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 1028414 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 1028415 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=24, maxMergeAtOnceExplicit=45, maxMergedSegmentMB=20.4189453125, floorSegmentMB=0.625, forceMergeDeletesPctAllowed=0.7556694266661701, segmentsPerTier=23.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=1.0, deletesPctAllowed=46.17491984845125]
   [junit4]   2> 1028419 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.SolrIndexSearcher Opening [Searcher@3c7b0bc5[collection1_shard1_replica_n1] main]
   [junit4]   2> 1028419 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 1028419 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 1028420 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms
   [junit4]   2> 1028420 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1665359936485326848
   [junit4]   2> 1028422 INFO  (searcherExecutor-16161-thread-1-processing-n:127.0.0.1:38735_ x:collection1_shard1_replica_n1 c:collection1 s:shard1) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrCore [collection1_shard1_replica_n1] Registered new searcher Searcher@3c7b0bc5[collection1_shard1_replica_n1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 1028422 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection1/terms/shard1 to Terms{values={core_node2=0}, version=0}
   [junit4]   2> 1028422 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/collection1/leaders/shard1
   [junit4]   2> 1028423 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 1028423 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 1028423 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:38735/collection1_shard1_replica_n1/
   [junit4]   2> 1028423 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 1028423 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SyncStrategy http://127.0.0.1:38735/collection1_shard1_replica_n1/ has no replicas
   [junit4]   2> 1028423 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/collection1/leaders/shard1/leader after winning as /collections/collection1/leader_elect/shard1/election/72185115526496265-core_node2-n_0000000000
   [junit4]   2> 1028424 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:38735/collection1_shard1_replica_n1/ shard1
   [junit4]   2> 1028525 INFO  (zkCallback-16149-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json] for collection [collection1] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 1028526 INFO  (zkCallback-16149-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json] for collection [collection1] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 1028526 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 1028531 INFO  (qtp994638214-25019) [n:127.0.0.1:38735_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf1&name=collection1_shard1_replica_n1&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1670
   [junit4]   2> 1028532 INFO  (qtp2129820679-24959) [n:127.0.0.1:37709_ c:collection1    ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:38735_&action=ADDREPLICA&collection=collection1&shard=sh

[...truncated too long message...]

oud.hdfs.HdfsTestUtil.teardownClass(HdfsTestUtil.java:283) ~[test/:?]
   [junit4]   2> 	at org.apache.solr.index.hdfs.CheckHdfsIndexTest.teardownClass(CheckHdfsIndexTest.java:71) ~[test/:?]
   [junit4]   2> 	at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
   [junit4]   2> 	at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:?]
   [junit4]   2> 	at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
   [junit4]   2> 	at java.lang.reflect.Method.invoke(Method.java:564) ~[?:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:901) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) ~[java/:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41) ~[java/:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) ~[java/:?]
   [junit4]   2> 	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) ~[java/:?]
   [junit4]   2> 	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) ~[java/:?]
   [junit4]   2> 	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54) ~[java/:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at java.lang.Thread.run(Thread.java:832) [?:?]
   [junit4]   2> 172231 WARN  (SUITE-CheckHdfsIndexTest-seed#[A34733627ED746C4]-worker) [     ] o.a.h.h.s.d.DirectoryScanner DirectoryScanner: shutdown has been called
   [junit4]   2> 172276 INFO  (SUITE-CheckHdfsIndexTest-seed#[A34733627ED746C4]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.w.WebAppContext@1a3cf07c{datanode,/,null,UNAVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/datanode}
   [junit4]   2> 172276 INFO  (SUITE-CheckHdfsIndexTest-seed#[A34733627ED746C4]-worker) [     ] o.e.j.s.AbstractConnector Stopped ServerConnector@49694087{HTTP/1.1, (http/1.1)}{localhost:0}
   [junit4]   2> 172276 INFO  (SUITE-CheckHdfsIndexTest-seed#[A34733627ED746C4]-worker) [     ] o.e.j.s.session node0 Stopped scavenging
   [junit4]   2> 172276 INFO  (SUITE-CheckHdfsIndexTest-seed#[A34733627ED746C4]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@5bbeb697{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,UNAVAILABLE}
   [junit4]   2> 172277 WARN  (BP-1807086933-127.0.0.1-1588211662332 heartbeating to localhost.localdomain/127.0.0.1:40871) [     ] o.a.h.h.s.d.IncrementalBlockReportManager IncrementalBlockReportManager interrupted
   [junit4]   2> 172277 WARN  (BP-1807086933-127.0.0.1-1588211662332 heartbeating to localhost.localdomain/127.0.0.1:40871) [     ] o.a.h.h.s.d.DataNode Ending block pool service for: Block pool BP-1807086933-127.0.0.1-1588211662332 (Datanode Uuid 5a251169-e493-4cf6-95be-10dfcb061e58) service to localhost.localdomain/127.0.0.1:40871
   [junit4]   2> 172309 INFO  (SUITE-CheckHdfsIndexTest-seed#[A34733627ED746C4]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.w.WebAppContext@b63fec3{hdfs,/,null,UNAVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/hdfs}
   [junit4]   2> 172310 INFO  (SUITE-CheckHdfsIndexTest-seed#[A34733627ED746C4]-worker) [     ] o.e.j.s.AbstractConnector Stopped ServerConnector@8f602cc{HTTP/1.1, (http/1.1)}{localhost.localdomain:0}
   [junit4]   2> 172310 INFO  (SUITE-CheckHdfsIndexTest-seed#[A34733627ED746C4]-worker) [     ] o.e.j.s.session node0 Stopped scavenging
   [junit4]   2> 172310 INFO  (SUITE-CheckHdfsIndexTest-seed#[A34733627ED746C4]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@20348b8b{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,UNAVAILABLE}
   [junit4]   2> NOTE: leaving temporary files on disk at: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J2/temp/solr.index.hdfs.CheckHdfsIndexTest_A34733627ED746C4-001
   [junit4]   2> Apr 30, 2020 1:57:10 AM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks
   [junit4]   2> WARNING: Will linger awaiting termination of 65 leaked thread(s).
   [junit4]   2> NOTE: test params are: codec=Asserting(Lucene84): {date=PostingsFormat(name=MockRandom), rnd_b=Lucene84, field=PostingsFormat(name=MockRandom), multiDefault=BlockTreeOrds(blocksize=128), docid=Lucene84, _root_=PostingsFormat(name=MockRandom), titleTokenized=FST50, id=Lucene84, body=PostingsFormat(name=MockRandom), title=BlockTreeOrds(blocksize=128)}, docValues:{range_facet_l_dv=DocValuesFormat(name=Lucene80), n_l1=DocValuesFormat(name=Asserting), intDefault=DocValuesFormat(name=Asserting), n_dt1=DocValuesFormat(name=Lucene80), n_td1=DocValuesFormat(name=Lucene80), n_d1=DocValuesFormat(name=Asserting), range_facet_l=DocValuesFormat(name=Asserting), n_f1=DocValuesFormat(name=Lucene80), n_ti1=DocValuesFormat(name=Lucene80), docid_intDV=DocValuesFormat(name=Direct), n_tl1=DocValuesFormat(name=Lucene80), _version_=DocValuesFormat(name=Asserting), n_tf1=DocValuesFormat(name=Asserting), n_tdt1=DocValuesFormat(name=Asserting), id_i1=DocValuesFormat(name=Lucene80), range_facet_i_dv=DocValuesFormat(name=Asserting), intDvoDefault=DocValuesFormat(name=Direct), titleDV=DocValuesFormat(name=Asserting), timestamp=DocValuesFormat(name=Asserting)}, maxPointsInLeafNode=1706, maxMBSortInHeap=5.039073755871414, sim=Asserting(org.apache.lucene.search.similarities.AssertingSimilarity@3d9a31e4), locale=en-AI, timezone=Canada/Yukon
   [junit4]   2> NOTE: Linux 5.3.0-46-generic amd64/AdoptOpenJDK 14 (64-bit)/cpus=16,threads=13,free=49599936,total=229638144
   [junit4]   2> NOTE: All tests run in this JVM: [CheckHdfsIndexTest]
   [junit4] Completed [5/5 (5!)] on J2 in 174.49s, 5 tests, 1 error, 1 skipped <<< FAILURES!

[...truncated 17 lines...]
BUILD FAILED
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/common-build.xml:1599: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/common-build.xml:1126: There were test failures: 5 suites, 25 tests, 5 errors, 5 ignored [seed: A34733627ED746C4]

Total time: 2 minutes 57 seconds

[repro] Setting last failure code to 256

[repro] Failures w/original seeds:
[repro]   5/5 failed: org.apache.solr.index.hdfs.CheckHdfsIndexTest
[repro] Exiting with code 256
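The [repro] steps above rerun the failing suite with the original randomized seed. For reference, a failure like this can usually be replayed locally with the standard Lucene/Solr ant test target and the seed printed in the log; the exact module directory and extra -D flags below are illustrative, not copied from this build:

```shell
# Hypothetical local reproduction sketch (run from solr/core in an 8.x checkout);
# test class and seed are taken from the log above, other flags are assumptions.
ant test -Dtestcase=CheckHdfsIndexTest -Dtests.seed=A34733627ED746C4 \
    -Dtests.multiplier=3 -Dtests.locale=en-AI -Dtests.timezone=Canada/Yukon
```

Because the suite failed 5/5 times with the original seed, the failure is likely seed-reproducible rather than a flaky timing issue.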
+ mv lucene/build lucene/build.repro
+ mv solr/build solr/build.repro
+ mv lucene/build.orig lucene/build
+ mv solr/build.orig solr/build
Archiving artifacts
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
[WARNINGS] Parsing warnings in console log with parser Java Compiler (javac)
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
No credentials specified
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
<Git Blamer> Using GitBlamer to create author and commit information for all warnings.
<Git Blamer> GIT_COMMIT=92a7d56ada9ff3213aad2e238868b00dc7f9be06, workspace=/home/jenkins/workspace/Lucene-Solr-8.x-Linux
[WARNINGS] Computing warning deltas based on reference build #2907
Recording test results
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Build step 'Publish JUnit test result report' changed build result to UNSTABLE
Email was triggered for: Unstable (Test Failures)
Sending email for trigger: Unstable (Test Failures)
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2

[JENKINS] Lucene-Solr-8.x-Linux (64bit/jdk1.8.0_201) - Build # 2914 - Still Failing!

Posted by Policeman Jenkins Server <je...@thetaphi.de>.
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-8.x-Linux/2914/
Java: 64bit/jdk1.8.0_201 -XX:-UseCompressedOops -XX:+UseSerialGC

1 tests failed.
FAILED:  org.apache.solr.index.hdfs.CheckHdfsIndexTest.doTest

Error Message:
Error from server at https://127.0.0.1:34717/x_/collection1: java.lang.NullPointerException

Stack Trace:
org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at https://127.0.0.1:34717/x_/collection1: java.lang.NullPointerException
	at org.apache.solr.handler.admin.SystemInfoHandler.getSecurityInfo(SystemInfoHandler.java:326)
	at org.apache.solr.handler.admin.SystemInfoHandler.handleRequestBody(SystemInfoHandler.java:146)
	at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:211)
	at org.apache.solr.core.SolrCore.execute(SolrCore.java:2600)
	at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:803)
	at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:582)
	at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:432)
	at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:362)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)
	at org.apache.solr.client.solrj.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:166)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:545)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1610)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1300)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1580)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1215)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:322)
	at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:767)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:500)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:547)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:273)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
	at org.eclipse.jetty.io.ssl.SslConnection$DecryptedEndPoint.onFillable(SslConnection.java:543)
	at org.eclipse.jetty.io.ssl.SslConnection.onFillable(SslConnection.java:398)
	at org.eclipse.jetty.io.ssl.SslConnection$2.succeeded(SslConnection.java:161)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
	at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938)
	at java.lang.Thread.run(Thread.java:748)

	at __randomizedtesting.SeedInfo.seed([55468B90EE9CF2FC:F20233348327E145]:0)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:665)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:265)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:248)
	at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:211)
	at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:1003)
	at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:1018)
	at org.apache.solr.index.hdfs.CheckHdfsIndexTest.doTest(CheckHdfsIndexTest.java:120)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988)
	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1081)
	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:1053)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
	at java.lang.Thread.run(Thread.java:748)
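The server-side NullPointerException originates in SystemInfoHandler.getSecurityInfo (line 326), reached from a query issued by CheckHdfsIndexTest. One plausible failure mode for an NPE in a handler like this is dereferencing a request attribute that can legitimately be null (for example, security context on an internally generated request). The sketch below is purely illustrative of that pattern and the null-guard fix; the class and method names mimic the stack trace but are not Solr's actual implementation:

```java
// Hypothetical sketch of the suspected failure mode: a handler dereferences
// a possibly-null per-request object without a guard. Names are illustrative
// only and do not reproduce org.apache.solr.handler.admin.SystemInfoHandler.
public class SecurityInfoSketch {

    // Returns a description of the security context; the argument may be
    // null when no HTTP call is associated with the request.
    static String getSecurityInfo(Object httpCall) {
        if (httpCall == null) {
            // Guarded path: previously this would have thrown
            // NullPointerException, as seen in the stack trace above.
            return "(no HTTP call associated with this request)";
        }
        return httpCall.toString();
    }

    public static void main(String[] args) {
        // A null call now takes the guarded path instead of throwing.
        System.out.println(getSecurityInfo(null));
        System.out.println(getSecurityInfo("authenticated-user"));
    }
}
```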




Build Log:
[...truncated 15169 lines...]
   [junit4] Suite: org.apache.solr.index.hdfs.CheckHdfsIndexTest
   [junit4]   2> 440026 INFO  (SUITE-CheckHdfsIndexTest-seed#[55468B90EE9CF2FC]-worker) [     ] o.a.s.SolrTestCase Setting 'solr.default.confdir' system property to test-framework derived value of '/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/server/solr/configsets/_default/conf'
   [junit4]   2> 440026 INFO  (SUITE-CheckHdfsIndexTest-seed#[55468B90EE9CF2FC]-worker) [     ] o.a.s.SolrTestCaseJ4 SecureRandom sanity checks: test.solr.allowed.securerandom=null & java.security.egd=file:/dev/./urandom
   [junit4]   2> 440026 INFO  (SUITE-CheckHdfsIndexTest-seed#[55468B90EE9CF2FC]-worker) [     ] o.a.s.SolrTestCaseJ4 Created dataDir: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_55468B90EE9CF2FC-001/data-dir-60-001
   [junit4]   2> 440026 WARN  (SUITE-CheckHdfsIndexTest-seed#[55468B90EE9CF2FC]-worker) [     ] o.a.s.SolrTestCaseJ4 startTrackingSearchers: numOpens=3 numCloses=3
   [junit4]   2> 440027 INFO  (SUITE-CheckHdfsIndexTest-seed#[55468B90EE9CF2FC]-worker) [     ] o.a.s.SolrTestCaseJ4 Using PointFields (NUMERIC_POINTS_SYSPROP=true) w/NUMERIC_DOCVALUES_SYSPROP=true
   [junit4]   2> 440027 INFO  (SUITE-CheckHdfsIndexTest-seed#[55468B90EE9CF2FC]-worker) [     ] o.a.s.SolrTestCaseJ4 Randomized ssl (true) and clientAuth (false) via: @org.apache.solr.util.RandomizeSSL(reason=, ssl=NaN, value=NaN, clientAuth=NaN)
   [junit4]   2> 440028 INFO  (SUITE-CheckHdfsIndexTest-seed#[55468B90EE9CF2FC]-worker) [     ] o.a.s.BaseDistributedSearchTestCase Setting hostContext system property: /x_/
   [junit4]   1> Formatting using clusterid: testClusterID
   [junit4]   2> 440812 WARN  (SUITE-CheckHdfsIndexTest-seed#[55468B90EE9CF2FC]-worker) [     ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
   [junit4]   2> 440834 INFO  (SUITE-CheckHdfsIndexTest-seed#[55468B90EE9CF2FC]-worker) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 1.8.0_201-b09
   [junit4]   2> 440837 INFO  (SUITE-CheckHdfsIndexTest-seed#[55468B90EE9CF2FC]-worker) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 440837 INFO  (SUITE-CheckHdfsIndexTest-seed#[55468B90EE9CF2FC]-worker) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 440837 INFO  (SUITE-CheckHdfsIndexTest-seed#[55468B90EE9CF2FC]-worker) [     ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 440839 INFO  (SUITE-CheckHdfsIndexTest-seed#[55468B90EE9CF2FC]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@3f158e02{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,AVAILABLE}
   [junit4]   2> 440983 INFO  (SUITE-CheckHdfsIndexTest-seed#[55468B90EE9CF2FC]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.w.WebAppContext@72949340{hdfs,/,file:///home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/jetty-localhost_localdomain-37501-hadoop-hdfs-3_2_0-tests_jar-_-any-8955337851763521077.dir/webapp/,AVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/hdfs}
   [junit4]   2> 440983 INFO  (SUITE-CheckHdfsIndexTest-seed#[55468B90EE9CF2FC]-worker) [     ] o.e.j.s.AbstractConnector Started ServerConnector@81b6d25{HTTP/1.1, (http/1.1)}{localhost.localdomain:37501}
   [junit4]   2> 440983 INFO  (SUITE-CheckHdfsIndexTest-seed#[55468B90EE9CF2FC]-worker) [     ] o.e.j.s.Server Started @441006ms
   [junit4]   2> 441443 WARN  (StorageLocationChecker thread 0) [     ] o.a.h.u.NativeCodeLoader Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
   [junit4]   2> 441487 WARN  (SUITE-CheckHdfsIndexTest-seed#[55468B90EE9CF2FC]-worker) [     ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
   [junit4]   2> 441491 INFO  (SUITE-CheckHdfsIndexTest-seed#[55468B90EE9CF2FC]-worker) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 1.8.0_201-b09
   [junit4]   2> 441491 INFO  (SUITE-CheckHdfsIndexTest-seed#[55468B90EE9CF2FC]-worker) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 441491 INFO  (SUITE-CheckHdfsIndexTest-seed#[55468B90EE9CF2FC]-worker) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 441491 INFO  (SUITE-CheckHdfsIndexTest-seed#[55468B90EE9CF2FC]-worker) [     ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 441491 INFO  (SUITE-CheckHdfsIndexTest-seed#[55468B90EE9CF2FC]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@1668ccc9{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,AVAILABLE}
   [junit4]   2> 441584 INFO  (SUITE-CheckHdfsIndexTest-seed#[55468B90EE9CF2FC]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.w.WebAppContext@65dbf20d{datanode,/,file:///home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/jetty-localhost-33353-hadoop-hdfs-3_2_0-tests_jar-_-any-3156179638796085029.dir/webapp/,AVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/datanode}
   [junit4]   2> 441585 INFO  (SUITE-CheckHdfsIndexTest-seed#[55468B90EE9CF2FC]-worker) [     ] o.e.j.s.AbstractConnector Started ServerConnector@3d7bb401{HTTP/1.1, (http/1.1)}{localhost:33353}
   [junit4]   2> 441585 INFO  (SUITE-CheckHdfsIndexTest-seed#[55468B90EE9CF2FC]-worker) [     ] o.e.j.s.Server Started @441608ms
   [junit4]   2> 442365 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0x6ddd7658fb0d9840: Processing first storage report for DS-ca3b7716-19e5-4197-8aa9-44ce2ff71235 from datanode 57250380-5cb2-4f47-a281-4148b1d0d29c
   [junit4]   2> 442367 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0x6ddd7658fb0d9840: from storage DS-ca3b7716-19e5-4197-8aa9-44ce2ff71235 node DatanodeRegistration(127.0.0.1:34849, datanodeUuid=57250380-5cb2-4f47-a281-4148b1d0d29c, infoPort=35401, infoSecurePort=0, ipcPort=33369, storageInfo=lv=-57;cid=testClusterID;nsid=1588236702;c=1588203674110), blocks: 0, hasStaleStorage: true, processing time: 2 msecs, invalidatedBlocks: 0
   [junit4]   2> 442367 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0x6ddd7658fb0d9840: Processing first storage report for DS-2bd611e0-6cf0-45b4-8ba7-b0044e82edc4 from datanode 57250380-5cb2-4f47-a281-4148b1d0d29c
   [junit4]   2> 442367 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0x6ddd7658fb0d9840: from storage DS-2bd611e0-6cf0-45b4-8ba7-b0044e82edc4 node DatanodeRegistration(127.0.0.1:34849, datanodeUuid=57250380-5cb2-4f47-a281-4148b1d0d29c, infoPort=35401, infoSecurePort=0, ipcPort=33369, storageInfo=lv=-57;cid=testClusterID;nsid=1588236702;c=1588203674110), blocks: 0, hasStaleStorage: false, processing time: 0 msecs, invalidatedBlocks: 0
   [junit4]   2> 442446 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.ZkTestServer STARTING ZK TEST SERVER
   [junit4]   2> 442446 INFO  (ZkTestServer Run Thread) [     ] o.a.s.c.ZkTestServer client port:0.0.0.0/0.0.0.0:0
   [junit4]   2> 442446 INFO  (ZkTestServer Run Thread) [     ] o.a.s.c.ZkTestServer Starting server
   [junit4]   2> 442546 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.ZkTestServer start zk server on port:38135
   [junit4]   2> 442546 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.ZkTestServer waitForServerUp: 127.0.0.1:38135
   [junit4]   2> 442546 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.ZkTestServer parse host and port list: 127.0.0.1:38135
   [junit4]   2> 442546 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.ZkTestServer connecting to 127.0.0.1 38135
   [junit4]   2> 442548 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 442550 INFO  (zkConnectionManagerCallback-6075-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 442550 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 442554 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 442555 INFO  (zkConnectionManagerCallback-6077-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 442555 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 442556 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/solrconfig-tlog.xml to /configs/conf1/solrconfig.xml
   [junit4]   2> 442557 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/schema.xml to /configs/conf1/schema.xml
   [junit4]   2> 442558 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/solrconfig.snippet.randomindexconfig.xml to /configs/conf1/solrconfig.snippet.randomindexconfig.xml
   [junit4]   2> 442563 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/stopwords.txt to /configs/conf1/stopwords.txt
   [junit4]   2> 442563 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/protwords.txt to /configs/conf1/protwords.txt
   [junit4]   2> 442564 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/currency.xml to /configs/conf1/currency.xml
   [junit4]   2> 442564 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/enumsConfig.xml to /configs/conf1/enumsConfig.xml
   [junit4]   2> 442567 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/open-exchange-rates.json to /configs/conf1/open-exchange-rates.json
   [junit4]   2> 442567 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/mapping-ISOLatin1Accent.txt to /configs/conf1/mapping-ISOLatin1Accent.txt
   [junit4]   2> 442568 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/old_synonyms.txt to /configs/conf1/old_synonyms.txt
   [junit4]   2> 442573 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/synonyms.txt to /configs/conf1/synonyms.txt
   [junit4]   2> 442576 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 442577 INFO  (zkConnectionManagerCallback-6081-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 442577 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 442678 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Will use NRT replicas unless explicitly asked otherwise
   [junit4]   2> 442743 WARN  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 442743 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 442743 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ...
   [junit4]   2> 442743 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 1.8.0_201-b09
   [junit4]   2> 442746 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 442746 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 442746 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.e.j.s.session node0 Scavenging every 600000ms
   [junit4]   2> 442747 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@16292686{/x_,null,AVAILABLE}
   [junit4]   2> 442748 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.e.j.s.AbstractConnector Started ServerConnector@412dbbb5{SSL, (ssl, http/1.1)}{127.0.0.1:44915}
   [junit4]   2> 442748 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.e.j.s.Server Started @442771ms
   [junit4]   2> 442748 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {solr.data.dir=hdfs://localhost.localdomain:33357/hdfs__localhost.localdomain_33357__home_jenkins_workspace_Lucene-Solr-8.x-Linux_solr_build_solr-core_test_J3_temp_solr.index.hdfs.CheckHdfsIndexTest_55468B90EE9CF2FC-001_tempDir-002_control_data, hostContext=/x_, hostPort=44915, coreRootDirectory=/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_55468B90EE9CF2FC-001/control-001/cores}
   [junit4]   2> 442748 ERROR (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 442748 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 442748 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 8.6.0
   [junit4]   2> 442748 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 442748 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr
   [junit4]   2> 442748 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-04-29T23:41:16.281Z
   [junit4]   2> 442750 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 442751 INFO  (zkConnectionManagerCallback-6083-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 442751 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 442852 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 442852 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.SolrXmlConfig Loading container configuration from /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_55468B90EE9CF2FC-001/control-001/solr.xml
   [junit4]   2> 442855 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 442855 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 442856 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@22ff6c08, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 442936 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=false]
   [junit4]   2> 442936 WARN  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.s.i.Http2SolrClient Create Http2SolrClient with HTTP/1.1 transport since Java 8 or lower versions does not support SSL + HTTP/2
   [junit4]   2> 442938 WARN  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@9f1bafb[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 442938 WARN  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@9f1bafb[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 442943 WARN  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.s.i.Http2SolrClient Create Http2SolrClient with HTTP/1.1 transport since Java 8 or lower versions does not support SSL + HTTP/2
   [junit4]   2> 442944 WARN  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@4f60bf8[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 442944 WARN  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@4f60bf8[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 442945 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:38135/solr
   [junit4]   2> 442946 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 442947 INFO  (zkConnectionManagerCallback-6094-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 442947 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 443048 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [n:127.0.0.1:44915_x_     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 443049 INFO  (zkConnectionManagerCallback-6096-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 443049 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [n:127.0.0.1:44915_x_     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 443073 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [n:127.0.0.1:44915_x_     ] o.a.s.c.OverseerElectionContext I am going to be the leader 127.0.0.1:44915_x_
   [junit4]   2> 443073 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [n:127.0.0.1:44915_x_     ] o.a.s.c.Overseer Overseer (id=72184635917336581-127.0.0.1:44915_x_-n_0000000000) starting
   [junit4]   2> 443075 INFO  (OverseerStateUpdate-72184635917336581-127.0.0.1:44915_x_-n_0000000000) [n:127.0.0.1:44915_x_     ] o.a.s.c.Overseer Starting to work on the main queue : 127.0.0.1:44915_x_
   [junit4]   2> 443075 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [n:127.0.0.1:44915_x_     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:44915_x_
   [junit4]   2> 443076 INFO  (zkCallback-6095-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 443076 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [n:127.0.0.1:44915_x_     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 443077 WARN  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [n:127.0.0.1:44915_x_     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info.  Solr is only as secure as you make it.
   [junit4]   2> 443085 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [n:127.0.0.1:44915_x_     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 443098 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [n:127.0.0.1:44915_x_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@22ff6c08
   [junit4]   2> 443102 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [n:127.0.0.1:44915_x_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@22ff6c08
   [junit4]   2> 443102 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [n:127.0.0.1:44915_x_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@22ff6c08
   [junit4]   2> 443103 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [n:127.0.0.1:44915_x_     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_55468B90EE9CF2FC-001/control-001/cores
   [junit4]   2> 443114 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 443115 INFO  (zkConnectionManagerCallback-6113-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 443115 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 443116 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 443116 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:38135/solr ready
   [junit4]   2> 443138 INFO  (qtp17828847-11285) [n:127.0.0.1:44915_x_     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params collection.configName=conf1&name=control_collection&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:44915_x_&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 443143 INFO  (OverseerThreadFactory-6103-thread-1-processing-n:127.0.0.1:44915_x_) [n:127.0.0.1:44915_x_     ] o.a.s.c.a.c.CreateCollectionCmd Create collection control_collection
   [junit4]   2> 443248 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 443249 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 443256 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_    x:control_collection_shard1_replica_n1 ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&collection=control_collection&shard=shard1&wt=javabin&version=2&replicaType=NRT
   [junit4]   2> 443256 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_    x:control_collection_shard1_replica_n1 ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 444265 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 8.6.0
   [junit4]   2> 444288 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.IndexSchema Schema name=test
   [junit4]   2> 444357 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
   [junit4]   2> 444362 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.CoreContainer Creating SolrCore 'control_collection_shard1_replica_n1' using configuration from configset conf1, trusted=true
   [junit4]   2> 444363 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.control_collection.shard1.replica_n1' (registry 'solr.core.control_collection.shard1.replica_n1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@22ff6c08
   [junit4]   2> 444366 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory solr.hdfs.home=hdfs://localhost.localdomain:33357/solr_hdfs_home
   [junit4]   2> 444366 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Solr Kerberos Authentication disabled
   [junit4]   2> 444366 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrCore [[control_collection_shard1_replica_n1] ] Opening new SolrCore at [/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_55468B90EE9CF2FC-001/control-001/cores/control_collection_shard1_replica_n1], dataDir=[hdfs://localhost.localdomain:33357/solr_hdfs_home/control_collection/core_node2/data/]
   [junit4]   2> 444367 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:33357/solr_hdfs_home/control_collection/core_node2/data/snapshot_metadata
   [junit4]   2> 444375 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 444375 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [4194304] will allocate [1] slabs and use ~[4194304] bytes
   [junit4]   2> 444375 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Creating new single instance HDFS BlockCache
   [junit4]   2> 444461 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 444465 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:33357/solr_hdfs_home/control_collection/core_node2/data
   [junit4]   2> 444485 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:33357/solr_hdfs_home/control_collection/core_node2/data/index
   [junit4]   2> 444489 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 444489 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [4194304] will allocate [1] slabs and use ~[4194304] bytes
   [junit4]   2> 444489 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Creating new single instance HDFS BlockCache
   [junit4]   2> 444493 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 444494 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=10, maxMergeAtOnceExplicit=22, maxMergedSegmentMB=20.95703125, floorSegmentMB=0.544921875, forceMergeDeletesPctAllowed=21.1746934873665, segmentsPerTier=23.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=1.0, deletesPctAllowed=27.960494849315776]
   [junit4]   2> 445073 WARN  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 445113 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.HdfsUpdateLog
   [junit4]   2> 445113 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 445113 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.HdfsUpdateLog Initializing HdfsUpdateLog: tlogDfsReplication=2
   [junit4]   2> 445124 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 445124 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 445127 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogDocMergePolicy: [LogDocMergePolicy: minMergeSize=1000, mergeFactor=29, maxMergeSize=9223372036854775807, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=true, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.40373772706721056]
   [junit4]   2> 445190 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.SolrIndexSearcher Opening [Searcher@52283504[control_collection_shard1_replica_n1] main]
   [junit4]   2> 445191 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 445191 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 445193 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms
   [junit4]   2> 445194 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1665352260624842752
   [junit4]   2> 445196 INFO  (searcherExecutor-6115-thread-1-processing-n:127.0.0.1:44915_x_ x:control_collection_shard1_replica_n1 c:control_collection s:shard1) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrCore [control_collection_shard1_replica_n1] Registered new searcher Searcher@52283504[control_collection_shard1_replica_n1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 445197 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ZkShardTerms Successful update of terms at /collections/control_collection/terms/shard1 to Terms{values={core_node2=0}, version=0}
   [junit4]   2> 445197 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/control_collection/leaders/shard1
   [junit4]   2> 445198 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 445198 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 445198 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync replicas to https://127.0.0.1:44915/x_/control_collection_shard1_replica_n1/
   [junit4]   2> 445198 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 445199 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy https://127.0.0.1:44915/x_/control_collection_shard1_replica_n1/ has no replicas
   [junit4]   2> 445199 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/control_collection/leaders/shard1/leader after winning as /collections/control_collection/leader_elect/shard1/election/72184635917336581-core_node2-n_0000000000
   [junit4]   2> 445199 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I am the new leader: https://127.0.0.1:44915/x_/control_collection_shard1_replica_n1/ shard1
   [junit4]   2> 445300 INFO  (zkCallback-6095-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 445300 INFO  (zkCallback-6095-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 445300 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 445303 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf1&newCollection=true&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&collection=control_collection&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=2047
   [junit4]   2> 445304 INFO  (qtp17828847-11285) [n:127.0.0.1:44915_x_     ] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas
   [junit4]   2> 445402 INFO  (zkCallback-6095-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 445402 INFO  (zkCallback-6095-thread-3) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 445402 INFO  (zkCallback-6095-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 445402 INFO  (qtp17828847-11285) [n:127.0.0.1:44915_x_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=control_collection&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:44915_x_&wt=javabin&version=2} status=0 QTime=2265
   [junit4]   2> 445402 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Waiting to see 1 active replicas in collection: control_collection
   [junit4]   2> 445506 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 445507 INFO  (zkConnectionManagerCallback-6124-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 445507 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 445508 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 445508 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:38135/solr ready
   [junit4]   2> 445508 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.ChaosMonkey monkey: init - expire sessions:false cause connection loss:false
   [junit4]   2> 445510 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params collection.configName=conf1&name=collection1&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=&stateFormat=2&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 445515 INFO  (OverseerThreadFactory-6103-thread-2-processing-n:127.0.0.1:44915_x_) [n:127.0.0.1:44915_x_     ] o.a.s.c.a.c.CreateCollectionCmd Create collection collection1
   [junit4]   2> 445515 INFO  (OverseerCollectionConfigSetProcessor-72184635917336581-127.0.0.1:44915_x_-n_0000000000) [n:127.0.0.1:44915_x_     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000000 doesn't exist. Requestor may have disconnected from ZooKeeper
   [junit4]   2> 445716 WARN  (OverseerThreadFactory-6103-thread-2-processing-n:127.0.0.1:44915_x_) [n:127.0.0.1:44915_x_     ] o.a.s.c.a.c.CreateCollectionCmd It is unusual to create a collection (collection1) without cores.
   [junit4]   2> 445717 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_     ] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas
   [junit4]   2> 445717 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=collection1&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=&stateFormat=2&wt=javabin&version=2} status=0 QTime=207
   [junit4]   2> 445718 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 445718 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 445718 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 445718 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 445718 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 445718 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 445718 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Creating jetty instances pullReplicaCount=0 numOtherReplicas=1
   [junit4]   2> 445776 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 1 in directory /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_55468B90EE9CF2FC-001/shard-1-001 of type NRT for shard1
   [junit4]   2> 445778 WARN  (closeThreadPool-6125-thread-1) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 445778 INFO  (closeThreadPool-6125-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 445778 INFO  (closeThreadPool-6125-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ...
   [junit4]   2> 445778 INFO  (closeThreadPool-6125-thread-1) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 1.8.0_201-b09
   [junit4]   2> 445780 INFO  (closeThreadPool-6125-thread-1) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 445780 INFO  (closeThreadPool-6125-thread-1) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 445781 INFO  (closeThreadPool-6125-thread-1) [     ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 445786 INFO  (closeThreadPool-6125-thread-1) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@3518fc06{/x_,null,AVAILABLE}
   [junit4]   2> 445787 INFO  (closeThreadPool-6125-thread-1) [     ] o.e.j.s.AbstractConnector Started ServerConnector@3e9ec76f{SSL, (ssl, http/1.1)}{127.0.0.1:43257}
   [junit4]   2> 445787 INFO  (closeThreadPool-6125-thread-1) [     ] o.e.j.s.Server Started @445810ms
   [junit4]   2> 445787 INFO  (closeThreadPool-6125-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {solr.data.dir=hdfs://localhost.localdomain:33357/hdfs__localhost.localdomain_33357__home_jenkins_workspace_Lucene-Solr-8.x-Linux_solr_build_solr-core_test_J3_temp_solr.index.hdfs.CheckHdfsIndexTest_55468B90EE9CF2FC-001_tempDir-002_jetty1, replicaType=NRT, solrconfig=solrconfig.xml, hostContext=/x_, hostPort=43257, coreRootDirectory=/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_55468B90EE9CF2FC-001/shard-1-001/cores}
   [junit4]   2> 445787 ERROR (closeThreadPool-6125-thread-1) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 445787 INFO  (closeThreadPool-6125-thread-1) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 445787 INFO  (closeThreadPool-6125-thread-1) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 8.6.0
   [junit4]   2> 445787 INFO  (closeThreadPool-6125-thread-1) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 445787 INFO  (closeThreadPool-6125-thread-1) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr
   [junit4]   2> 445787 INFO  (closeThreadPool-6125-thread-1) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-04-29T23:41:19.320Z
   [junit4]   2> 445790 INFO  (closeThreadPool-6125-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 445791 INFO  (zkConnectionManagerCallback-6127-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 445791 INFO  (closeThreadPool-6125-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 445892 INFO  (closeThreadPool-6125-thread-1) [     ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 445892 INFO  (closeThreadPool-6125-thread-1) [     ] o.a.s.c.SolrXmlConfig Loading container configuration from /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_55468B90EE9CF2FC-001/shard-1-001/solr.xml
   [junit4]   2> 445894 INFO  (closeThreadPool-6125-thread-1) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 445894 INFO  (closeThreadPool-6125-thread-1) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 445912 INFO  (closeThreadPool-6125-thread-1) [     ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@22ff6c08, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 446041 INFO  (closeThreadPool-6125-thread-1) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=false]
   [junit4]   2> 446042 WARN  (closeThreadPool-6125-thread-1) [     ] o.a.s.c.s.i.Http2SolrClient Create Http2SolrClient with HTTP/1.1 transport since Java 8 or lower versions does not support SSL + HTTP/2
   [junit4]   2> 446042 WARN  (closeThreadPool-6125-thread-1) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@167d69c9[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 446042 WARN  (closeThreadPool-6125-thread-1) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@167d69c9[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 446046 WARN  (closeThreadPool-6125-thread-1) [     ] o.a.s.c.s.i.Http2SolrClient Create Http2SolrClient with HTTP/1.1 transport since Java 8 or lower versions does not support SSL + HTTP/2
   [junit4]   2> 446047 WARN  (closeThreadPool-6125-thread-1) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@1fb7441f[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 446047 WARN  (closeThreadPool-6125-thread-1) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@1fb7441f[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 446048 INFO  (closeThreadPool-6125-thread-1) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:38135/solr
   [junit4]   2> 446050 INFO  (closeThreadPool-6125-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 446051 INFO  (zkConnectionManagerCallback-6138-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 446051 INFO  (closeThreadPool-6125-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 446152 INFO  (closeThreadPool-6125-thread-1) [n:127.0.0.1:43257_x_     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 446153 INFO  (zkConnectionManagerCallback-6140-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 446153 INFO  (closeThreadPool-6125-thread-1) [n:127.0.0.1:43257_x_     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 446157 INFO  (closeThreadPool-6125-thread-1) [n:127.0.0.1:43257_x_     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 446159 INFO  (closeThreadPool-6125-thread-1) [n:127.0.0.1:43257_x_     ] o.a.s.c.ZkController Publish node=127.0.0.1:43257_x_ as DOWN
   [junit4]   2> 446159 INFO  (closeThreadPool-6125-thread-1) [n:127.0.0.1:43257_x_     ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 446159 INFO  (closeThreadPool-6125-thread-1) [n:127.0.0.1:43257_x_     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:43257_x_
   [junit4]   2> 446160 INFO  (zkCallback-6139-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 446160 INFO  (zkCallback-6095-thread-2) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 446161 INFO  (zkCallback-6123-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 446161 INFO  (closeThreadPool-6125-thread-1) [n:127.0.0.1:43257_x_     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 446161 WARN  (closeThreadPool-6125-thread-1) [n:127.0.0.1:43257_x_     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info
   [junit4]   2> 446170 INFO  (closeThreadPool-6125-thread-1) [n:127.0.0.1:43257_x_     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 446184 INFO  (closeThreadPool-6125-thread-1) [n:127.0.0.1:43257_x_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@22ff6c08
   [junit4]   2> 446189 INFO  (closeThreadPool-6125-thread-1) [n:127.0.0.1:43257_x_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@22ff6c08
   [junit4]   2> 446189 INFO  (closeThreadPool-6125-thread-1) [n:127.0.0.1:43257_x_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@22ff6c08
   [junit4]   2> 446190 INFO  (closeThreadPool-6125-thread-1) [n:127.0.0.1:43257_x_     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_55468B90EE9CF2FC-001/shard-1-001/cores
   [junit4]   2> 446196 INFO  (closeThreadPool-6125-thread-1) [     ] o.a.s.c.AbstractFullDistribZkTestBase waitForLiveNode: 127.0.0.1:43257_x_
   [junit4]   2> 446208 INFO  (qtp2141441364-11348) [n:127.0.0.1:43257_x_     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :addreplica with params node=127.0.0.1:43257_x_&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 446209 INFO  (OverseerCollectionConfigSetProcessor-72184635917336581-127.0.0.1:44915_x_-n_0000000000) [n:127.0.0.1:44915_x_     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000002 doesn't exist. Requestor may have disconnected from ZooKeeper
   [junit4]   2> 446215 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 446220 INFO  (qtp17828847-11285) [n:127.0.0.1:44915_x_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={wt=javabin&version=2&key=solr.core.control_collection.shard1.replica_n1:INDEX.sizeInBytes} status=0 QTime=4
   [junit4]   2> 446222 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 446223 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 446227 INFO  (qtp17828847-11285) [n:127.0.0.1:44915_x_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={wt=javabin&version=2&key=solr.core.control_collection.shard1.replica_n1:INDEX.sizeInBytes} status=0 QTime=2
   [junit4]   2> 446228 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 446229 INFO  (OverseerThreadFactory-6103-thread-3-processing-n:127.0.0.1:44915_x_) [n:127.0.0.1:44915_x_ c:collection1 s:shard1   ] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:43257_x_ for creating new replica of shard shard1 for collection collection1
   [junit4]   2> 446230 INFO  (OverseerThreadFactory-6103-thread-3-processing-n:127.0.0.1:44915_x_) [n:127.0.0.1:44915_x_ c:collection1 s:shard1   ] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command.
   [junit4]   2> 446247 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_    x:collection1_shard1_replica_n1 ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&name=collection1_shard1_replica_n1&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT
   [junit4]   2> 447256 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 8.6.0
   [junit4]   2> 447265 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.IndexSchema Schema name=test
   [junit4]   2> 447636 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
   [junit4]   2> 447644 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.CoreContainer Creating SolrCore 'collection1_shard1_replica_n1' using configuration from configset conf1, trusted=true
   [junit4]   2> 447645 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.collection1.shard1.replica_n1' (registry 'solr.core.collection1.shard1.replica_n1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@22ff6c08
   [junit4]   2> 447645 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory solr.hdfs.home=hdfs://localhost.localdomain:33357/solr_hdfs_home
   [junit4]   2> 447645 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Solr Kerberos Authentication disabled
   [junit4]   2> 447645 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrCore [[collection1_shard1_replica_n1] ] Opening new SolrCore at [/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_55468B90EE9CF2FC-001/shard-1-001/cores/collection1_shard1_replica_n1], dataDir=[hdfs://localhost.localdomain:33357/solr_hdfs_home/collection1/core_node2/data/]
   [junit4]   2> 447646 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:33357/solr_hdfs_home/collection1/core_node2/data/snapshot_metadata
   [junit4]   2> 447652 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 447652 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [4194304] will allocate [1] slabs and use ~[4194304] bytes
   [junit4]   2> 447653 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Creating new single instance HDFS BlockCache
   [junit4]   2> 447660 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 447662 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:33357/solr_hdfs_home/collection1/core_node2/data
   [junit4]   2> 447683 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:33357/solr_hdfs_home/collection1/core_node2/data/index
   [junit4]   2> 447689 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 447689 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [4194304] will allocate [1] slabs and use ~[4194304] bytes
   [junit4]   2> 447689 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Creating new single instance HDFS BlockCache
   [junit4]   2> 447694 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 447694 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=10, maxMergeAtOnceExplicit=22, maxMergedSegmentMB=20.95703125, floorSegmentMB=0.544921875, forceMergeDeletesPctAllowed=21.1746934873665, segmentsPerTier=23.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=1.0, deletesPctAllowed=27.960494849315776]
   [junit4]   2> 448114 WARN  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 448144 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.HdfsUpdateLog
   [junit4]   2> 448144 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 448144 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.HdfsUpdateLog Initializing HdfsUpdateLog: tlogDfsReplication=2
   [junit4]   2> 448154 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 448154 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 448155 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogDocMergePolicy: [LogDocMergePolicy: minMergeSize=1000, mergeFactor=29, maxMergeSize=9223372036854775807, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=true, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.40373772706721056]
   [junit4]   2> 448162 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.SolrIndexSearcher Opening [Searcher@796c717d[collection1_shard1_replica_n1] main]
   [junit4]   2> 448163 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 448163 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 448164 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms
   [junit4]   2> 448164 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1665352263739113472
   [junit4]   2> 448165 INFO  (searcherExecutor-6151-thread-1-processing-n:127.0.0.1:43257_x_ x:collection1_shard1_replica_n1 c:collection1 s:shard1) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrCore [collection1_shard1_replica_n1] Registered new searcher Searcher@796c717d[collection1_shard1_replica_n1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 448167 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection1/terms/shard1 to Terms{values={core_node2=0}, version=0}
   [junit4]   2> 448167 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/collection1/leaders/shard1
   [junit4]   2> 448168 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 448168 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 448168 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync replicas to https://127.0.0.1:43257/x_/collection1_shard1_replica_n1/
   [junit4]   2> 448169 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 448169 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SyncStrategy https://127.0.0.1:43257/x_/collection1_shard1_replica_n1/ has no replicas
   [junit4]   2> 448169 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/collection1/leaders/shard1/leader after winning as /collections/collection1/leader_elect/shard1/election/72184635917336586-core_node2-n_0000000000
   [junit4]   2> 448169 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I am the new leader: https://127.0.0.1:43257/x_/collection1_shard1_replica_n1/ shard1
   [junit4]   2> 448270 INFO  (zkCallback-6139-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json] for collection [collection1] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 448270 INFO  (zkCallback-6139-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json] for collection [collection1] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 448271 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 448272 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf1&name=collection1_shard1_replica_n1&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=2025
   [junit4]   2> 448274 INFO  (qtp2141441364-11348) [n:127.0.0.1:43257_x_ c:collection1    ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:43257_x_&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2} status=0 QTime=2066
   [junit4]   2> 448274 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Waiting to see 1 active replicas in collection: collection1
   [junit4]   2> 448372 INFO  (zkCallback-6139-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json] for collection [collection1] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 448372 INFO  (zkCallback-6123-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json] for collection [collection1] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 448372 INFO  (zkCallback-6139-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json] for collection [collection1] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 448372 INFO  (zkCallback-6139-thread-3) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/collection1/state.json] for collection [collection1] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 448373 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.SolrTestCaseJ4 ###Starting testDeletedDocs
   [junit4]   2> 450210 INFO  (OverseerCollectionConfigSetProcessor-72184635917336581-127.0.0.1:44915_x_-n_0000000000) [n:127.0.0.1:44915_x_     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000004 doesn't exist. Requestor may have disconnected from ZooKeeper
   [junit4]   2> 503089 INFO  (qtp17828847-11285) [n:127.0.0.1:44915_x_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={wt=javabin&version=2&key=solr.core.control_collection.shard1.replica_n1:QUERY./select.requests&key=solr.core.control_collection.shard1.replica_n1:INDEX.sizeInBytes&key=solr.core.control_collection.shard1.replica_n1:UPDATE./update.requests} status=0 QTime=1
   [junit4]   2> 503090 INFO  (qtp17828847-11288) [n:127.0.0.1:44915_x_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={wt=javabin&version=2&key=solr.jvm:os.processCpuLoad&key=solr.node:CONTAINER.fs.coreRoot.usableSpace&key=solr.jvm:os.systemLoadAverage&key=solr.jvm:memory.heap.used} status=0 QTime=0
   [junit4]   2> 503093 INFO  (qtp2141441364-11351) [n:127.0.0.1:43257_x_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={wt=javabin&version=2&key=solr.core.collection1.shard1.replica_n1:QUERY./select.requests&key=solr.core.collection1.shard1.replica_n1:INDEX.sizeInBytes&key=solr.core.collection1.shard1.replica_n1:UPDATE./update.requests} status=0 QTime=1
   [junit4]   2> 503095 INFO  (qtp2141441364-11348) [n:127.0.0.1:43257_x_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={wt=javabin&version=2&key=solr.jvm:os.processCpuLoad&key=solr.node:CONTAINER.fs.coreRoot.usableSpace&key=solr.jvm:os.systemLoadAverage&key=solr.jvm:memory.heap.used} status=0 QTime=0
   [junit4]   2> 535611 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.s.h.HdfsDirectory Closing hdfs directory hdfs://localhost.localdomain:33357/solr
   [junit4]   2> 535611 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[55468B90EE9CF2FC]) [     ] o.a.s.SolrTestCaseJ4 ###Ending testDeletedDocs
   [junit4]   2> 535713 INFO  (closeThreadPool-6158-thread-2) [     ] o.a.s.c.CoreContainer Shutting down CoreContainer instance=130529925
   [junit4]   2> 535713 INFO  (closeThreadPool-6158-thread-2) [     ] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:44915_x_
   [junit4]   2> 535713 INFO  (closeThreadPool-6158-thread-2) [     ] o.a.s.c.ZkController Publish this node as DOWN...
   [junit4]   2> 535713 INFO  (closeThreadPool-6158-thread-2) [     ] o.a.s.c.ZkController Publish node=127.0.0.1:44915_x_ as DOWN
   [junit4]   2> 535715 INFO  (coreCloseExecutor-6163-thread-1) [n:127.0.0.1:44915_x_     ] o.a.s.c.SolrCore [control_collection_shard1_replica_n1]  CLOSING SolrCore org.apache.solr.core.SolrCore@24754165
   [junit4]   2> 535715 INFO  (coreCloseExecutor-6163-thread-1) [n:127.0.0.1:44915_x_     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.control_collection.shard1.replica_n1 tag=SolrCore@24754165
   [junit4]   2> 535715 INFO  (closeThreadPool-6158-thread-3) [     ] o.a.s.c.CoreContainer Shutting down CoreContainer instance=355453079
   [junit4]   2> 535715 INFO  (coreCloseExecutor-6163-thread-1) [n:127.0.0.1:44915_x_     ] o.a.s.m.r.SolrJmxReporter Closing reporter [org.apache.solr.metrics.reporters.SolrJmxReporter@4aaaa073: rootName = null, domain = solr.core.control_collection.shard1.replica_n1, service url = null, agent id = null] for registry solr.core.control_collection.shard1.replica_n1/com.codahale.metrics.MetricRegistry@6f3eaf9f
   [junit4]   2> 535715 INFO  (closeThreadPool-6158-thread-3) [     ] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:43257_x_
   [junit4]   2> 535715 INFO  (closeThreadPool-6158-thread-3) [     ] o.a.s.c.ZkController Publish this node as DOWN...
   [junit4]   2> 535715 INFO  (closeThreadPool-6158-thread-3) [     ] o.a.s.c.ZkController Publish node=127.0.0.1:43257_x_ as DOWN
   [junit4]   2> 535715 INFO  (zkCallback-6095-thread-5) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 535715 INFO  (zkCallback-6095-thread-6) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 535715 INFO  (zkCallback-6095-thread-7) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 535716 INFO  (coreCloseExecutor-6166-thread-1) [n:127.0.0.1:43257_x_     ] o.a.s.c.SolrCore [collection1_shard1_replica_n1]  CLOSING SolrCore org.apache.solr.core.SolrCore@6afb0426
   [junit4]   2> 535716 INFO  (coreCloseExecutor-6166-thread-1) [n:127.0.0.1:43257_x_     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.collection1.shard1.replica_n1 tag=SolrCore@6afb0426
   [junit4]   2> 535716 INFO  (coreCloseExecutor-6166-thread-1) [n:127.0.0.1:43257_x_     ] o.a.s.m.r.SolrJmxReporter Closing reporter [org.apache.solr.metrics.reporters.SolrJmxReporter@5ec1bf83: rootName = null, domain = solr.core.collection1.shard1.replica_n1, service url = null, agent id = null] for registry solr.core.collection1.shard1.replica_n1/com.codahale.metrics.MetricRegistry@3f51a51b
   [junit4]   2> 535723 INFO  (coreCloseExecutor-6163-thread-1) [n:127.0.0.1:44915_x_     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.control_collection.shard1.leader tag=SolrCore@24754165
   [junit4]   2> 535723 INFO  (coreCloseExecutor-6166-thread-1) [n:127.0.0.1:43257_x_     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.collection1.shard1.leader tag=SolrCore@6afb

[...truncated too long message...]

e limit:
   [junit4]   2> 
   [junit4]   2> 	21	/solr/overseer/queue
   [junit4]   2> 	12	/solr/overseer/collection-queue-work
   [junit4]   2> 	9	/solr/live_nodes
   [junit4]   2> 	4	/solr/collections
   [junit4]   2> 
   [junit4]   2> 766355 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.ZkTestServer waitForServerDown: 127.0.0.1:44765
   [junit4]   2> 766355 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.ZkTestServer parse host and port list: 127.0.0.1:44765
   [junit4]   2> 766355 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[55468B90EE9CF2FC]) [     ] o.a.s.c.ZkTestServer connecting to 127.0.0.1 44765
   [junit4]   2> 766356 WARN  (SUITE-CheckHdfsIndexTest-seed#[55468B90EE9CF2FC]-worker) [     ] o.a.h.h.s.d.DirectoryScanner DirectoryScanner: shutdown has been called
   [junit4]   2> 766371 INFO  (SUITE-CheckHdfsIndexTest-seed#[55468B90EE9CF2FC]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.w.WebAppContext@65dbf20d{datanode,/,null,UNAVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/datanode}
   [junit4]   2> 766371 INFO  (SUITE-CheckHdfsIndexTest-seed#[55468B90EE9CF2FC]-worker) [     ] o.e.j.s.AbstractConnector Stopped ServerConnector@3d7bb401{HTTP/1.1, (http/1.1)}{localhost:0}
   [junit4]   2> 766371 INFO  (SUITE-CheckHdfsIndexTest-seed#[55468B90EE9CF2FC]-worker) [     ] o.e.j.s.session node0 Stopped scavenging
   [junit4]   2> 766371 INFO  (SUITE-CheckHdfsIndexTest-seed#[55468B90EE9CF2FC]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@1668ccc9{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,UNAVAILABLE}
   [junit4]   2> 766372 WARN  (BP-609729490-127.0.0.1-1588203674110 heartbeating to localhost.localdomain/127.0.0.1:33357) [     ] o.a.h.h.s.d.IncrementalBlockReportManager IncrementalBlockReportManager interrupted
   [junit4]   2> 766372 WARN  (BP-609729490-127.0.0.1-1588203674110 heartbeating to localhost.localdomain/127.0.0.1:33357) [     ] o.a.h.h.s.d.DataNode Ending block pool service for: Block pool BP-609729490-127.0.0.1-1588203674110 (Datanode Uuid 57250380-5cb2-4f47-a281-4148b1d0d29c) service to localhost.localdomain/127.0.0.1:33357
   [junit4]   2> 766387 INFO  (SUITE-CheckHdfsIndexTest-seed#[55468B90EE9CF2FC]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.w.WebAppContext@72949340{hdfs,/,null,UNAVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/hdfs}
   [junit4]   2> 766388 INFO  (SUITE-CheckHdfsIndexTest-seed#[55468B90EE9CF2FC]-worker) [     ] o.e.j.s.AbstractConnector Stopped ServerConnector@81b6d25{HTTP/1.1, (http/1.1)}{localhost.localdomain:0}
   [junit4]   2> 766388 INFO  (SUITE-CheckHdfsIndexTest-seed#[55468B90EE9CF2FC]-worker) [     ] o.e.j.s.session node0 Stopped scavenging
   [junit4]   2> 766388 INFO  (SUITE-CheckHdfsIndexTest-seed#[55468B90EE9CF2FC]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@3f158e02{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,UNAVAILABLE}
   [junit4]   2> NOTE: leaving temporary files on disk at: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_55468B90EE9CF2FC-001
   [junit4]   2> Apr 29, 2020 11:46:39 PM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks
   [junit4]   2> WARNING: Will linger awaiting termination of 65 leaked thread(s).
   [junit4]   2> NOTE: test params are: codec=Lucene84, sim=Asserting(org.apache.lucene.search.similarities.AssertingSimilarity@12f66342), locale=vi-VN, timezone=Etc/GMT+4
   [junit4]   2> NOTE: Linux 5.3.0-46-generic amd64/Oracle Corporation 1.8.0_201 (64-bit)/cpus=16,threads=8,free=196215120,total=518979584
   [junit4]   2> NOTE: All tests run in this JVM: [TestAddFieldRealTimeGet, TestFilteredDocIdSet, TestStandardQParsers, TestAnalyzedSuggestions, TestPhraseSuggestions, RestartWhileUpdatingTest, TestFieldCacheVsDocValues, TestScoreJoinQPNoScore, CategoryRoutedAliasUpdateProcessorTest, LukeRequestHandlerTest, TestMaxTokenLenTokenizer, ChaosMonkeySafeLeaderTest, ExecutePlanActionTest, TestPseudoReturnFields, MetricsHistoryIntegrationTest, TestJsonFacetRefinement, SimpleCollectionCreateDeleteTest, NodeMarkersRegistrationTest, TestSweetSpotSimilarityFactory, TestJmxIntegration, TestCloudSearcherWarming, TestSimExtremeIndexing, AssignBackwardCompatibilityTest, TestDynamicLoading, BlobRepositoryCloudTest, SimpleMLTQParserTest, DistributedFacetPivotWhiteBoxTest, TestUniqueKeyFieldResource, MoreLikeThisHandlerTest, TestRebalanceLeaders, TestSimScenario, BitVectorTest, HdfsUnloadDistributedZkTest, BlockJoinFacetDistribTest, TriggerSetPropertiesIntegrationTest, JavabinLoaderTest, RecoveryZkTest, MaxSizeAutoCommitTest, TestSkipOverseerOperations, TestReplicaProperties, DirectoryFactoryTest, WordBreakSolrSpellCheckerTest, QueryParsingTest, AnalyticsQueryTest, TestConfigSets, HDFSCollectionsAPITest, TestCloudInspectUtil, TestRetrieveFieldsOptimizer, SolrLogAuditLoggerPluginTest, TestDistribPackageStore, PreAnalyzedFieldManagedSchemaCloudTest, TestSolrIndexConfig, DistribJoinFromCollectionTest, TestConfigReload, AdminHandlersProxyTest, RequestHandlersTest, TestNamedUpdateProcessors, TestSurroundQueryParser, TestImpersonationWithHadoopAuth, TestCopyFieldCollectionResource, TestCloudSchemaless, UniqFieldsUpdateProcessorFactoryTest, CheckHdfsIndexTest]
   [junit4] Completed [536/907 (1!)] on J3 in 329.48s, 5 tests, 1 error, 1 skipped <<< FAILURES!

[...truncated 39364 lines...]
-ecj-javadoc-lint-src:
    [mkdir] Created dir: /tmp/ecj948795482
 [ecj-lint] Compiling 931 source files to /tmp/ecj948795482
 [ecj-lint] ----------
 [ecj-lint] 1. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/codecs/CodecUtil.java (at line 523)
 [ecj-lint] 	throw new CorruptIndexException("misplaced codec footer (file truncated?): length=" + in.length() + " but footerLength==" + footerLength(), input);
 [ecj-lint] 	^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 [ecj-lint] Resource leak: 'in' is not closed at this location
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 2. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingStoredFieldsReader.java (at line 166)
 [ecj-lint] 	FieldsIndexReader fieldsIndexReader = new FieldsIndexReader(d, si.name, segmentSuffix, INDEX_EXTENSION_PREFIX, INDEX_CODEC_NAME, si.getId());
 [ecj-lint] 	                  ^^^^^^^^^^^^^^^^^
 [ecj-lint] Resource leak: 'fieldsIndexReader' is never closed
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 3. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingTermVectorsReader.java (at line 148)
 [ecj-lint] 	FieldsIndexReader fieldsIndexReader = new FieldsIndexReader(d, si.name, segmentSuffix, VECTORS_INDEX_EXTENSION_PREFIX, VECTORS_INDEX_CODEC_NAME, si.getId());
 [ecj-lint] 	                  ^^^^^^^^^^^^^^^^^
 [ecj-lint] Resource leak: 'fieldsIndexReader' is never closed
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 4. ERROR in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/search/IndexSearcher.java (at line 50)
 [ecj-lint] 	import org.apache.lucene.util.automaton.ByteRunAutomaton;
 [ecj-lint] 	       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 [ecj-lint] The import org.apache.lucene.util.automaton.ByteRunAutomaton is never used
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 5. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/util/automaton/Operations.java (at line 742)
 [ecj-lint] 	Integer q = newstate.get(statesSet);
 [ecj-lint] 	                         ^^^^^^^^^
 [ecj-lint] Unlikely argument type SortedIntSet for get(Object) on a Map<SortedIntSet.FrozenIntSet,Integer>
 [ecj-lint] ----------
 [ecj-lint] 5 problems (1 error, 4 warnings)
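The "Resource leak" warnings above all follow the same shape: a `Closeable` is opened, and at least one exception path exits the method before `close()` is reached. The sketch below is illustrative only (it is not the actual Lucene `CodecUtil`/`FieldsIndexReader` code); it shows the leaky pattern ecj-lint flags and the try-with-resources form that silences it.

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

// Hedged sketch of the pattern behind ECJ's "Resource leak" warnings.
// Not the real CodecUtil code -- just the same control-flow shape.
public class ResourceLeakExample {

    // Leaky shape: if the exception fires, 'in' is never closed.
    // This is what ecj-lint reports at CodecUtil.java:523.
    static int leaky(byte[] data) throws IOException {
        InputStream in = new ByteArrayInputStream(data);
        if (data.length == 0) {
            throw new IOException("truncated input");
        }
        int first = in.read();
        in.close();
        return first;
    }

    // Fixed shape: try-with-resources guarantees close() on every
    // exit path, normal or exceptional.
    static int safe(byte[] data) throws IOException {
        try (InputStream in = new ByteArrayInputStream(data)) {
            if (data.length == 0) {
                throw new IOException("truncated input");
            }
            return in.read();
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println(safe(new byte[] {42}));
    }
}
```

Note that warnings 2 and 3 (the `FieldsIndexReader` cases) may be intentional ownership transfers, where a wrapping reader closes the resource later; ecj-lint cannot see that, which is why these surface as warnings rather than errors. Problem 4, the unused `ByteRunAutomaton` import, is the actual build-breaking error and is fixed by deleting the import line.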

BUILD FAILED
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/build.xml:634: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/build.xml:101: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build.xml:201: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/common-build.xml:2127: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/common-build.xml:2166: Compile failed; see the compiler error output for details.

Total time: 35 minutes 35 seconds
Build step 'Invoke Ant' marked build as failure
Archiving artifacts
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
[WARNINGS] Skipping publisher since build result is FAILURE
Recording test results
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2


[JENKINS] Lucene-Solr-8.x-Linux (64bit/jdk1.8.0_201) - Build # 2913 - Still Failing!

Posted by Policeman Jenkins Server <je...@thetaphi.de>.
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-8.x-Linux/2913/
Java: 64bit/jdk1.8.0_201 -XX:+UseCompressedOops -XX:+UseParallelGC

1 tests failed.
FAILED:  org.apache.solr.index.hdfs.CheckHdfsIndexTest.doTest

Error Message:
Error from server at https://127.0.0.1:40761/collection1: java.lang.NullPointerException  at org.apache.solr.handler.admin.SystemInfoHandler.getSecurityInfo(SystemInfoHandler.java:326)  at org.apache.solr.handler.admin.SystemInfoHandler.handleRequestBody(SystemInfoHandler.java:146)  at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:211)  at org.apache.solr.core.SolrCore.execute(SolrCore.java:2600)  at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:803)  at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:582)  at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:432)  at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:362)  at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)  at org.apache.solr.client.solrj.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:166)  at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)  at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:545)  at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)  at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1610)  at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)  at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1300)  at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)  at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485)  at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1580)  at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)  at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1215)  at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)  at 
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)  at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:322)  at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:767)  at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)  at org.eclipse.jetty.server.Server.handle(Server.java:500)  at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383)  at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:547)  at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375)  at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:273)  at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)  at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)  at org.eclipse.jetty.io.ssl.SslConnection$DecryptedEndPoint.onFillable(SslConnection.java:543)  at org.eclipse.jetty.io.ssl.SslConnection.onFillable(SslConnection.java:398)  at org.eclipse.jetty.io.ssl.SslConnection$2.succeeded(SslConnection.java:161)  at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)  at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)  at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806)  at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938)  at java.lang.Thread.run(Thread.java:748) 

Stack Trace:
org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at https://127.0.0.1:40761/collection1: java.lang.NullPointerException
	at org.apache.solr.handler.admin.SystemInfoHandler.getSecurityInfo(SystemInfoHandler.java:326)
	at org.apache.solr.handler.admin.SystemInfoHandler.handleRequestBody(SystemInfoHandler.java:146)
	at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:211)
	at org.apache.solr.core.SolrCore.execute(SolrCore.java:2600)
	at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:803)
	at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:582)
	at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:432)
	at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:362)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)
	at org.apache.solr.client.solrj.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:166)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:545)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1610)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1300)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1580)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1215)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:322)
	at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:767)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:500)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:547)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:273)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
	at org.eclipse.jetty.io.ssl.SslConnection$DecryptedEndPoint.onFillable(SslConnection.java:543)
	at org.eclipse.jetty.io.ssl.SslConnection.onFillable(SslConnection.java:398)
	at org.eclipse.jetty.io.ssl.SslConnection$2.succeeded(SslConnection.java:161)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
	at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938)
	at java.lang.Thread.run(Thread.java:748)

	at __randomizedtesting.SeedInfo.seed([5623B1C5D346BD7C:F1670961BEFDAEC5]:0)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:665)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:265)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:248)
	at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:211)
	at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:1003)
	at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:1018)
	at org.apache.solr.index.hdfs.CheckHdfsIndexTest.doTest(CheckHdfsIndexTest.java:120)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988)
	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1081)
	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:1053)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
	at java.lang.Thread.run(Thread.java:748)
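The trace pinpoints the failure to `SystemInfoHandler.getSecurityInfo(SystemInfoHandler.java:326)`, i.e. some reference there is dereferenced without a null check. The log does not say which reference was null, so the names below are hypothetical; this is only a minimal sketch of the guard pattern that avoids this class of NPE.

```java
// Hedged sketch: guard before dereferencing instead of assuming non-null.
// 'authSchemeName' is a hypothetical stand-in -- the log does not reveal
// which reference at SystemInfoHandler.java:326 was actually null.
public class NullGuardExample {

    static String securityInfo(String authSchemeName) {
        if (authSchemeName == null) {
            // Degrade gracefully when no authentication is configured,
            // rather than throwing a NullPointerException.
            return "(no authentication configured)";
        }
        return authSchemeName.trim();
    }

    public static void main(String[] args) {
        System.out.println(securityInfo(null));
        System.out.println(securityInfo(" BasicAuth "));
    }
}
```

Because the NPE happens server-side while handling an admin request, the client surfaces it only as a `RemoteSolrException`, which is why the interesting frame sits in the "Error Message" trace rather than the client-side trace below it.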




Build Log:
[...truncated 14027 lines...]
   [junit4] Suite: org.apache.solr.index.hdfs.CheckHdfsIndexTest
   [junit4]   2> 115451 INFO  (SUITE-CheckHdfsIndexTest-seed#[5623B1C5D346BD7C]-worker) [     ] o.a.s.SolrTestCase Setting 'solr.default.confdir' system property to test-framework derived value of '/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/server/solr/configsets/_default/conf'
   [junit4]   2> 115451 INFO  (SUITE-CheckHdfsIndexTest-seed#[5623B1C5D346BD7C]-worker) [     ] o.a.s.SolrTestCaseJ4 Created dataDir: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_5623B1C5D346BD7C-001/data-dir-9-001
   [junit4]   2> 115451 WARN  (SUITE-CheckHdfsIndexTest-seed#[5623B1C5D346BD7C]-worker) [     ] o.a.s.SolrTestCaseJ4 startTrackingSearchers: numOpens=135 numCloses=135
   [junit4]   2> 115451 INFO  (SUITE-CheckHdfsIndexTest-seed#[5623B1C5D346BD7C]-worker) [     ] o.a.s.SolrTestCaseJ4 Using PointFields (NUMERIC_POINTS_SYSPROP=true) w/NUMERIC_DOCVALUES_SYSPROP=false
   [junit4]   2> 115452 INFO  (SUITE-CheckHdfsIndexTest-seed#[5623B1C5D346BD7C]-worker) [     ] o.a.s.SolrTestCaseJ4 Randomized ssl (true) and clientAuth (true) via: @org.apache.solr.util.RandomizeSSL(reason=, ssl=NaN, value=NaN, clientAuth=NaN)
   [junit4]   2> 115453 INFO  (SUITE-CheckHdfsIndexTest-seed#[5623B1C5D346BD7C]-worker) [     ] o.a.s.SolrTestCaseJ4 SecureRandom sanity checks: test.solr.allowed.securerandom=null & java.security.egd=file:/dev/./urandom
   [junit4]   2> 115453 INFO  (SUITE-CheckHdfsIndexTest-seed#[5623B1C5D346BD7C]-worker) [     ] o.a.s.BaseDistributedSearchTestCase Setting hostContext system property: /
   [junit4]   1> Formatting using clusterid: testClusterID
   [junit4]   2> 116730 WARN  (SUITE-CheckHdfsIndexTest-seed#[5623B1C5D346BD7C]-worker) [     ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
   [junit4]   2> 116748 INFO  (SUITE-CheckHdfsIndexTest-seed#[5623B1C5D346BD7C]-worker) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 1.8.0_201-b09
   [junit4]   2> 116751 INFO  (SUITE-CheckHdfsIndexTest-seed#[5623B1C5D346BD7C]-worker) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 116751 INFO  (SUITE-CheckHdfsIndexTest-seed#[5623B1C5D346BD7C]-worker) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 116751 INFO  (SUITE-CheckHdfsIndexTest-seed#[5623B1C5D346BD7C]-worker) [     ] o.e.j.s.session node0 Scavenging every 600000ms
   [junit4]   2> 116753 INFO  (SUITE-CheckHdfsIndexTest-seed#[5623B1C5D346BD7C]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@5f28e2cf{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,AVAILABLE}
   [junit4]   2> 116886 INFO  (SUITE-CheckHdfsIndexTest-seed#[5623B1C5D346BD7C]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.w.WebAppContext@28bf0e7a{hdfs,/,file:///home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/jetty-localhost_localdomain-33937-hadoop-hdfs-3_2_0-tests_jar-_-any-7211816884241246482.dir/webapp/,AVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/hdfs}
   [junit4]   2> 116887 INFO  (SUITE-CheckHdfsIndexTest-seed#[5623B1C5D346BD7C]-worker) [     ] o.e.j.s.AbstractConnector Started ServerConnector@8df53a8{HTTP/1.1, (http/1.1)}{localhost.localdomain:33937}
   [junit4]   2> 116887 INFO  (SUITE-CheckHdfsIndexTest-seed#[5623B1C5D346BD7C]-worker) [     ] o.e.j.s.Server Started @116922ms
   [junit4]   2> 117377 WARN  (StorageLocationChecker thread 0) [     ] o.a.h.u.NativeCodeLoader Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
   [junit4]   2> 117419 WARN  (SUITE-CheckHdfsIndexTest-seed#[5623B1C5D346BD7C]-worker) [     ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
   [junit4]   2> 117423 INFO  (SUITE-CheckHdfsIndexTest-seed#[5623B1C5D346BD7C]-worker) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 1.8.0_201-b09
   [junit4]   2> 117423 INFO  (SUITE-CheckHdfsIndexTest-seed#[5623B1C5D346BD7C]-worker) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 117423 INFO  (SUITE-CheckHdfsIndexTest-seed#[5623B1C5D346BD7C]-worker) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 117423 INFO  (SUITE-CheckHdfsIndexTest-seed#[5623B1C5D346BD7C]-worker) [     ] o.e.j.s.session node0 Scavenging every 600000ms
   [junit4]   2> 117423 INFO  (SUITE-CheckHdfsIndexTest-seed#[5623B1C5D346BD7C]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@5ab0eb6f{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,AVAILABLE}
   [junit4]   2> 117506 INFO  (SUITE-CheckHdfsIndexTest-seed#[5623B1C5D346BD7C]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.w.WebAppContext@136072d3{datanode,/,file:///home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/jetty-localhost-44817-hadoop-hdfs-3_2_0-tests_jar-_-any-6941540832583935035.dir/webapp/,AVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/datanode}
   [junit4]   2> 117506 INFO  (SUITE-CheckHdfsIndexTest-seed#[5623B1C5D346BD7C]-worker) [     ] o.e.j.s.AbstractConnector Started ServerConnector@108206a0{HTTP/1.1, (http/1.1)}{localhost:44817}
   [junit4]   2> 117506 INFO  (SUITE-CheckHdfsIndexTest-seed#[5623B1C5D346BD7C]-worker) [     ] o.e.j.s.Server Started @117541ms
   [junit4]   2> 118488 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0xe45510740f0ec80f: Processing first storage report for DS-cca3907d-05f6-48ad-80e5-acc45c228e72 from datanode f0c3d25c-0c16-479d-9170-445784beffdc
   [junit4]   2> 118490 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0xe45510740f0ec80f: from storage DS-cca3907d-05f6-48ad-80e5-acc45c228e72 node DatanodeRegistration(127.0.0.1:44115, datanodeUuid=f0c3d25c-0c16-479d-9170-445784beffdc, infoPort=38623, infoSecurePort=0, ipcPort=45897, storageInfo=lv=-57;cid=testClusterID;nsid=854549913;c=1588195427859), blocks: 0, hasStaleStorage: true, processing time: 2 msecs, invalidatedBlocks: 0
   [junit4]   2> 118490 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0xe45510740f0ec80f: Processing first storage report for DS-933a2890-a3db-4a97-9200-64a3b99fd410 from datanode f0c3d25c-0c16-479d-9170-445784beffdc
   [junit4]   2> 118490 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0xe45510740f0ec80f: from storage DS-933a2890-a3db-4a97-9200-64a3b99fd410 node DatanodeRegistration(127.0.0.1:44115, datanodeUuid=f0c3d25c-0c16-479d-9170-445784beffdc, infoPort=38623, infoSecurePort=0, ipcPort=45897, storageInfo=lv=-57;cid=testClusterID;nsid=854549913;c=1588195427859), blocks: 0, hasStaleStorage: false, processing time: 0 msecs, invalidatedBlocks: 0
   [junit4]   2> 118635 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.ZkTestServer STARTING ZK TEST SERVER
   [junit4]   2> 118636 INFO  (ZkTestServer Run Thread) [     ] o.a.s.c.ZkTestServer client port:0.0.0.0/0.0.0.0:0
   [junit4]   2> 118636 INFO  (ZkTestServer Run Thread) [     ] o.a.s.c.ZkTestServer Starting server
   [junit4]   2> 118736 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.ZkTestServer start zk server on port:32947
   [junit4]   2> 118736 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.ZkTestServer waitForServerUp: 127.0.0.1:32947
   [junit4]   2> 118736 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.ZkTestServer parse host and port list: 127.0.0.1:32947
   [junit4]   2> 118736 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.ZkTestServer connecting to 127.0.0.1 32947
   [junit4]   2> 118737 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 118739 INFO  (zkConnectionManagerCallback-1187-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 118739 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 118740 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 118741 INFO  (zkConnectionManagerCallback-1189-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 118741 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 118742 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/solrconfig-tlog.xml to /configs/conf1/solrconfig.xml
   [junit4]   2> 118746 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/schema.xml to /configs/conf1/schema.xml
   [junit4]   2> 118748 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/solrconfig.snippet.randomindexconfig.xml to /configs/conf1/solrconfig.snippet.randomindexconfig.xml
   [junit4]   2> 118749 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/stopwords.txt to /configs/conf1/stopwords.txt
   [junit4]   2> 118749 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/protwords.txt to /configs/conf1/protwords.txt
   [junit4]   2> 118750 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/currency.xml to /configs/conf1/currency.xml
   [junit4]   2> 118750 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/enumsConfig.xml to /configs/conf1/enumsConfig.xml
   [junit4]   2> 118750 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/open-exchange-rates.json to /configs/conf1/open-exchange-rates.json
   [junit4]   2> 118751 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/mapping-ISOLatin1Accent.txt to /configs/conf1/mapping-ISOLatin1Accent.txt
   [junit4]   2> 118751 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/old_synonyms.txt to /configs/conf1/old_synonyms.txt
   [junit4]   2> 118752 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/synonyms.txt to /configs/conf1/synonyms.txt
   [junit4]   2> 118753 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 118754 INFO  (zkConnectionManagerCallback-1193-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 118754 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 118855 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Will use NRT replicas unless explicitly asked otherwise
   [junit4]   2> 118913 WARN  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 118913 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 118913 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ...
   [junit4]   2> 118913 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 1.8.0_201-b09
   [junit4]   2> 118915 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 118915 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 118915 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.e.j.s.session node0 Scavenging every 600000ms
   [junit4]   2> 118917 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@1e257985{/,null,AVAILABLE}
   [junit4]   2> 118919 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.e.j.s.AbstractConnector Started ServerConnector@75a97819{SSL, (ssl, http/1.1)}{127.0.0.1:40899}
   [junit4]   2> 118919 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.e.j.s.Server Started @118954ms
   [junit4]   2> 118919 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {solr.data.dir=hdfs://localhost.localdomain:34913/hdfs__localhost.localdomain_34913__home_jenkins_workspace_Lucene-Solr-8.x-Linux_solr_build_solr-core_test_J3_temp_solr.index.hdfs.CheckHdfsIndexTest_5623B1C5D346BD7C-001_tempDir-002_control_data, replicaType=NRT, hostContext=/, hostPort=40899, coreRootDirectory=/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_5623B1C5D346BD7C-001/control-001/cores}
   [junit4]   2> 118919 ERROR (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 118919 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 118919 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 8.6.0
   [junit4]   2> 118919 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 118919 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr
   [junit4]   2> 118920 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-04-29T21:23:50.386Z
   [junit4]   2> 118920 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 118921 INFO  (zkConnectionManagerCallback-1195-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 118921 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 119022 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 119022 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.SolrXmlConfig Loading container configuration from /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_5623B1C5D346BD7C-001/control-001/solr.xml
   [junit4]   2> 119024 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 119024 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 119025 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@c156e1c, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 119068 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=false]
   [junit4]   2> 119069 WARN  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.s.i.Http2SolrClient Create Http2SolrClient with HTTP/1.1 transport since Java 8 or lower versions does not support SSL + HTTP/2
   [junit4]   2> 119070 WARN  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@17e43880[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 119070 WARN  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@17e43880[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 119076 WARN  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.s.i.Http2SolrClient Create Http2SolrClient with HTTP/1.1 transport since Java 8 or lower versions does not support SSL + HTTP/2
   [junit4]   2> 119076 WARN  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@3fea69de[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 119076 WARN  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@3fea69de[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 119077 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:32947/solr
   [junit4]   2> 119078 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 119079 INFO  (zkConnectionManagerCallback-1206-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 119079 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 119180 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [n:127.0.0.1:40899_     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 119181 INFO  (zkConnectionManagerCallback-1208-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 119182 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [n:127.0.0.1:40899_     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 119220 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [n:127.0.0.1:40899_     ] o.a.s.c.OverseerElectionContext I am going to be the leader 127.0.0.1:40899_
   [junit4]   2> 119220 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [n:127.0.0.1:40899_     ] o.a.s.c.Overseer Overseer (id=72184095515475973-127.0.0.1:40899_-n_0000000000) starting
   [junit4]   2> 119222 INFO  (OverseerStateUpdate-72184095515475973-127.0.0.1:40899_-n_0000000000) [n:127.0.0.1:40899_     ] o.a.s.c.Overseer Starting to work on the main queue : 127.0.0.1:40899_
   [junit4]   2> 119222 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [n:127.0.0.1:40899_     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:40899_
   [junit4]   2> 119225 INFO  (zkCallback-1207-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 119226 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [n:127.0.0.1:40899_     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 119226 WARN  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [n:127.0.0.1:40899_     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info
   [junit4]   2> 119241 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [n:127.0.0.1:40899_     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 119253 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [n:127.0.0.1:40899_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@c156e1c
   [junit4]   2> 119259 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [n:127.0.0.1:40899_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@c156e1c
   [junit4]   2> 119259 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [n:127.0.0.1:40899_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@c156e1c
   [junit4]   2> 119260 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [n:127.0.0.1:40899_     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_5623B1C5D346BD7C-001/control-001/cores
   [junit4]   2> 119273 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 119274 INFO  (zkConnectionManagerCallback-1225-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 119274 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 119275 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 119275 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:32947/solr ready
   [junit4]   2> 119288 INFO  (qtp495162550-2356) [n:127.0.0.1:40899_     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params collection.configName=conf1&name=control_collection&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:40899_&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 119290 INFO  (OverseerThreadFactory-1215-thread-1-processing-n:127.0.0.1:40899_) [n:127.0.0.1:40899_     ] o.a.s.c.a.c.CreateCollectionCmd Create collection control_collection
   [junit4]   2> 119396 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 119397 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 119399 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_    x:control_collection_shard1_replica_n1 ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&collection=control_collection&shard=shard1&wt=javabin&version=2&replicaType=NRT
   [junit4]   2> 119400 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_    x:control_collection_shard1_replica_n1 ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 120472 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 8.6.0
   [junit4]   2> 120493 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.IndexSchema Schema name=test
   [junit4]   2> 120592 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
   [junit4]   2> 120605 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.CoreContainer Creating SolrCore 'control_collection_shard1_replica_n1' using configuration from configset conf1, trusted=true
   [junit4]   2> 120605 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.control_collection.shard1.replica_n1' (registry 'solr.core.control_collection.shard1.replica_n1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@c156e1c
   [junit4]   2> 120609 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory solr.hdfs.home=hdfs://localhost.localdomain:34913/solr_hdfs_home
   [junit4]   2> 120609 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Solr Kerberos Authentication disabled
   [junit4]   2> 120609 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrCore [[control_collection_shard1_replica_n1] ] Opening new SolrCore at [/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_5623B1C5D346BD7C-001/control-001/cores/control_collection_shard1_replica_n1], dataDir=[hdfs://localhost.localdomain:34913/solr_hdfs_home/control_collection/core_node2/data/]
   [junit4]   2> 120610 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:34913/solr_hdfs_home/control_collection/core_node2/data/snapshot_metadata
   [junit4]   2> 120619 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 120619 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [4194304] will allocate [1] slabs and use ~[4194304] bytes
   [junit4]   2> 120619 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Creating new single instance HDFS BlockCache
   [junit4]   2> 120672 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 120677 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:34913/solr_hdfs_home/control_collection/core_node2/data
   [junit4]   2> 120704 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:34913/solr_hdfs_home/control_collection/core_node2/data/index
   [junit4]   2> 120711 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 120711 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [4194304] will allocate [1] slabs and use ~[4194304] bytes
   [junit4]   2> 120711 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Creating new single instance HDFS BlockCache
   [junit4]   2> 120721 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 120721 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=30, maxMergeAtOnceExplicit=15, maxMergedSegmentMB=41.2548828125, floorSegmentMB=1.546875, forceMergeDeletesPctAllowed=9.38999441744848, segmentsPerTier=49.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.35243406397215404, deletesPctAllowed=43.491407383382366
   [junit4]   2> 121363 WARN  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 121428 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.HdfsUpdateLog
   [junit4]   2> 121428 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 121428 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.HdfsUpdateLog Initializing HdfsUpdateLog: tlogDfsReplication=2
   [junit4]   2> 121446 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 121446 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 121452 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=39, maxMergeAtOnceExplicit=49, maxMergedSegmentMB=47.4765625, floorSegmentMB=1.8095703125, forceMergeDeletesPctAllowed=4.914254523645479, segmentsPerTier=40.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=1.0, deletesPctAllowed=47.54078249477263
   [junit4]   2> 121577 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.SolrIndexSearcher Opening [Searcher@1e83796d[control_collection_shard1_replica_n1] main]
   [junit4]   2> 121581 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 121581 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 121584 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms
   [junit4]   2> 121588 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1665343614410031104
   [junit4]   2> 121590 INFO  (searcherExecutor-1227-thread-1-processing-n:127.0.0.1:40899_ x:control_collection_shard1_replica_n1 c:control_collection s:shard1) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrCore [control_collection_shard1_replica_n1] Registered new searcher Searcher@1e83796d[control_collection_shard1_replica_n1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 121596 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ZkShardTerms Successful update of terms at /collections/control_collection/terms/shard1 to Terms{values={core_node2=0}, version=0}
   [junit4]   2> 121596 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/control_collection/leaders/shard1
   [junit4]   2> 121602 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 121603 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 121603 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync replicas to https://127.0.0.1:40899/control_collection_shard1_replica_n1/
   [junit4]   2> 121603 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 121603 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy https://127.0.0.1:40899/control_collection_shard1_replica_n1/ has no replicas
   [junit4]   2> 121603 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/control_collection/leaders/shard1/leader after winning as /collections/control_collection/leader_elect/shard1/election/72184095515475973-core_node2-n_0000000000
   [junit4]   2> 121604 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I am the new leader: https://127.0.0.1:40899/control_collection_shard1_replica_n1/ shard1
   [junit4]   2> 121605 INFO  (zkCallback-1207-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 121606 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_ c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 121608 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf1&newCollection=true&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&collection=control_collection&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=2208
   [junit4]   2> 121621 INFO  (qtp495162550-2356) [n:127.0.0.1:40899_     ] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas
   [junit4]   2> 121709 INFO  (zkCallback-1207-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 121709 INFO  (zkCallback-1207-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 121714 INFO  (qtp495162550-2356) [n:127.0.0.1:40899_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=control_collection&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:40899_&wt=javabin&version=2} status=0 QTime=2427
   [junit4]   2> 121714 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Waiting to see 1 active replicas in collection: control_collection
   [junit4]   2> 121830 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 121831 INFO  (zkConnectionManagerCallback-1236-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 121831 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 121832 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 121832 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:32947/solr ready
   [junit4]   2> 121832 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.ChaosMonkey monkey: init - expire sessions:false cause connection loss:false
   [junit4]   2> 121836 INFO  (qtp495162550-2356) [n:127.0.0.1:40899_     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params collection.configName=conf1&name=collection1&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=&stateFormat=1&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 121841 INFO  (OverseerThreadFactory-1215-thread-2-processing-n:127.0.0.1:40899_) [n:127.0.0.1:40899_     ] o.a.s.c.a.c.CreateCollectionCmd Create collection collection1
   [junit4]   2> 121842 INFO  (OverseerCollectionConfigSetProcessor-72184095515475973-127.0.0.1:40899_-n_0000000000) [n:127.0.0.1:40899_     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000000 doesn't exist. Requestor may have disconnected from ZooKeeper
   [junit4]   2> 122044 WARN  (OverseerThreadFactory-1215-thread-2-processing-n:127.0.0.1:40899_) [n:127.0.0.1:40899_     ] o.a.s.c.a.c.CreateCollectionCmd It is unusual to create a collection (collection1) without cores.
   [junit4]   2> 122045 INFO  (qtp495162550-2356) [n:127.0.0.1:40899_     ] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas
   [junit4]   2> 122045 INFO  (qtp495162550-2356) [n:127.0.0.1:40899_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=collection1&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=&stateFormat=1&wt=javabin&version=2} status=0 QTime=209
   [junit4]   2> 122046 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 122046 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 122046 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 122046 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 122046 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 122046 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 122046 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Creating jetty instances pullReplicaCount=0 numOtherReplicas=1
   [junit4]   2> 122092 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 1 in directory /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_5623B1C5D346BD7C-001/shard-1-001 of type NRT for shard1
   [junit4]   2> 122093 WARN  (closeThreadPool-1237-thread-1) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 122093 INFO  (closeThreadPool-1237-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 122093 INFO  (closeThreadPool-1237-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ...
   [junit4]   2> 122093 INFO  (closeThreadPool-1237-thread-1) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 1.8.0_201-b09
   [junit4]   2> 122094 INFO  (closeThreadPool-1237-thread-1) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 122094 INFO  (closeThreadPool-1237-thread-1) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 122094 INFO  (closeThreadPool-1237-thread-1) [     ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 122094 INFO  (closeThreadPool-1237-thread-1) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@42b82ee9{/,null,AVAILABLE}
   [junit4]   2> 122095 INFO  (closeThreadPool-1237-thread-1) [     ] o.e.j.s.AbstractConnector Started ServerConnector@57b230eb{SSL, (ssl, http/1.1)}{127.0.0.1:38719}
   [junit4]   2> 122095 INFO  (closeThreadPool-1237-thread-1) [     ] o.e.j.s.Server Started @122130ms
   [junit4]   2> 122095 INFO  (closeThreadPool-1237-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {solr.data.dir=hdfs://localhost.localdomain:34913/hdfs__localhost.localdomain_34913__home_jenkins_workspace_Lucene-Solr-8.x-Linux_solr_build_solr-core_test_J3_temp_solr.index.hdfs.CheckHdfsIndexTest_5623B1C5D346BD7C-001_tempDir-002_jetty1, replicaType=NRT, solrconfig=solrconfig.xml, hostContext=/, hostPort=38719, coreRootDirectory=/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_5623B1C5D346BD7C-001/shard-1-001/cores}
   [junit4]   2> 122095 ERROR (closeThreadPool-1237-thread-1) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 122095 INFO  (closeThreadPool-1237-thread-1) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 122095 INFO  (closeThreadPool-1237-thread-1) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 8.6.0
   [junit4]   2> 122095 INFO  (closeThreadPool-1237-thread-1) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 122095 INFO  (closeThreadPool-1237-thread-1) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr
   [junit4]   2> 122095 INFO  (closeThreadPool-1237-thread-1) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-04-29T21:23:53.561Z
   [junit4]   2> 122096 INFO  (closeThreadPool-1237-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 122097 INFO  (zkConnectionManagerCallback-1239-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 122097 INFO  (closeThreadPool-1237-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 122198 INFO  (closeThreadPool-1237-thread-1) [     ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 122198 INFO  (closeThreadPool-1237-thread-1) [     ] o.a.s.c.SolrXmlConfig Loading container configuration from /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_5623B1C5D346BD7C-001/shard-1-001/solr.xml
   [junit4]   2> 122200 INFO  (closeThreadPool-1237-thread-1) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 122200 INFO  (closeThreadPool-1237-thread-1) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 122201 INFO  (closeThreadPool-1237-thread-1) [     ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@c156e1c, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 122269 INFO  (closeThreadPool-1237-thread-1) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=false]
   [junit4]   2> 122270 WARN  (closeThreadPool-1237-thread-1) [     ] o.a.s.c.s.i.Http2SolrClient Creating Http2SolrClient with HTTP/1.1 transport since Java 8 or lower versions do not support SSL + HTTP/2
   [junit4]   2> 122270 WARN  (closeThreadPool-1237-thread-1) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@31bbc6b8[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 122270 WARN  (closeThreadPool-1237-thread-1) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@31bbc6b8[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 122272 WARN  (closeThreadPool-1237-thread-1) [     ] o.a.s.c.s.i.Http2SolrClient Creating Http2SolrClient with HTTP/1.1 transport since Java 8 or lower versions do not support SSL + HTTP/2
   [junit4]   2> 122272 WARN  (closeThreadPool-1237-thread-1) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@4884813d[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 122272 WARN  (closeThreadPool-1237-thread-1) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@4884813d[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 122272 INFO  (closeThreadPool-1237-thread-1) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:32947/solr
   [junit4]   2> 122273 INFO  (closeThreadPool-1237-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 122273 INFO  (zkConnectionManagerCallback-1250-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 122274 INFO  (closeThreadPool-1237-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 122375 INFO  (closeThreadPool-1237-thread-1) [n:127.0.0.1:38719_     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 122375 INFO  (zkConnectionManagerCallback-1252-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 122375 INFO  (closeThreadPool-1237-thread-1) [n:127.0.0.1:38719_     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 122377 INFO  (closeThreadPool-1237-thread-1) [n:127.0.0.1:38719_     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 122379 INFO  (closeThreadPool-1237-thread-1) [n:127.0.0.1:38719_     ] o.a.s.c.ZkController Publish node=127.0.0.1:38719_ as DOWN
   [junit4]   2> 122379 INFO  (closeThreadPool-1237-thread-1) [n:127.0.0.1:38719_     ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 122379 INFO  (closeThreadPool-1237-thread-1) [n:127.0.0.1:38719_     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:38719_
   [junit4]   2> 122380 INFO  (zkCallback-1207-thread-2) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 122380 INFO  (zkCallback-1235-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 122380 INFO  (zkCallback-1251-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 122380 INFO  (closeThreadPool-1237-thread-1) [n:127.0.0.1:38719_     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 122381 WARN  (closeThreadPool-1237-thread-1) [n:127.0.0.1:38719_     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info
   [junit4]   2> 122393 INFO  (closeThreadPool-1237-thread-1) [n:127.0.0.1:38719_     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 122407 INFO  (closeThreadPool-1237-thread-1) [n:127.0.0.1:38719_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@c156e1c
   [junit4]   2> 122412 INFO  (closeThreadPool-1237-thread-1) [n:127.0.0.1:38719_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@c156e1c
   [junit4]   2> 122412 INFO  (closeThreadPool-1237-thread-1) [n:127.0.0.1:38719_     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@c156e1c
   [junit4]   2> 122413 INFO  (closeThreadPool-1237-thread-1) [n:127.0.0.1:38719_     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_5623B1C5D346BD7C-001/shard-1-001/cores
   [junit4]   2> 122418 INFO  (closeThreadPool-1237-thread-1) [     ] o.a.s.c.AbstractFullDistribZkTestBase waitForLiveNode: 127.0.0.1:38719_
   [junit4]   2> 122429 INFO  (qtp540337252-2417) [n:127.0.0.1:38719_     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :addreplica with params node=127.0.0.1:38719_&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 122430 INFO  (OverseerCollectionConfigSetProcessor-72184095515475973-127.0.0.1:40899_-n_0000000000) [n:127.0.0.1:40899_     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000002 doesn't exist. Requestor may have disconnected from ZooKeeper
   [junit4]   2> 122434 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 122437 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={wt=javabin&version=2&key=solr.core.control_collection.shard1.replica_n1:INDEX.sizeInBytes} status=0 QTime=2
   [junit4]   2> 122439 INFO  (qtp495162550-2356) [n:127.0.0.1:40899_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 122441 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 122443 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={wt=javabin&version=2&key=solr.core.control_collection.shard1.replica_n1:INDEX.sizeInBytes} status=0 QTime=2
   [junit4]   2> 122445 INFO  (qtp495162550-2356) [n:127.0.0.1:40899_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 122446 INFO  (OverseerThreadFactory-1215-thread-3-processing-n:127.0.0.1:40899_) [n:127.0.0.1:40899_ c:collection1 s:shard1   ] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:38719_ for creating new replica of shard shard1 for collection collection1
   [junit4]   2> 122446 INFO  (OverseerThreadFactory-1215-thread-3-processing-n:127.0.0.1:40899_) [n:127.0.0.1:40899_ c:collection1 s:shard1   ] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command.
   [junit4]   2> 122460 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_    x:collection1_shard1_replica_n1 ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&name=collection1_shard1_replica_n1&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT
   [junit4]   2> 123466 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 8.6.0
   [junit4]   2> 123481 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.IndexSchema Schema name=test
   [junit4]   2> 123548 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
   [junit4]   2> 123556 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.CoreContainer Creating SolrCore 'collection1_shard1_replica_n1' using configuration from configset conf1, trusted=true
   [junit4]   2> 123556 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.collection1.shard1.replica_n1' (registry 'solr.core.collection1.shard1.replica_n1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@c156e1c
   [junit4]   2> 123556 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory solr.hdfs.home=hdfs://localhost.localdomain:34913/solr_hdfs_home
   [junit4]   2> 123556 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Solr Kerberos Authentication disabled
   [junit4]   2> 123556 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrCore [[collection1_shard1_replica_n1] ] Opening new SolrCore at [/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/../../../../../../../../../home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_5623B1C5D346BD7C-001/shard-1-001/cores/collection1_shard1_replica_n1], dataDir=[hdfs://localhost.localdomain:34913/solr_hdfs_home/collection1/core_node2/data/]
   [junit4]   2> 123557 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:34913/solr_hdfs_home/collection1/core_node2/data/snapshot_metadata
   [junit4]   2> 123561 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 123561 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [4194304] will allocate [1] slabs and use ~[4194304] bytes
   [junit4]   2> 123561 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Creating new single instance HDFS BlockCache
   [junit4]   2> 123567 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 123568 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:34913/solr_hdfs_home/collection1/core_node2/data
   [junit4]   2> 123583 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:34913/solr_hdfs_home/collection1/core_node2/data/index
   [junit4]   2> 123589 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 123589 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [4194304] will allocate [1] slabs and use ~[4194304] bytes
   [junit4]   2> 123589 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Creating new single instance HDFS BlockCache
   [junit4]   2> 123593 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 123594 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=30, maxMergeAtOnceExplicit=15, maxMergedSegmentMB=41.2548828125, floorSegmentMB=1.546875, forceMergeDeletesPctAllowed=9.38999441744848, segmentsPerTier=49.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=0.35243406397215404, deletesPctAllowed=43.491407383382366
   [junit4]   2> 123609 WARN  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 123633 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.HdfsUpdateLog
   [junit4]   2> 123633 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 123633 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.HdfsUpdateLog Initializing HdfsUpdateLog: tlogDfsReplication=2
   [junit4]   2> 123641 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 123641 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 123643 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=39, maxMergeAtOnceExplicit=49, maxMergedSegmentMB=47.4765625, floorSegmentMB=1.8095703125, forceMergeDeletesPctAllowed=4.914254523645479, segmentsPerTier=40.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=1.0, deletesPctAllowed=47.54078249477263
   [junit4]   2> 123650 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.SolrIndexSearcher Opening [Searcher@33a44303[collection1_shard1_replica_n1] main]
   [junit4]   2> 123650 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 123650 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 123651 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms
   [junit4]   2> 123651 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1665343616573243392
   [junit4]   2> 123653 INFO  (searcherExecutor-1263-thread-1-processing-n:127.0.0.1:38719_ x:collection1_shard1_replica_n1 c:collection1 s:shard1) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrCore [collection1_shard1_replica_n1] Registered new searcher Searcher@33a44303[collection1_shard1_replica_n1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 123655 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection1/terms/shard1 to Terms{values={core_node2=0}, version=0}
   [junit4]   2> 123655 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/collection1/leaders/shard1
   [junit4]   2> 123656 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 123656 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 123656 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync replicas to https://127.0.0.1:38719/collection1_shard1_replica_n1/
   [junit4]   2> 123656 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 123656 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SyncStrategy https://127.0.0.1:38719/collection1_shard1_replica_n1/ has no replicas
   [junit4]   2> 123657 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/collection1/leaders/shard1/leader after winning as /collections/collection1/leader_elect/shard1/election/72184095515475978-core_node2-n_0000000000
   [junit4]   2> 123657 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I am the new leader: https://127.0.0.1:38719/collection1_shard1_replica_n1/ shard1
   [junit4]   2> 123758 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_ c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 123760 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf1&name=collection1_shard1_replica_n1&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1300
   [junit4]   2> 123761 INFO  (qtp540337252-2417) [n:127.0.0.1:38719_ c:collection1    ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:38719_&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2} status=0 QTime=1332
   [junit4]   2> 123762 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Waiting to see 1 active replicas in collection: collection1
   [junit4]   2> 123861 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.SolrTestCaseJ4 ###Starting testDeletedDocs
   [junit4]   2> 124431 INFO  (OverseerCollectionConfigSetProcessor-72184095515475973-127.0.0.1:40899_-n_0000000000) [n:127.0.0.1:40899_     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000004 doesn't exist. Requestor may have disconnected from ZooKeeper
   [junit4]   2> 179247 INFO  (qtp540337252-2420) [n:127.0.0.1:38719_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={wt=javabin&version=2&key=solr.core.collection1.shard1.replica_n1:QUERY./select.requests&key=solr.core.collection1.shard1.replica_n1:INDEX.sizeInBytes&key=solr.core.collection1.shard1.replica_n1:UPDATE./update.requests} status=0 QTime=2
   [junit4]   2> 179252 INFO  (qtp540337252-2417) [n:127.0.0.1:38719_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={wt=javabin&version=2&key=solr.jvm:os.processCpuLoad&key=solr.node:CONTAINER.fs.coreRoot.usableSpace&key=solr.jvm:os.systemLoadAverage&key=solr.jvm:memory.heap.used} status=0 QTime=1
   [junit4]   2> 179257 INFO  (qtp495162550-2359) [n:127.0.0.1:40899_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={wt=javabin&version=2&key=solr.core.control_collection.shard1.replica_n1:QUERY./select.requests&key=solr.core.control_collection.shard1.replica_n1:INDEX.sizeInBytes&key=solr.core.control_collection.shard1.replica_n1:UPDATE./update.requests} status=0 QTime=1
   [junit4]   2> 179259 INFO  (qtp495162550-2356) [n:127.0.0.1:40899_     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={wt=javabin&version=2&key=solr.jvm:os.processCpuLoad&key=solr.node:CONTAINER.fs.coreRoot.usableSpace&key=solr.jvm:os.systemLoadAverage&key=solr.jvm:memory.heap.used} status=0 QTime=0
   [junit4]   2> 200574 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.s.h.HdfsDirectory Closing hdfs directory hdfs://localhost.localdomain:34913/solr
   [junit4]   2> 200575 INFO  (TEST-CheckHdfsIndexTest.testDeletedDocs-seed#[5623B1C5D346BD7C]) [     ] o.a.s.SolrTestCaseJ4 ###Ending testDeletedDocs
   [junit4]   2> 200681 INFO  (closeThreadPool-1270-thread-2) [     ] o.a.s.c.CoreContainer Shutting down CoreContainer instance=1371786826
   [junit4]   2> 200681 INFO  (closeThreadPool-1270-thread-2) [     ] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:40899_
   [junit4]   2> 200681 INFO  (closeThreadPool-1270-thread-2) [     ] o.a.s.c.ZkController Publish this node as DOWN...
   [junit4]   2> 200681 INFO  (closeThreadPool-1270-thread-2) [     ] o.a.s.c.ZkController Publish node=127.0.0.1:40899_ as DOWN
   [junit4]   2> 200682 INFO  (closeThreadPool-1270-thread-1) [     ] o.a.s.c.CoreContainer Shutting down CoreContainer instance=1132312072
   [junit4]   2> 200682 INFO  (closeThreadPool-1270-thread-1) [     ] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:38719_
   [junit4]   2> 200683 INFO  (closeThreadPool-1270-thread-1) [     ] o.a.s.c.ZkController Publish this node as DOWN...
   [junit4]   2> 200683 INFO  (closeThreadPool-1270-thread-1) [     ] o.a.s.c.ZkController Publish node=127.0.0.1:38719_ as DOWN
   [junit4]   2> 200683 INFO  (coreCloseExecutor-1276-thread-1) [n:127.0.0.1:40899_     ] o.a.s.c.SolrCore [control_collection_shard1_replica_n1]  CLOSING SolrCore org.apache.solr.core.SolrCore@3dff685f
   [junit4]   2> 200683 INFO  (coreCloseExecutor-1276-thread-1) [n:127.0.0.1:40899_     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.control_collection.shard1.replica_n1 tag=SolrCore@3dff685f
   [junit4]   2> 200684 INFO  (coreCloseExecutor-1276-thread-1) [n:127.0.0.1:40899_     ] o.a.s.m.r.SolrJmxReporter Closing reporter [org.apache.solr.metrics.reporters.SolrJmxReporter@1f9888e0: rootName = null, domain = solr.core.control_collection.shard1.replica_n1, service url = null, agent id = null] for registry solr.core.control_collection.shard1.replica_n1/com.codahale.metrics.MetricRegistry@3be2744c
   [junit4]   2> 200687 INFO  (zkCallback-1207-thread-4) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 200687 INFO  (zkCallback-1207-thread-5) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 200695 INFO  (coreCloseExecutor-1278-thread-1) [n:127.0.0.1:38719_     ] o.a.s.c.SolrCore [collection1_shard1_replica_n1]  CLOSING SolrCore org.apache.solr.core.SolrCore@7e168ece
   [junit4]   2> 200695 INFO  (coreCloseExecutor-1278-thread-1) [n:127.0.0.1:38719_     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.collection1.shard1.replica_n1 tag=SolrCore@7e168ece
   [junit4]   2> 200695 INFO  (coreCloseExecutor-1278-thread-1) [n:127.0.0.1:38719_     ] o.a.s.m.r.SolrJmxReporter Closing reporter [org.apache.solr.metrics.reporters.SolrJmxReporter@27b330a2: rootName = null, domain = solr.core.collection1.shard1.replica_n1, service url = null, agent id = null] for registry solr.core.collection1.shard1.replica_n1/com.codahale.metrics.MetricRegistry@7ab8689f
   [junit4]   2> 200703 INFO  (coreCloseExecutor-1276-thread-1) [n:127.0.0.1:40899_     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.control_collection.shard1.leader tag=SolrCore@3dff685f
   [junit4]   2> 200704 INFO  (coreCloseExecutor-1276-thread-1) [n:127.0.0.1:40899_     ] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close()  ... SKIPPED (unnecessary).
   [junit4]   2> 200709 INFO  (coreCloseExecutor-1278-thread-1) [n:127.0.0.1:38719_     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.collection1.shard1.leader tag=SolrCore@7e168ece
   [junit4]   2> 200711 INFO  (coreCloseExecutor-1278-thread-1) [n:127.0.0.1:38719_     ] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close()  ... SKIPPED (unnecessary).
   [junit4]   2> 200716 INFO  (coreCloseExecutor-1278-thread-1) [n:127.0.0.1:38719_     ] o.a.s.s.h.HdfsDirectory Closing hdfs directory hdfs://localhost.localdomain:34913/solr_hdfs_home/collection1/core_node2/data/snapshot_metadata
   [junit4]   2> 200716 INFO  (coreCloseExecutor-1278-thread-1) [n:127.0.0.1:38719_     ] o.a.s.s.h.HdfsDirectory Closing hdfs directory hdfs://localhost.localdomain:34913/solr_hdfs_home/collection1/core_node2/data
   [junit4]   2> 200727 INFO  (coreCloseExecutor-1278-thread-1) [n:127.0.0.1:38719_     ] o.a.s.s.h.HdfsDirectory Closing hdfs directory hdfs://localhost.localdomain:34913/solr_hdfs_home/collection1/core_node2/data/index
   [junit4]   2> 200727 INFO  (coreCloseExecutor-1276-thread-1) [n:127.0.0.1:40899_     ] o.a.s.s.h.HdfsDirectory Closing hdfs directory hdfs://localhost.localdomain:34913/solr_hdfs_home/control_collection/core_node2/data/index
   [junit4]   2> 200729 INFO  (coreCloseExecutor-1276-thread-1) [n:127.0.0.1:40899_     ] o.a.s.s.h.HdfsDirectory Closing hdfs directory hdfs://localhost.localdomain:34913/solr_hdfs_home/control_collection/core_node2/data/snapshot_metadata
   [junit4]   2> 200729 INFO  (coreCloseExecutor-1276-thread-1) [n:127.0.0.1:40899_     ] o.a.s.s.h.HdfsDirectory Closing hdfs directory hdfs://localhost.localdomain:34913/solr_hdfs_home/control_collection/core_node2/data
   [junit4]   2> 200738 INFO  (closeThreadPool-1270-thread-1) [     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node tag=null
   [junit4]   2> 200738 INFO  (closeThreadPool-1270-thread-1) [     ] o.a.s.m.r.SolrJmxReporter Closing reporter [org.apache.solr.metrics.reporters.SolrJmxReporter@38afb859: rootName = null, domain = solr.node, service url = null, agent id = null] for registry solr.node/com.codahale.metrics.MetricRegistry@29844ac2
   [junit4]   2> 200738 INFO  (closeThreadPool-1270-thread-2) [     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node tag=null
   [junit4]   2> 200738 INFO  (closeThreadPool-1270-thread-2) [     ] o.a.s.m.r.SolrJmxReporter Closing reporter [org.apache.solr.metrics.reporters.SolrJmxReporter@50145f98: rootName = null, domain = solr.node, service url = null, agent id = null] for registry solr.node/com.codahale.metrics.

[...truncated too long message...]

.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45) ~[java/:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41) ~[java/:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) ~[java/:?]
   [junit4]   2> 	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) ~[java/:?]
   [junit4]   2> 	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) ~[java/:?]
   [junit4]   2> 	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54) ~[java/:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at java.lang.Thread.run(Thread.java:748) [?:1.8.0_201]
   [junit4]   2> 242535 WARN  (SUITE-CheckHdfsIndexTest-seed#[5623B1C5D346BD7C]-worker) [     ] o.a.h.h.s.d.DirectoryScanner DirectoryScanner: shutdown has been called
   [junit4]   2> 242566 INFO  (SUITE-CheckHdfsIndexTest-seed#[5623B1C5D346BD7C]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.w.WebAppContext@136072d3{datanode,/,null,UNAVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/datanode}
   [junit4]   2> 242567 INFO  (SUITE-CheckHdfsIndexTest-seed#[5623B1C5D346BD7C]-worker) [     ] o.e.j.s.AbstractConnector Stopped ServerConnector@108206a0{HTTP/1.1, (http/1.1)}{localhost:0}
   [junit4]   2> 242567 INFO  (SUITE-CheckHdfsIndexTest-seed#[5623B1C5D346BD7C]-worker) [     ] o.e.j.s.session node0 Stopped scavenging
   [junit4]   2> 242567 INFO  (SUITE-CheckHdfsIndexTest-seed#[5623B1C5D346BD7C]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@5ab0eb6f{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,UNAVAILABLE}
   [junit4]   2> 242573 WARN  (BP-649695531-127.0.0.1-1588195427859 heartbeating to localhost.localdomain/127.0.0.1:34913) [     ] o.a.h.h.s.d.IncrementalBlockReportManager IncrementalBlockReportManager interrupted
   [junit4]   2> 242574 WARN  (BP-649695531-127.0.0.1-1588195427859 heartbeating to localhost.localdomain/127.0.0.1:34913) [     ] o.a.h.h.s.d.DataNode Ending block pool service for: Block pool BP-649695531-127.0.0.1-1588195427859 (Datanode Uuid f0c3d25c-0c16-479d-9170-445784beffdc) service to localhost.localdomain/127.0.0.1:34913
   [junit4]   2> 242594 INFO  (SUITE-CheckHdfsIndexTest-seed#[5623B1C5D346BD7C]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.w.WebAppContext@28bf0e7a{hdfs,/,null,UNAVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/hdfs}
   [junit4]   2> 242596 INFO  (SUITE-CheckHdfsIndexTest-seed#[5623B1C5D346BD7C]-worker) [     ] o.e.j.s.AbstractConnector Stopped ServerConnector@8df53a8{HTTP/1.1, (http/1.1)}{localhost.localdomain:0}
   [junit4]   2> 242596 INFO  (SUITE-CheckHdfsIndexTest-seed#[5623B1C5D346BD7C]-worker) [     ] o.e.j.s.session node0 Stopped scavenging
   [junit4]   2> 242596 INFO  (SUITE-CheckHdfsIndexTest-seed#[5623B1C5D346BD7C]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@5f28e2cf{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,UNAVAILABLE}
   [junit4]   2> NOTE: leaving temporary files on disk at: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_5623B1C5D346BD7C-001
   [junit4]   2> Apr 29, 2020 9:25:54 PM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks
   [junit4]   2> WARNING: Will linger awaiting termination of 65 leaked thread(s).
   [junit4]   2> NOTE: test params are: codec=Asserting(Lucene84), sim=Asserting(org.apache.lucene.search.similarities.AssertingSimilarity@2efb2e14), locale=de-LU, timezone=Europe/Warsaw
   [junit4]   2> NOTE: Linux 5.3.0-46-generic amd64/Oracle Corporation 1.8.0_201 (64-bit)/cpus=16,threads=9,free=242585168,total=506462208
   [junit4]   2> NOTE: All tests run in this JVM: [TestNamedUpdateProcessors, ClusterStateTest, CdcrBootstrapTest, TestSolrCachePerf, TestInitQParser, PreAnalyzedFieldManagedSchemaCloudTest, TestMissingGroups, ConfigSetsAPITest, DocValuesMissingTest, CheckHdfsIndexTest]
   [junit4] Completed [204/907 (1!)] on J3 in 130.15s, 5 tests, 1 error, 1 skipped <<< FAILURES!

[...truncated 40533 lines...]
-ecj-javadoc-lint-src:
    [mkdir] Created dir: /tmp/ecj1123493866
 [ecj-lint] Compiling 931 source files to /tmp/ecj1123493866
 [ecj-lint] ----------
 [ecj-lint] 1. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/codecs/CodecUtil.java (at line 523)
 [ecj-lint] 	throw new CorruptIndexException("misplaced codec footer (file truncated?): length=" + in.length() + " but footerLength==" + footerLength(), input);
 [ecj-lint] 	^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 [ecj-lint] Resource leak: 'in' is not closed at this location
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 2. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingStoredFieldsReader.java (at line 166)
 [ecj-lint] 	FieldsIndexReader fieldsIndexReader = new FieldsIndexReader(d, si.name, segmentSuffix, INDEX_EXTENSION_PREFIX, INDEX_CODEC_NAME, si.getId());
 [ecj-lint] 	                  ^^^^^^^^^^^^^^^^^
 [ecj-lint] Resource leak: 'fieldsIndexReader' is never closed
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 3. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingTermVectorsReader.java (at line 148)
 [ecj-lint] 	FieldsIndexReader fieldsIndexReader = new FieldsIndexReader(d, si.name, segmentSuffix, VECTORS_INDEX_EXTENSION_PREFIX, VECTORS_INDEX_CODEC_NAME, si.getId());
 [ecj-lint] 	                  ^^^^^^^^^^^^^^^^^
 [ecj-lint] Resource leak: 'fieldsIndexReader' is never closed
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 4. ERROR in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/search/IndexSearcher.java (at line 50)
 [ecj-lint] 	import org.apache.lucene.util.automaton.ByteRunAutomaton;
 [ecj-lint] 	       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 [ecj-lint] The import org.apache.lucene.util.automaton.ByteRunAutomaton is never used
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 5. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/util/automaton/Operations.java (at line 742)
 [ecj-lint] 	Integer q = newstate.get(statesSet);
 [ecj-lint] 	                         ^^^^^^^^^
 [ecj-lint] Unlikely argument type SortedIntSet for get(Object) on a Map<SortedIntSet.FrozenIntSet,Integer>
 [ecj-lint] ----------
 [ecj-lint] 5 problems (1 error, 4 warnings)
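The "Resource leak: '…' is never closed" warnings above are the kind ecj raises when a Closeable is constructed on a path that can throw before it is closed. A minimal sketch of the usual remedy, try-with-resources, is below; the class names are hypothetical stand-ins for illustration, not the actual Lucene types or the committed fix.

```java
// Hypothetical stand-in for a Closeable resource such as an IndexInput.
class Resource implements AutoCloseable {
    boolean closed = false;
    @Override public void close() { closed = true; }
}

public class LeakFixSketch {
    // Demonstrates that try-with-resources closes the resource even when
    // the body throws (the situation ecj flags in the warnings above).
    static boolean closedOnFailure() {
        Resource r = new Resource();
        try (Resource in = r) {
            throw new RuntimeException("simulated corrupt-index path");
        } catch (RuntimeException expected) {
            // 'in' was closed before the exception propagated here.
        }
        return r.closed;
    }

    public static void main(String[] args) {
        System.out.println("closed on failure: " + closedOnFailure());
        // prints "closed on failure: true"
    }
}
```

The unused-import ERROR in IndexSearcher.java (problem 4) needs no pattern like this; deleting the stale import line is sufficient to clear it.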

BUILD FAILED
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/build.xml:634: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/build.xml:101: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build.xml:201: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/common-build.xml:2127: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/common-build.xml:2166: Compile failed; see the compiler error output for details.

Total time: 38 minutes 33 seconds
Build step 'Invoke Ant' marked build as failure
Archiving artifacts
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
[WARNINGS] Skipping publisher since build result is FAILURE
Recording test results
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2

[JENKINS] Lucene-Solr-8.x-Linux (64bit/jdk-12.0.2) - Build # 2912 - Still Failing!

Posted by Policeman Jenkins Server <je...@thetaphi.de>.
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-8.x-Linux/2912/
Java: 64bit/jdk-12.0.2 -XX:-UseCompressedOops -XX:+UseConcMarkSweepGC

All tests passed

Build Log:
[...truncated 2305 lines...]
   [junit4] JVM J4: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/core/test/temp/junit4-J4-20200429_191949_07315487047711309842169.syserr
   [junit4] >>> JVM J4 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J4: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J0: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/core/test/temp/junit4-J0-20200429_191949_0727249425508733831040.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 13 lines...]
   [junit4] JVM J5: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/core/test/temp/junit4-J5-20200429_191949_0736034780972997055122.syserr
   [junit4] >>> JVM J5 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J5: EOF ----

   [junit4] JVM J2: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/core/test/temp/junit4-J2-20200429_191949_07287565167273501156.syserr
   [junit4] >>> JVM J2 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J2: EOF ----

   [junit4] JVM J1: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/core/test/temp/junit4-J1-20200429_191949_0723385991900358283640.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J3: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/core/test/temp/junit4-J3-20200429_191949_0722261498630574269316.syserr
   [junit4] >>> JVM J3 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J3: EOF ----

[...truncated 315 lines...]
   [junit4] JVM J4: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/test-framework/test/temp/junit4-J4-20200429_192220_1948610166636978260303.syserr
   [junit4] >>> JVM J4 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J4: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J0: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/test-framework/test/temp/junit4-J0-20200429_192220_18810285113889772217253.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J2: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/test-framework/test/temp/junit4-J2-20200429_192220_19211689521904843660596.syserr
   [junit4] >>> JVM J2 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J2: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J5: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/test-framework/test/temp/junit4-J5-20200429_192220_1967758621397075479808.syserr
   [junit4] >>> JVM J5 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J5: EOF ----

[...truncated 11 lines...]
   [junit4] JVM J1: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/test-framework/test/temp/junit4-J1-20200429_192220_1921501908469236598719.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 9 lines...]
   [junit4] JVM J3: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/test-framework/test/temp/junit4-J3-20200429_192220_19415472337593523845414.syserr
   [junit4] >>> JVM J3 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J3: EOF ----

[...truncated 1113 lines...]
   [junit4] JVM J5: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/common/test/temp/junit4-J5-20200429_192305_9778132301562695708401.syserr
   [junit4] >>> JVM J5 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J5: EOF ----

[...truncated 6 lines...]
   [junit4] JVM J4: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/common/test/temp/junit4-J4-20200429_192305_9711905615879416987009.syserr
   [junit4] >>> JVM J4 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J4: EOF ----

   [junit4] JVM J2: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/common/test/temp/junit4-J2-20200429_192305_97716710412763869974036.syserr
   [junit4] >>> JVM J2 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J2: EOF ----

   [junit4] JVM J1: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/common/test/temp/junit4-J1-20200429_192305_96513579763189883819697.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 6 lines...]
   [junit4] JVM J0: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/common/test/temp/junit4-J0-20200429_192305_9652146076491351893943.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

   [junit4] JVM J3: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/common/test/temp/junit4-J3-20200429_192305_97016994014922187004653.syserr
   [junit4] >>> JVM J3 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J3: EOF ----

[...truncated 250 lines...]
   [junit4] JVM J4: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/icu/test/temp/junit4-J4-20200429_192339_36417713609164994371313.syserr
   [junit4] >>> JVM J4 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J4: EOF ----

[...truncated 6 lines...]
   [junit4] JVM J0: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/icu/test/temp/junit4-J0-20200429_192339_36415822783072076566798.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

   [junit4] JVM J5: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/icu/test/temp/junit4-J5-20200429_192339_36417210855781407070693.syserr
   [junit4] >>> JVM J5 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J5: EOF ----

[...truncated 11 lines...]
   [junit4] JVM J3: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/icu/test/temp/junit4-J3-20200429_192339_36417009947350276537363.syserr
   [junit4] >>> JVM J3 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J3: EOF ----

   [junit4] JVM J1: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/icu/test/temp/junit4-J1-20200429_192339_3641382433551362191091.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J2: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/icu/test/temp/junit4-J2-20200429_192339_3642656907919669513090.syserr
   [junit4] >>> JVM J2 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J2: EOF ----

[...truncated 235 lines...]
   [junit4] JVM J5: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/kuromoji/test/temp/junit4-J5-20200429_192347_3764837426698830502705.syserr
   [junit4] >>> JVM J5 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J5: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J0: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/kuromoji/test/temp/junit4-J0-20200429_192347_3719220812175902283986.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 9 lines...]
   [junit4] JVM J2: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/kuromoji/test/temp/junit4-J2-20200429_192347_3731985230595908511458.syserr
   [junit4] >>> JVM J2 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J2: EOF ----

   [junit4] JVM J4: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/kuromoji/test/temp/junit4-J4-20200429_192347_37417777524564899712144.syserr
   [junit4] >>> JVM J4 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J4: EOF ----

   [junit4] JVM J3: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/kuromoji/test/temp/junit4-J3-20200429_192347_372942805051749844750.syserr
   [junit4] >>> JVM J3 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J3: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J1: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/kuromoji/test/temp/junit4-J1-20200429_192347_3713678158933918056752.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 168 lines...]
   [junit4] JVM J1: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/morfologik/test/temp/junit4-J1-20200429_192354_3854353279698004098392.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J0: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/morfologik/test/temp/junit4-J0-20200429_192354_3855435339912383456734.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J2: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/morfologik/test/temp/junit4-J2-20200429_192354_38513317092092070668049.syserr
   [junit4] >>> JVM J2 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J2: EOF ----

[...truncated 182 lines...]
   [junit4] JVM J5: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/nori/test/temp/junit4-J5-20200429_192357_7174282560799665045426.syserr
   [junit4] >>> JVM J5 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J5: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J2: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/nori/test/temp/junit4-J2-20200429_192357_7162356014797835532689.syserr
   [junit4] >>> JVM J2 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J2: EOF ----

[...truncated 13 lines...]
   [junit4] JVM J1: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/nori/test/temp/junit4-J1-20200429_192357_7163743257232569882239.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

   [junit4] JVM J3: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/nori/test/temp/junit4-J3-20200429_192357_71612551692239410064319.syserr
   [junit4] >>> JVM J3 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J3: EOF ----

   [junit4] JVM J4: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/nori/test/temp/junit4-J4-20200429_192357_7161297737608684494157.syserr
   [junit4] >>> JVM J4 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J4: EOF ----

[...truncated 5 lines...]
   [junit4] JVM J0: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/nori/test/temp/junit4-J0-20200429_192357_7163233379175144413244.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 170 lines...]
   [junit4] JVM J3: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/opennlp/test/temp/junit4-J3-20200429_192402_68617824698892854177666.syserr
   [junit4] >>> JVM J3 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J3: EOF ----

[...truncated 6 lines...]
   [junit4] JVM J4: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/opennlp/test/temp/junit4-J4-20200429_192402_68711505846083732136634.syserr
   [junit4] >>> JVM J4 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J4: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J0: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/opennlp/test/temp/junit4-J0-20200429_192402_68713204350397030874842.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J2: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/opennlp/test/temp/junit4-J2-20200429_192402_68614876931606772008776.syserr
   [junit4] >>> JVM J2 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J2: EOF ----

   [junit4] JVM J1: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/opennlp/test/temp/junit4-J1-20200429_192402_6867365999066724293450.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 174 lines...]
   [junit4] JVM J2: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/phonetic/test/temp/junit4-J2-20200429_192405_66017964292281182773932.syserr
   [junit4] >>> JVM J2 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J2: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J4: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/phonetic/test/temp/junit4-J4-20200429_192405_6614243163584757208333.syserr
   [junit4] >>> JVM J4 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J4: EOF ----

[...truncated 6 lines...]
   [junit4] JVM J3: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/phonetic/test/temp/junit4-J3-20200429_192405_66116647800898249424870.syserr
   [junit4] >>> JVM J3 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J3: EOF ----

   [junit4] JVM J0: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/phonetic/test/temp/junit4-J0-20200429_192405_66013106422938421944182.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J5: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/phonetic/test/temp/junit4-J5-20200429_192405_6619121080658141606268.syserr
   [junit4] >>> JVM J5 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J5: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J1: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/phonetic/test/temp/junit4-J1-20200429_192405_66015212516450206161594.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 167 lines...]
   [junit4] JVM J0: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/smartcn/test/temp/junit4-J0-20200429_192411_10418309477672721663114.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 169 lines...]
   [junit4] JVM J3: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/analysis/stempel/test/temp/junit4-J3-20200429_192414_009816252853483014199.syserr
   [junit4] >>> JVM J3 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J3: EOF ----

[...truncated 182 lines...]
   [junit4] JVM J4: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/backward-codecs/test/temp/junit4-J4-20200429_192416_69217190517451548880910.syserr
   [junit4] >>> JVM J4 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J4: EOF ----

[...truncated 1389 lines...]
   [junit4] JVM J1: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/benchmark/test/temp/junit4-J1-20200429_192622_49515775868415964149045.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 240 lines...]
   [junit4] JVM J0: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/classification/test/temp/junit4-J0-20200429_192631_0634974145920785001703.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 301 lines...]
   [junit4] JVM J2: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/codecs/test/temp/junit4-J2-20200429_192638_2705270710459861449190.syserr
   [junit4] >>> JVM J2 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J2: EOF ----

[...truncated 239 lines...]
   [junit4] JVM J2: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/demo/test/temp/junit4-J2-20200429_192759_10514981485092965652748.syserr
   [junit4] >>> JVM J2 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J2: EOF ----

[...truncated 179 lines...]
   [junit4] JVM J3: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/expressions/test/temp/junit4-J3-20200429_192802_16717862513368676378797.syserr
   [junit4] >>> JVM J3 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J3: EOF ----

[...truncated 239 lines...]
   [junit4] JVM J0: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/facet/test/temp/junit4-J0-20200429_192807_02716022951396790927066.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 185 lines...]
   [junit4] JVM J4: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/grouping/test/temp/junit4-J4-20200429_192821_43215642202455499746709.syserr
   [junit4] >>> JVM J4 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J4: EOF ----

[...truncated 251 lines...]
   [junit4] JVM J1: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/highlighter/test/temp/junit4-J1-20200429_192828_70713575135051192004812.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 172 lines...]
   [junit4] JVM J3: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/join/test/temp/junit4-J3-20200429_192840_77314128453862730525653.syserr
   [junit4] >>> JVM J3 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J3: EOF ----

[...truncated 306 lines...]
   [junit4] JVM J2: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/luke/test/temp/junit4-J2-20200429_192852_883450967285869968644.syserr
   [junit4] >>> JVM J2 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J2: EOF ----

[...truncated 163 lines...]
   [junit4] JVMs J0, J1: stderr was not empty; each emitted only the same UseConcMarkSweepGC deprecation warning shown above. Syserr files: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/memory/test/temp/junit4-J*.syserr

[...truncated 185 lines...]
   [junit4] JVMs J4, J2, J1, J0, J3, J5: stderr was not empty; each emitted only the same UseConcMarkSweepGC deprecation warning shown above. Syserr files: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/misc/test/temp/junit4-J*.syserr

[...truncated 267 lines...]
   [junit4] JVMs J4, J5, J0, J3, J1, J2: stderr was not empty; each emitted only the same UseConcMarkSweepGC deprecation warning shown above. Syserr files: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/monitor/test/temp/junit4-J*.syserr

[...truncated 253 lines...]
   [junit4] JVMs J5, J3, J4, J0, J1, J2: stderr was not empty; each emitted only the same UseConcMarkSweepGC deprecation warning shown above. Syserr files: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/queries/test/temp/junit4-J*.syserr

[...truncated 234 lines...]
   [junit4] JVMs J3, J0, J2, J1, J4, J5: stderr was not empty; each emitted only the same UseConcMarkSweepGC deprecation warning shown above. Syserr files: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/queryparser/test/temp/junit4-J*.syserr

[...truncated 199 lines...]
   [junit4] JVMs J1, J4, J5, J3, J0, J2: stderr was not empty; each emitted only the same UseConcMarkSweepGC deprecation warning shown above. Syserr files: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/replicator/test/temp/junit4-J*.syserr

[...truncated 199 lines...]
   [junit4] JVMs J3, J1, J4, J5, J2, J0: stderr was not empty; each emitted only the same UseConcMarkSweepGC deprecation warning shown above. Syserr files: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/sandbox/test/temp/junit4-J*.syserr

[...truncated 286 lines...]
   [junit4] JVMs J0, J3, J4, J1, J2, J5: stderr was not empty; each emitted only the same UseConcMarkSweepGC deprecation warning shown above. Syserr files: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/spatial-extras/test/temp/junit4-J*.syserr

[...truncated 194 lines...]
   [junit4] JVMs J3, J4, J2, J5, J0, J1: stderr was not empty; each emitted only the same UseConcMarkSweepGC deprecation warning shown above. Syserr files: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/spatial3d/test/temp/junit4-J*.syserr

[...truncated 261 lines...]
   [junit4] JVMs J2, J5, J0, J3, J1, J4: stderr was not empty; each emitted only the same UseConcMarkSweepGC deprecation warning shown above. Syserr files: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build/suggest/test/temp/junit4-J*.syserr

[...truncated 4792 lines...]
   [junit4] JVMs J3, J1, J0, J5, J4, J2: stderr was not empty; each emitted only the same UseConcMarkSweepGC deprecation warning shown above. Syserr files: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/temp/junit4-J*.syserr

[...truncated 1102 lines...]
   [junit4] JVMs J5, J0, J4, J1, J2, J3: stderr was not empty; each emitted only the same UseConcMarkSweepGC deprecation warning shown above. Syserr files: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-solrj/test/temp/junit4-J*.syserr

[...truncated 1451 lines...]
   [junit4] JVMs J4, J3, J0, J2, J1: stderr was not empty; each emitted only the same UseConcMarkSweepGC deprecation warning shown above. Syserr files: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/contrib/solr-analysis-extras/test/temp/junit4-J*.syserr

[...truncated 804 lines...]
   [junit4] JVM J4: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/contrib/solr-analytics/test/temp/junit4-J4-20200429_195520_97617928485173950929637.syserr
   [junit4] >>> JVM J4 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J4: EOF ----

[...truncated 9 lines...]
   [junit4] JVM J5: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/contrib/solr-analytics/test/temp/junit4-J5-20200429_195520_9768892395917475621749.syserr
   [junit4] >>> JVM J5 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J5: EOF ----

   [junit4] JVM J0: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/contrib/solr-analytics/test/temp/junit4-J0-20200429_195520_97611920389473881737221.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

   [junit4] JVM J1: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/contrib/solr-analytics/test/temp/junit4-J1-20200429_195520_9763060734814469054921.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM war

[...truncated too long message...]

 output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 680 lines...]
   [junit4] JVM J4: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/contrib/solr-ltr/test/temp/junit4-J4-20200429_195715_89910172098826452721816.syserr
   [junit4] >>> JVM J4 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J4: EOF ----

   [junit4] JVM J2: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/contrib/solr-ltr/test/temp/junit4-J2-20200429_195715_89918350519441553899373.syserr
   [junit4] >>> JVM J2 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J2: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J5: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/contrib/solr-ltr/test/temp/junit4-J5-20200429_195715_9042753070740516681238.syserr
   [junit4] >>> JVM J5 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J5: EOF ----

[...truncated 6 lines...]
   [junit4] JVM J3: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/contrib/solr-ltr/test/temp/junit4-J3-20200429_195715_89916545904366164000706.syserr
   [junit4] >>> JVM J3 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J3: EOF ----

   [junit4] JVM J0: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/contrib/solr-ltr/test/temp/junit4-J0-20200429_195715_89912089117121543320792.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J1: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/contrib/solr-ltr/test/temp/junit4-J1-20200429_195715_89914172775538252978714.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 585 lines...]
   [junit4] JVM J0: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/contrib/solr-prometheus-exporter/test/temp/junit4-J0-20200429_195743_1892301274309219361673.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J2: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/contrib/solr-prometheus-exporter/test/temp/junit4-J2-20200429_195743_18914668876471302634913.syserr
   [junit4] >>> JVM J2 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J2: EOF ----

[...truncated 13 lines...]
   [junit4] JVM J1: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/contrib/solr-prometheus-exporter/test/temp/junit4-J1-20200429_195743_18912754637013319692734.syserr
   [junit4] >>> JVM J1 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J1: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J4: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/contrib/solr-prometheus-exporter/test/temp/junit4-J4-20200429_195743_1896712804501935273555.syserr
   [junit4] >>> JVM J4 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J4: EOF ----

[...truncated 3 lines...]
   [junit4] JVM J3: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/contrib/solr-prometheus-exporter/test/temp/junit4-J3-20200429_195743_1898618557768304460628.syserr
   [junit4] >>> JVM J3 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J3: EOF ----

[...truncated 570 lines...]
   [junit4] JVM J0: stderr was not empty, see: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/contrib/solr-velocity/test/temp/junit4-J0-20200429_195757_86815802794008592997911.syserr
   [junit4] >>> JVM J0 emitted unexpected output (verbatim) ----
   [junit4] OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
   [junit4] <<< JVM J0: EOF ----

[...truncated 28503 lines...]
-ecj-javadoc-lint-src:
    [mkdir] Created dir: /tmp/ecj373172291
 [ecj-lint] Compiling 931 source files to /tmp/ecj373172291
 [ecj-lint] ----------
 [ecj-lint] 1. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/codecs/CodecUtil.java (at line 523)
 [ecj-lint] 	throw new CorruptIndexException("misplaced codec footer (file truncated?): length=" + in.length() + " but footerLength==" + footerLength(), input);
 [ecj-lint] 	^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 [ecj-lint] Resource leak: 'in' is not closed at this location
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 2. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingStoredFieldsReader.java (at line 166)
 [ecj-lint] 	FieldsIndexReader fieldsIndexReader = new FieldsIndexReader(d, si.name, segmentSuffix, INDEX_EXTENSION_PREFIX, INDEX_CODEC_NAME, si.getId());
 [ecj-lint] 	                  ^^^^^^^^^^^^^^^^^
 [ecj-lint] Resource leak: 'fieldsIndexReader' is never closed
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 3. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingTermVectorsReader.java (at line 148)
 [ecj-lint] 	FieldsIndexReader fieldsIndexReader = new FieldsIndexReader(d, si.name, segmentSuffix, VECTORS_INDEX_EXTENSION_PREFIX, VECTORS_INDEX_CODEC_NAME, si.getId());
 [ecj-lint] 	                  ^^^^^^^^^^^^^^^^^
 [ecj-lint] Resource leak: 'fieldsIndexReader' is never closed
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 4. ERROR in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/search/IndexSearcher.java (at line 50)
 [ecj-lint] 	import org.apache.lucene.util.automaton.ByteRunAutomaton;
 [ecj-lint] 	       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 [ecj-lint] The import org.apache.lucene.util.automaton.ByteRunAutomaton is never used
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 5. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/util/automaton/Operations.java (at line 742)
 [ecj-lint] 	Integer q = newstate.get(statesSet);
 [ecj-lint] 	                         ^^^^^^^^^
 [ecj-lint] Unlikely argument type SortedIntSet for get(Object) on a Map<SortedIntSet.FrozenIntSet,Integer>
 [ecj-lint] ----------
 [ecj-lint] 5 problems (1 error, 4 warnings)
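[Editor's note] The resource-leak warnings above fire when ecj sees an AutoCloseable constructed but not closed on every path. A minimal, hypothetical sketch of the pattern that silences them (the class and method names below are illustrative, not the actual Lucene fix):

```java
public class ResourceLeakSketch {
    // Stand-in AutoCloseable; "DummyReader" is a hypothetical name,
    // not a Lucene class such as FieldsIndexReader.
    static class DummyReader implements java.io.Closeable {
        boolean closed = false;
        @Override public void close() { closed = true; }
    }

    // try-with-resources guarantees close() on every exit path,
    // including exceptions, which is what the lint warning asks for.
    static boolean openAndUse() {
        try (DummyReader reader = new DummyReader()) {
            return !reader.closed; // reader is still open inside the block
        }
    }

    public static void main(String[] args) {
        System.out.println(openAndUse()); // prints "true"
    }
}
```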

BUILD FAILED
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/build.xml:634: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/build.xml:101: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build.xml:201: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/common-build.xml:2127: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/common-build.xml:2166: Compile failed; see the compiler error output for details.

Total time: 39 minutes 41 seconds
Build step 'Invoke Ant' marked build as failure
Archiving artifacts
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
[WARNINGS] Skipping publisher since build result is FAILURE
Recording test results
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2

[JENKINS] Lucene-Solr-8.x-Linux (64bit/jdk-12.0.2) - Build # 2911 - Still Failing!

Posted by Policeman Jenkins Server <je...@thetaphi.de>.
Build: https://jenkins.thetaphi.de/job/Lucene-Solr-8.x-Linux/2911/
Java: 64bit/jdk-12.0.2 -XX:+UseCompressedOops -XX:+UseSerialGC

1 tests failed.
FAILED:  org.apache.solr.index.hdfs.CheckHdfsIndexTest.doTest

Error Message:
Error from server at http://127.0.0.1:33319/zg_vs/ju/collection1: java.lang.NullPointerException  at org.apache.solr.handler.admin.SystemInfoHandler.getSecurityInfo(SystemInfoHandler.java:326)  at org.apache.solr.handler.admin.SystemInfoHandler.handleRequestBody(SystemInfoHandler.java:146)  at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:211)  at org.apache.solr.core.SolrCore.execute(SolrCore.java:2600)  at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:803)  at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:582)  at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:432)  at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:362)  at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)  at org.apache.solr.client.solrj.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:166)  at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)  at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:545)  at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)  at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1610)  at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)  at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1300)  at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)  at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485)  at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1580)  at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)  at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1215)  at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)  at 
org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)  at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:322)  at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:767)  at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)  at org.eclipse.jetty.server.Server.handle(Server.java:500)  at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383)  at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:547)  at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375)  at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:273)  at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)  at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)  at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)  at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806)  at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938)  at java.base/java.lang.Thread.run(Thread.java:835) 

Stack Trace:
org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at http://127.0.0.1:33319/zg_vs/ju/collection1: java.lang.NullPointerException
	at org.apache.solr.handler.admin.SystemInfoHandler.getSecurityInfo(SystemInfoHandler.java:326)
	at org.apache.solr.handler.admin.SystemInfoHandler.handleRequestBody(SystemInfoHandler.java:146)
	at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:211)
	at org.apache.solr.core.SolrCore.execute(SolrCore.java:2600)
	at org.apache.solr.servlet.HttpSolrCall.execute(HttpSolrCall.java:803)
	at org.apache.solr.servlet.HttpSolrCall.call(HttpSolrCall.java:582)
	at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:432)
	at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:362)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)
	at org.apache.solr.client.solrj.embedded.JettySolrRunner$DebugFilter.doFilter(JettySolrRunner.java:166)
	at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1604)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:545)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1610)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1300)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:485)
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1580)
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1215)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.rewrite.handler.RewriteHandler.handle(RewriteHandler.java:322)
	at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:767)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
	at org.eclipse.jetty.server.Server.handle(Server.java:500)
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383)
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:547)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:273)
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:103)
	at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:117)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:806)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:938)
	at java.base/java.lang.Thread.run(Thread.java:835)

	at __randomizedtesting.SeedInfo.seed([4FBD655B33633D2:A3BF6EF1DE8D206B]:0)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:665)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:265)
	at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:248)
	at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:211)
	at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:1003)
	at org.apache.solr.client.solrj.SolrClient.query(SolrClient.java:1018)
	at org.apache.solr.index.hdfs.CheckHdfsIndexTest.doTest(CheckHdfsIndexTest.java:120)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:567)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988)
	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsFixedStatement.callStatement(BaseDistributedSearchTestCase.java:1081)
	at org.apache.solr.BaseDistributedSearchTestCase$ShardsRepeatRule$ShardsStatement.evaluate(BaseDistributedSearchTestCase.java:1053)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
	at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883)
	at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.SystemPropertiesRestoreRule$1.evaluate(SystemPropertiesRestoreRule.java:57)
	at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
	at java.base/java.lang.Thread.run(Thread.java:835)




Build Log:
[...truncated 15229 lines...]
   [junit4] Suite: org.apache.solr.index.hdfs.CheckHdfsIndexTest
   [junit4]   2> 712915 INFO  (SUITE-CheckHdfsIndexTest-seed#[4FBD655B33633D2]-worker) [     ] o.a.s.SolrTestCase Setting 'solr.default.confdir' system property to test-framework derived value of '/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/server/solr/configsets/_default/conf'
   [junit4]   2> 712916 INFO  (SUITE-CheckHdfsIndexTest-seed#[4FBD655B33633D2]-worker) [     ] o.a.s.SolrTestCaseJ4 SecureRandom sanity checks: test.solr.allowed.securerandom=null & java.security.egd=file:/dev/./urandom
   [junit4]   2> 712916 INFO  (SUITE-CheckHdfsIndexTest-seed#[4FBD655B33633D2]-worker) [     ] o.a.s.SolrTestCaseJ4 Created dataDir: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_4FBD655B33633D2-001/data-dir-60-001
   [junit4]   2> 712916 INFO  (SUITE-CheckHdfsIndexTest-seed#[4FBD655B33633D2]-worker) [     ] o.a.s.SolrTestCaseJ4 Using TrieFields (NUMERIC_POINTS_SYSPROP=false) w/NUMERIC_DOCVALUES_SYSPROP=false
   [junit4]   2> 712917 INFO  (SUITE-CheckHdfsIndexTest-seed#[4FBD655B33633D2]-worker) [     ] o.a.s.SolrTestCaseJ4 Randomized ssl (false) and clientAuth (true) via: @org.apache.solr.util.RandomizeSSL(reason="", value=0.0/0.0, ssl=0.0/0.0, clientAuth=0.0/0.0)
   [junit4]   2> 712917 INFO  (SUITE-CheckHdfsIndexTest-seed#[4FBD655B33633D2]-worker) [     ] o.a.s.BaseDistributedSearchTestCase Setting hostContext system property: /zg_vs/ju
   [junit4]   1> Formatting using clusterid: testClusterID
   [junit4]   2> 712934 WARN  (SUITE-CheckHdfsIndexTest-seed#[4FBD655B33633D2]-worker) [     ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
   [junit4]   2> 712935 INFO  (SUITE-CheckHdfsIndexTest-seed#[4FBD655B33633D2]-worker) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 12.0.2+10
   [junit4]   2> 712935 INFO  (SUITE-CheckHdfsIndexTest-seed#[4FBD655B33633D2]-worker) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 712935 INFO  (SUITE-CheckHdfsIndexTest-seed#[4FBD655B33633D2]-worker) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 712935 INFO  (SUITE-CheckHdfsIndexTest-seed#[4FBD655B33633D2]-worker) [     ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 712936 INFO  (SUITE-CheckHdfsIndexTest-seed#[4FBD655B33633D2]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@45f6dc5{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,AVAILABLE}
   [junit4]   2> 713033 INFO  (SUITE-CheckHdfsIndexTest-seed#[4FBD655B33633D2]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.w.WebAppContext@7fd3e644{hdfs,/,file:///home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/jetty-localhost_localdomain-39875-hadoop-hdfs-3_2_0-tests_jar-_-any-13146645986817602175.dir/webapp/,AVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/hdfs}
   [junit4]   2> 713034 INFO  (SUITE-CheckHdfsIndexTest-seed#[4FBD655B33633D2]-worker) [     ] o.e.j.s.AbstractConnector Started ServerConnector@6ab35e3c{HTTP/1.1, (http/1.1)}{localhost.localdomain:39875}
   [junit4]   2> 713034 INFO  (SUITE-CheckHdfsIndexTest-seed#[4FBD655B33633D2]-worker) [     ] o.e.j.s.Server Started @713042ms
   [junit4]   2> 713068 WARN  (SUITE-CheckHdfsIndexTest-seed#[4FBD655B33633D2]-worker) [     ] o.a.h.h.HttpRequestLog Jetty request log can only be enabled using Log4j
   [junit4]   2> 713069 INFO  (SUITE-CheckHdfsIndexTest-seed#[4FBD655B33633D2]-worker) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 12.0.2+10
   [junit4]   2> 713070 INFO  (SUITE-CheckHdfsIndexTest-seed#[4FBD655B33633D2]-worker) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 713070 INFO  (SUITE-CheckHdfsIndexTest-seed#[4FBD655B33633D2]-worker) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 713070 INFO  (SUITE-CheckHdfsIndexTest-seed#[4FBD655B33633D2]-worker) [     ] o.e.j.s.session node0 Scavenging every 600000ms
   [junit4]   2> 713071 INFO  (SUITE-CheckHdfsIndexTest-seed#[4FBD655B33633D2]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@7567b15a{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,AVAILABLE}
   [junit4]   2> 713161 INFO  (SUITE-CheckHdfsIndexTest-seed#[4FBD655B33633D2]-worker) [     ] o.e.j.s.h.ContextHandler Started o.e.j.w.WebAppContext@721063c1{datanode,/,file:///home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/jetty-localhost-39015-hadoop-hdfs-3_2_0-tests_jar-_-any-6884601990524020369.dir/webapp/,AVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/datanode}
   [junit4]   2> 713161 INFO  (SUITE-CheckHdfsIndexTest-seed#[4FBD655B33633D2]-worker) [     ] o.e.j.s.AbstractConnector Started ServerConnector@ca30e1f{HTTP/1.1, (http/1.1)}{localhost:39015}
   [junit4]   2> 713161 INFO  (SUITE-CheckHdfsIndexTest-seed#[4FBD655B33633D2]-worker) [     ] o.e.j.s.Server Started @713169ms
   [junit4]   2> 713232 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0x15b0fbe0a5929cfa: Processing first storage report for DS-14b75a52-9b8a-4a62-9352-a776521e75ab from datanode 4827672a-9822-4e42-85fb-abaf30fb4bc0
   [junit4]   2> 713232 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0x15b0fbe0a5929cfa: from storage DS-14b75a52-9b8a-4a62-9352-a776521e75ab node DatanodeRegistration(127.0.0.1:38013, datanodeUuid=4827672a-9822-4e42-85fb-abaf30fb4bc0, infoPort=41947, infoSecurePort=0, ipcPort=41375, storageInfo=lv=-57;cid=testClusterID;nsid=1427067170;c=1588181878506), blocks: 0, hasStaleStorage: true, processing time: 0 msecs, invalidatedBlocks: 0
   [junit4]   2> 713232 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0x15b0fbe0a5929cfa: Processing first storage report for DS-d00dfaee-be58-41bc-b0c1-73bcbc99a531 from datanode 4827672a-9822-4e42-85fb-abaf30fb4bc0
   [junit4]   2> 713232 INFO  (Block report processor) [     ] BlockStateChange BLOCK* processReport 0x15b0fbe0a5929cfa: from storage DS-d00dfaee-be58-41bc-b0c1-73bcbc99a531 node DatanodeRegistration(127.0.0.1:38013, datanodeUuid=4827672a-9822-4e42-85fb-abaf30fb4bc0, infoPort=41947, infoSecurePort=0, ipcPort=41375, storageInfo=lv=-57;cid=testClusterID;nsid=1427067170;c=1588181878506), blocks: 0, hasStaleStorage: false, processing time: 0 msecs, invalidatedBlocks: 0
   [junit4]   2> 713305 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.ZkTestServer STARTING ZK TEST SERVER
   [junit4]   2> 713306 INFO  (ZkTestServer Run Thread) [     ] o.a.s.c.ZkTestServer client port:0.0.0.0/0.0.0.0:0
   [junit4]   2> 713306 INFO  (ZkTestServer Run Thread) [     ] o.a.s.c.ZkTestServer Starting server
   [junit4]   2> 713406 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.ZkTestServer start zk server on port:42849
   [junit4]   2> 713406 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.ZkTestServer waitForServerUp: 127.0.0.1:42849
   [junit4]   2> 713406 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.ZkTestServer parse host and port list: 127.0.0.1:42849
   [junit4]   2> 713406 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.ZkTestServer connecting to 127.0.0.1 42849
   [junit4]   2> 713407 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 713409 INFO  (zkConnectionManagerCallback-8909-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 713409 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 713414 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 713415 INFO  (zkConnectionManagerCallback-8911-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 713415 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 713417 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/solrconfig-tlog.xml to /configs/conf1/solrconfig.xml
   [junit4]   2> 713424 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/schema.xml to /configs/conf1/schema.xml
   [junit4]   2> 713430 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/solrconfig.snippet.randomindexconfig.xml to /configs/conf1/solrconfig.snippet.randomindexconfig.xml
   [junit4]   2> 713442 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/stopwords.txt to /configs/conf1/stopwords.txt
   [junit4]   2> 713443 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/protwords.txt to /configs/conf1/protwords.txt
   [junit4]   2> 713444 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/currency.xml to /configs/conf1/currency.xml
   [junit4]   2> 713444 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/enumsConfig.xml to /configs/conf1/enumsConfig.xml
   [junit4]   2> 713445 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/open-exchange-rates.json to /configs/conf1/open-exchange-rates.json
   [junit4]   2> 713446 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/mapping-ISOLatin1Accent.txt to /configs/conf1/mapping-ISOLatin1Accent.txt
   [junit4]   2> 713447 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/old_synonyms.txt to /configs/conf1/old_synonyms.txt
   [junit4]   2> 713447 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.ZkTestServer put /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/core/src/test-files/solr/collection1/conf/synonyms.txt to /configs/conf1/synonyms.txt
   [junit4]   2> 713448 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Will use NRT replicas unless explicitly asked otherwise
   [junit4]   2> 713524 WARN  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 713524 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 713524 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ...
   [junit4]   2> 713524 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 12.0.2+10
   [junit4]   2> 713525 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 713525 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 713526 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.e.j.s.session node0 Scavenging every 660000ms
   [junit4]   2> 713526 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@4ce44200{/zg_vs/ju,null,AVAILABLE}
   [junit4]   2> 713527 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.e.j.s.AbstractConnector Started ServerConnector@2e628330{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:44563}
   [junit4]   2> 713527 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.e.j.s.Server Started @713535ms
   [junit4]   2> 713527 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/zg_vs/ju, solr.data.dir=hdfs://localhost.localdomain:40839/hdfs__localhost.localdomain_40839__home_jenkins_workspace_Lucene-Solr-8.x-Linux_solr_build_solr-core_test_J3_temp_solr.index.hdfs.CheckHdfsIndexTest_4FBD655B33633D2-001_tempDir-002_control_data, hostPort=44563, coreRootDirectory=/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_4FBD655B33633D2-001/control-001/cores}
   [junit4]   2> 713527 ERROR (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 713527 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 713527 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 8.6.0
   [junit4]   2> 713527 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 713527 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr
   [junit4]   2> 713527 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-04-29T17:37:59.106838Z
   [junit4]   2> 713528 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 713531 INFO  (zkConnectionManagerCallback-8913-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 713531 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 713632 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 713632 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.SolrXmlConfig Loading container configuration from /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_4FBD655B33633D2-001/control-001/solr.xml
   [junit4]   2> 713634 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 713634 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 713651 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@350d6bcc, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 713733 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=false]
   [junit4]   2> 713735 WARN  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@49c1e9a6[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 713735 WARN  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@49c1e9a6[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 713738 WARN  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@4436ee94[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 713739 WARN  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@4436ee94[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 713740 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:42849/solr
   [junit4]   2> 713741 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 713743 INFO  (zkConnectionManagerCallback-8924-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 713743 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 713844 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 713845 INFO  (zkConnectionManagerCallback-8926-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 713845 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 713899 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.c.OverseerElectionContext I am going to be the leader 127.0.0.1:44563_zg_vs%2Fju
   [junit4]   2> 713899 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.c.Overseer Overseer (id=72183207422918660-127.0.0.1:44563_zg_vs%2Fju-n_0000000000) starting
   [junit4]   2> 713902 INFO  (OverseerStateUpdate-72183207422918660-127.0.0.1:44563_zg_vs%2Fju-n_0000000000) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.c.Overseer Starting to work on the main queue : 127.0.0.1:44563_zg_vs%2Fju
   [junit4]   2> 713902 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:44563_zg_vs%2Fju
   [junit4]   2> 713903 INFO  (zkCallback-8925-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 713904 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 713904 WARN  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info.  Solr is only as secure as you make it.
   [junit4]   2> 713916 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 713935 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@350d6bcc
   [junit4]   2> 713944 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@350d6bcc
   [junit4]   2> 713944 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@350d6bcc
   [junit4]   2> 713945 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_4FBD655B33633D2-001/control-001/cores
   [junit4]   2> 713951 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 713952 INFO  (zkConnectionManagerCallback-8943-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 713952 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 713952 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 713953 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:42849/solr ready
   [junit4]   2> 713954 INFO  (qtp524845801-15725) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params collection.configName=conf1&name=control_collection&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:44563_zg_vs%252Fju&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 713957 INFO  (OverseerThreadFactory-8933-thread-1-processing-n:127.0.0.1:44563_zg_vs%2Fju) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.c.a.c.CreateCollectionCmd Create collection control_collection
   [junit4]   2> 714064 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 714065 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 714069 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju    x:control_collection_shard1_replica_n1 ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&newCollection=true&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&collection=control_collection&shard=shard1&wt=javabin&version=2&replicaType=NRT
   [junit4]   2> 714069 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju    x:control_collection_shard1_replica_n1 ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 715078 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 8.6.0
   [junit4]   2> 715093 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.IndexSchema Schema name=test
   [junit4]   2> 715156 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
   [junit4]   2> 715165 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.CoreContainer Creating SolrCore 'control_collection_shard1_replica_n1' using configuration from configset conf1, trusted=true
   [junit4]   2> 715165 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.control_collection.shard1.replica_n1' (registry 'solr.core.control_collection.shard1.replica_n1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@350d6bcc
   [junit4]   2> 715168 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory solr.hdfs.home=hdfs://localhost.localdomain:40839/solr_hdfs_home
   [junit4]   2> 715168 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Solr Kerberos Authentication disabled
   [junit4]   2> 715168 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrCore [[control_collection_shard1_replica_n1] ] Opening new SolrCore at [/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_4FBD655B33633D2-001/control-001/cores/control_collection_shard1_replica_n1], dataDir=[hdfs://localhost.localdomain:40839/solr_hdfs_home/control_collection/core_node2/data/]
   [junit4]   2> 715168 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:40839/solr_hdfs_home/control_collection/core_node2/data/snapshot_metadata
   [junit4]   2> 715176 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 715176 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [4194304] will allocate [1] slabs and use ~[4194304] bytes
   [junit4]   2> 715176 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Creating new single instance HDFS BlockCache
   [junit4]   2> 715200 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 715203 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:40839/solr_hdfs_home/control_collection/core_node2/data
   [junit4]   2> 715222 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:40839/solr_hdfs_home/control_collection/core_node2/data/index
   [junit4]   2> 715228 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 715228 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [4194304] will allocate [1] slabs and use ~[4194304] bytes
   [junit4]   2> 715228 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Creating new single instance HDFS BlockCache
   [junit4]   2> 715234 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 715235 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=18, maxMergeAtOnceExplicit=30, maxMergedSegmentMB=50.7705078125, floorSegmentMB=1.1953125, forceMergeDeletesPctAllowed=19.85105429939854, segmentsPerTier=16.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=1.0, deletesPctAllowed=24.850517236888944]
   [junit4]   2> 715660 WARN  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 715715 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.HdfsUpdateLog
   [junit4]   2> 715715 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 715715 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.HdfsUpdateLog Initializing HdfsUpdateLog: tlogDfsReplication=2
   [junit4]   2> 715727 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 715727 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 715730 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=41, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=false, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=1.0]
   [junit4]   2> 715805 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.s.SolrIndexSearcher Opening [Searcher@7084ac74[control_collection_shard1_replica_n1] main]
   [junit4]   2> 715805 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 715805 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 715807 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms
   [junit4]   2> 715809 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1665329404458303488
   [junit4]   2> 715811 INFO  (searcherExecutor-8945-thread-1-processing-n:127.0.0.1:44563_zg_vs%2Fju x:control_collection_shard1_replica_n1 c:control_collection s:shard1) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SolrCore [control_collection_shard1_replica_n1] Registered new searcher Searcher@7084ac74[control_collection_shard1_replica_n1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 715812 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ZkShardTerms Successful update of terms at /collections/control_collection/terms/shard1 to Terms{values={core_node2=0}, version=0}
   [junit4]   2> 715812 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/control_collection/leaders/shard1
   [junit4]   2> 715813 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 715813 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 715813 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:44563/zg_vs/ju/control_collection_shard1_replica_n1/
   [junit4]   2> 715814 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 715814 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.SyncStrategy http://127.0.0.1:44563/zg_vs/ju/control_collection_shard1_replica_n1/ has no replicas
   [junit4]   2> 715814 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/control_collection/leaders/shard1/leader after winning as /collections/control_collection/leader_elect/shard1/election/72183207422918660-core_node2-n_0000000000
   [junit4]   2> 715815 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:44563/zg_vs/ju/control_collection_shard1_replica_n1/ shard1
   [junit4]   2> 715916 INFO  (zkCallback-8925-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 715916 INFO  (zkCallback-8925-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 715916 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 715920 ERROR (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju c:control_collection s:shard1  x:control_collection_shard1_replica_n1 ] o.a.s.c.c.ZkStateReader Collection control_collection is not lazy or watched!
   [junit4]   2> 715923 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf1&newCollection=true&name=control_collection_shard1_replica_n1&action=CREATE&numShards=1&collection=control_collection&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1854
   [junit4]   2> 715924 INFO  (qtp524845801-15725) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas
   [junit4]   2> 715958 INFO  (OverseerCollectionConfigSetProcessor-72183207422918660-127.0.0.1:44563_zg_vs%2Fju-n_0000000000) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000000 doesn't exist. Requestor may have disconnected from ZooKeeper
   [junit4]   2> 716023 INFO  (zkCallback-8925-thread-2) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 716023 INFO  (zkCallback-8925-thread-3) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 716023 INFO  (zkCallback-8925-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [1])
   [junit4]   2> 716024 INFO  (qtp524845801-15725) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=control_collection&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=127.0.0.1:44563_zg_vs%252Fju&wt=javabin&version=2} status=0 QTime=2069
   [junit4]   2> 716024 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Waiting to see 1 active replicas in collection: control_collection
   [junit4]   2> 716126 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 716127 INFO  (zkConnectionManagerCallback-8954-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 716127 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 716128 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 716128 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.s.i.ZkClientClusterStateProvider Cluster at 127.0.0.1:42849/solr ready
   [junit4]   2> 716128 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.ChaosMonkey monkey: init - expire sessions:false cause connection loss:false
   [junit4]   2> 716129 INFO  (qtp524845801-15725) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :create with params collection.configName=conf1&name=collection1&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=&stateFormat=1&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 716131 INFO  (OverseerThreadFactory-8933-thread-2-processing-n:127.0.0.1:44563_zg_vs%2Fju) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.c.a.c.CreateCollectionCmd Create collection collection1
   [junit4]   2> 716333 WARN  (OverseerThreadFactory-8933-thread-2-processing-n:127.0.0.1:44563_zg_vs%2Fju) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.c.a.c.CreateCollectionCmd It is unusual to create a collection (collection1) without cores.
   [junit4]   2> 716333 INFO  (qtp524845801-15725) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.h.a.CollectionsHandler Wait for new collection to be active for at most 45 seconds. Check all shard replicas
   [junit4]   2> 716334 INFO  (qtp524845801-15725) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={collection.configName=conf1&name=collection1&nrtReplicas=1&action=CREATE&numShards=1&createNodeSet=&stateFormat=1&wt=javabin&version=2} status=0 QTime=204
   [junit4]   2> 716334 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 716334 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 716334 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 716334 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 716334 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.SolrCloudTestCase active slice count: 1 expected:1
   [junit4]   2> 716334 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.SolrCloudTestCase active replica count: 0 expected replica count: 0
   [junit4]   2> 716334 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Creating jetty instances pullReplicaCount=0 numOtherReplicas=1
   [junit4]   2> 716411 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.AbstractFullDistribZkTestBase create jetty 1 in directory /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_4FBD655B33633D2-001/shard-1-001 of type NRT for shard1
   [junit4]   2> 716412 WARN  (closeThreadPool-8955-thread-1) [     ] o.e.j.s.h.g.GzipHandler minGzipSize of 0 is inefficient for short content, break even is size 23
   [junit4]   2> 716412 INFO  (closeThreadPool-8955-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Start Jetty (configured port=0, binding port=0)
   [junit4]   2> 716412 INFO  (closeThreadPool-8955-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Trying to start Jetty on port 0 try number 2 ...
   [junit4]   2> 716412 INFO  (closeThreadPool-8955-thread-1) [     ] o.e.j.s.Server jetty-9.4.27.v20200227; built: 2020-02-27T18:37:21.340Z; git: a304fd9f351f337e7c0e2a7c28878dd536149c6c; jvm 12.0.2+10
   [junit4]   2> 716412 INFO  (closeThreadPool-8955-thread-1) [     ] o.e.j.s.session DefaultSessionIdManager workerName=node0
   [junit4]   2> 716412 INFO  (closeThreadPool-8955-thread-1) [     ] o.e.j.s.session No SessionScavenger set, using defaults
   [junit4]   2> 716412 INFO  (closeThreadPool-8955-thread-1) [     ] o.e.j.s.session node0 Scavenging every 600000ms
   [junit4]   2> 716413 INFO  (closeThreadPool-8955-thread-1) [     ] o.e.j.s.h.ContextHandler Started o.e.j.s.ServletContextHandler@3e716cb3{/zg_vs/ju,null,AVAILABLE}
   [junit4]   2> 716413 INFO  (closeThreadPool-8955-thread-1) [     ] o.e.j.s.AbstractConnector Started ServerConnector@5be412a6{HTTP/1.1, (http/1.1, h2c)}{127.0.0.1:41243}
   [junit4]   2> 716413 INFO  (closeThreadPool-8955-thread-1) [     ] o.e.j.s.Server Started @716421ms
   [junit4]   2> 716413 INFO  (closeThreadPool-8955-thread-1) [     ] o.a.s.c.s.e.JettySolrRunner Jetty properties: {hostContext=/zg_vs/ju, solrconfig=solrconfig.xml, solr.data.dir=hdfs://localhost.localdomain:40839/hdfs__localhost.localdomain_40839__home_jenkins_workspace_Lucene-Solr-8.x-Linux_solr_build_solr-core_test_J3_temp_solr.index.hdfs.CheckHdfsIndexTest_4FBD655B33633D2-001_tempDir-002_jetty1, hostPort=41243, coreRootDirectory=/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_4FBD655B33633D2-001/shard-1-001/cores, replicaType=NRT}
   [junit4]   2> 716414 ERROR (closeThreadPool-8955-thread-1) [     ] o.a.s.u.StartupLoggingUtils Missing Java Option solr.log.dir. Logging may be missing or incomplete.
   [junit4]   2> 716414 INFO  (closeThreadPool-8955-thread-1) [     ] o.a.s.s.SolrDispatchFilter Using logger factory org.apache.logging.slf4j.Log4jLoggerFactory
   [junit4]   2> 716414 INFO  (closeThreadPool-8955-thread-1) [     ] o.a.s.s.SolrDispatchFilter  ___      _       Welcome to Apache Solr™ version 8.6.0
   [junit4]   2> 716414 INFO  (closeThreadPool-8955-thread-1) [     ] o.a.s.s.SolrDispatchFilter / __| ___| |_ _   Starting in cloud mode on port null
   [junit4]   2> 716414 INFO  (closeThreadPool-8955-thread-1) [     ] o.a.s.s.SolrDispatchFilter \__ \/ _ \ | '_|  Install dir: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr
   [junit4]   2> 716414 INFO  (closeThreadPool-8955-thread-1) [     ] o.a.s.s.SolrDispatchFilter |___/\___/_|_|    Start time: 2020-04-29T17:38:01.993109Z
   [junit4]   2> 716414 INFO  (closeThreadPool-8955-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 716415 INFO  (zkConnectionManagerCallback-8957-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 716415 INFO  (closeThreadPool-8955-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 716516 INFO  (closeThreadPool-8955-thread-1) [     ] o.a.s.s.SolrDispatchFilter Loading solr.xml from SolrHome (not found in ZooKeeper)
   [junit4]   2> 716516 INFO  (closeThreadPool-8955-thread-1) [     ] o.a.s.c.SolrXmlConfig Loading container configuration from /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_4FBD655B33633D2-001/shard-1-001/solr.xml
   [junit4]   2> 716518 INFO  (closeThreadPool-8955-thread-1) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverWorkLoopDelay is ignored
   [junit4]   2> 716518 INFO  (closeThreadPool-8955-thread-1) [     ] o.a.s.c.SolrXmlConfig Configuration parameter autoReplicaFailoverBadNodeExpiration is ignored
   [junit4]   2> 716518 INFO  (closeThreadPool-8955-thread-1) [     ] o.a.s.c.SolrXmlConfig MBean server found: com.sun.jmx.mbeanserver.JmxMBeanServer@350d6bcc, but no JMX reporters were configured - adding default JMX reporter.
   [junit4]   2> 716638 INFO  (closeThreadPool-8955-thread-1) [     ] o.a.s.h.c.HttpShardHandlerFactory Host whitelist initialized: WhitelistHostChecker [whitelistHosts=null, whitelistHostCheckingEnabled=false]
   [junit4]   2> 716639 WARN  (closeThreadPool-8955-thread-1) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@47db3993[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 716639 WARN  (closeThreadPool-8955-thread-1) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@47db3993[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 716641 WARN  (closeThreadPool-8955-thread-1) [     ] o.e.j.u.s.S.config Trusting all certificates configured for Client@79dd3f60[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 716641 WARN  (closeThreadPool-8955-thread-1) [     ] o.e.j.u.s.S.config No Client EndPointIdentificationAlgorithm configured for Client@79dd3f60[provider=null,keyStore=null,trustStore=null]
   [junit4]   2> 716642 INFO  (closeThreadPool-8955-thread-1) [     ] o.a.s.c.ZkContainer Zookeeper client=127.0.0.1:42849/solr
   [junit4]   2> 716643 INFO  (closeThreadPool-8955-thread-1) [     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 716644 INFO  (zkConnectionManagerCallback-8968-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 716644 INFO  (closeThreadPool-8955-thread-1) [     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 716746 INFO  (closeThreadPool-8955-thread-1) [n:127.0.0.1:41243_zg_vs%2Fju     ] o.a.s.c.c.ConnectionManager Waiting for client to connect to ZooKeeper
   [junit4]   2> 716747 INFO  (zkConnectionManagerCallback-8970-thread-1) [     ] o.a.s.c.c.ConnectionManager zkClient has connected
   [junit4]   2> 716747 INFO  (closeThreadPool-8955-thread-1) [n:127.0.0.1:41243_zg_vs%2Fju     ] o.a.s.c.c.ConnectionManager Client is connected to ZooKeeper
   [junit4]   2> 716749 INFO  (closeThreadPool-8955-thread-1) [n:127.0.0.1:41243_zg_vs%2Fju     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (0) -> (1)
   [junit4]   2> 716750 INFO  (closeThreadPool-8955-thread-1) [n:127.0.0.1:41243_zg_vs%2Fju     ] o.a.s.c.ZkController Publish node=127.0.0.1:41243_zg_vs%2Fju as DOWN
   [junit4]   2> 716751 INFO  (closeThreadPool-8955-thread-1) [n:127.0.0.1:41243_zg_vs%2Fju     ] o.a.s.c.TransientSolrCoreCacheDefault Allocating transient cache for 4 transient cores
   [junit4]   2> 716751 INFO  (closeThreadPool-8955-thread-1) [n:127.0.0.1:41243_zg_vs%2Fju     ] o.a.s.c.ZkController Register node as live in ZooKeeper:/live_nodes/127.0.0.1:41243_zg_vs%2Fju
   [junit4]   2> 716751 INFO  (zkCallback-8925-thread-4) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 716751 INFO  (zkCallback-8953-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 716751 INFO  (zkCallback-8969-thread-1) [     ] o.a.s.c.c.ZkStateReader Updated live nodes from ZooKeeper... (1) -> (2)
   [junit4]   2> 716752 INFO  (closeThreadPool-8955-thread-1) [n:127.0.0.1:41243_zg_vs%2Fju     ] o.a.s.p.PackageLoader /packages.json updated to version -1
   [junit4]   2> 716753 WARN  (closeThreadPool-8955-thread-1) [n:127.0.0.1:41243_zg_vs%2Fju     ] o.a.s.c.CoreContainer Not all security plugins configured!  authentication=disabled authorization=disabled.  Solr is only as secure as you make it. Consider configuring authentication/authorization before exposing Solr to users internal or external.  See https://s.apache.org/solrsecurity for more info
   [junit4]   2> 716761 INFO  (closeThreadPool-8955-thread-1) [n:127.0.0.1:41243_zg_vs%2Fju     ] o.a.s.h.a.MetricsHistoryHandler No .system collection, keeping metrics history in memory.
   [junit4]   2> 716773 INFO  (closeThreadPool-8955-thread-1) [n:127.0.0.1:41243_zg_vs%2Fju     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.node' (registry 'solr.node') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@350d6bcc
   [junit4]   2> 716781 INFO  (closeThreadPool-8955-thread-1) [n:127.0.0.1:41243_zg_vs%2Fju     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jvm' (registry 'solr.jvm') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@350d6bcc
   [junit4]   2> 716781 INFO  (closeThreadPool-8955-thread-1) [n:127.0.0.1:41243_zg_vs%2Fju     ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.jetty' (registry 'solr.jetty') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@350d6bcc
   [junit4]   2> 716782 INFO  (closeThreadPool-8955-thread-1) [n:127.0.0.1:41243_zg_vs%2Fju     ] o.a.s.c.CorePropertiesLocator Found 0 core definitions underneath /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_4FBD655B33633D2-001/shard-1-001/cores
   [junit4]   2> 716787 INFO  (closeThreadPool-8955-thread-1) [     ] o.a.s.c.AbstractFullDistribZkTestBase waitForLiveNode: 127.0.0.1:41243_zg_vs%2Fju
   [junit4]   2> 716789 INFO  (qtp1608228229-15785) [n:127.0.0.1:41243_zg_vs%2Fju     ] o.a.s.h.a.CollectionsHandler Invoked Collection Action :addreplica with params node=127.0.0.1:41243_zg_vs%252Fju&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2 and sendToOCPQueue=true
   [junit4]   2> 716791 INFO  (OverseerCollectionConfigSetProcessor-72183207422918660-127.0.0.1:44563_zg_vs%2Fju-n_0000000000) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000002 doesn't exist. Requestor may have disconnected from ZooKeeper
   [junit4]   2> 716792 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 716795 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={wt=javabin&version=2&key=solr.core.control_collection.shard1.replica_n1:INDEX.sizeInBytes} status=0 QTime=2
   [junit4]   2> 716796 INFO  (qtp524845801-15725) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 716797 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 716800 INFO  (qtp524845801-15728) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={wt=javabin&version=2&key=solr.core.control_collection.shard1.replica_n1:INDEX.sizeInBytes} status=0 QTime=2
   [junit4]   2> 716801 INFO  (qtp524845801-15725) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/metrics params={prefix=CONTAINER.fs.usableSpace,CONTAINER.fs.totalSpace,CORE.coreName&wt=javabin&version=2&group=solr.node,solr.core} status=0 QTime=0
   [junit4]   2> 716801 INFO  (OverseerThreadFactory-8933-thread-3-processing-n:127.0.0.1:44563_zg_vs%2Fju) [n:127.0.0.1:44563_zg_vs%2Fju c:collection1 s:shard1   ] o.a.s.c.a.c.AddReplicaCmd Node Identified 127.0.0.1:41243_zg_vs%2Fju for creating new replica of shard shard1 for collection collection1
   [junit4]   2> 716802 INFO  (OverseerThreadFactory-8933-thread-3-processing-n:127.0.0.1:44563_zg_vs%2Fju) [n:127.0.0.1:44563_zg_vs%2Fju c:collection1 s:shard1   ] o.a.s.c.a.c.AddReplicaCmd Returning CreateReplica command.
   [junit4]   2> 716806 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju    x:collection1_shard1_replica_n1 ] o.a.s.h.a.CoreAdminOperation core create command qt=/admin/cores&collection.configName=conf1&name=collection1_shard1_replica_n1&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT
   [junit4]   2> 717813 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrConfig Using Lucene MatchVersion: 8.6.0
   [junit4]   2> 717825 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.IndexSchema Schema name=test
   [junit4]   2> 717915 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.IndexSchema Loaded schema test/1.0 with uniqueid field id
   [junit4]   2> 717923 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.CoreContainer Creating SolrCore 'collection1_shard1_replica_n1' using configuration from configset conf1, trusted=true
   [junit4]   2> 717923 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.m.r.SolrJmxReporter JMX monitoring for 'solr.core.collection1.shard1.replica_n1' (registry 'solr.core.collection1.shard1.replica_n1') enabled at server: com.sun.jmx.mbeanserver.JmxMBeanServer@350d6bcc
   [junit4]   2> 717923 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory solr.hdfs.home=hdfs://localhost.localdomain:40839/solr_hdfs_home
   [junit4]   2> 717923 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Solr Kerberos Authentication disabled
   [junit4]   2> 717923 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrCore [[collection1_shard1_replica_n1] ] Opening new SolrCore at [/home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_4FBD655B33633D2-001/shard-1-001/cores/collection1_shard1_replica_n1], dataDir=[hdfs://localhost.localdomain:40839/solr_hdfs_home/collection1/core_node2/data/]
   [junit4]   2> 717924 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:40839/solr_hdfs_home/collection1/core_node2/data/snapshot_metadata
   [junit4]   2> 717929 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 717929 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [4194304] will allocate [1] slabs and use ~[4194304] bytes
   [junit4]   2> 717929 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Creating new single instance HDFS BlockCache
   [junit4]   2> 717934 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 717935 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:40839/solr_hdfs_home/collection1/core_node2/data
   [junit4]   2> 717952 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory creating directory factory for path hdfs://localhost.localdomain:40839/solr_hdfs_home/collection1/core_node2/data/index
   [junit4]   2> 717957 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Number of slabs of block cache [1] with direct memory allocation set to [true]
   [junit4]   2> 717957 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Block cache target memory usage, slab size of [4194304] will allocate [1] slabs and use ~[4194304] bytes
   [junit4]   2> 717957 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.HdfsDirectoryFactory Creating new single instance HDFS BlockCache
   [junit4]   2> 717962 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.b.BlockDirectory Block cache on write is disabled
   [junit4]   2> 717962 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.TieredMergePolicy: [TieredMergePolicy: maxMergeAtOnce=18, maxMergeAtOnceExplicit=30, maxMergedSegmentMB=50.7705078125, floorSegmentMB=1.1953125, forceMergeDeletesPctAllowed=19.85105429939854, segmentsPerTier=16.0, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=1.0, deletesPctAllowed=24.850517236888944]
   [junit4]   2> 718382 WARN  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.RequestHandlers INVALID paramSet a in requestHandler {type = requestHandler,name = /dump,class = DumpRequestHandler,attributes = {initParams=a, name=/dump, class=DumpRequestHandler},args = {defaults={a=A,b=B}}}
   [junit4]   2> 718421 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.UpdateHandler Using UpdateLog implementation: org.apache.solr.update.HdfsUpdateLog
   [junit4]   2> 718421 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.UpdateLog Initializing UpdateLog: dataDir= defaultSyncLevel=FLUSH numRecordsToKeep=100 maxNumLogsToKeep=10 numVersionBuckets=65536
   [junit4]   2> 718421 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.HdfsUpdateLog Initializing HdfsUpdateLog: tlogDfsReplication=2
   [junit4]   2> 718430 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.CommitTracker Hard AutoCommit: disabled
   [junit4]   2> 718430 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.CommitTracker Soft AutoCommit: disabled
   [junit4]   2> 718432 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.RandomMergePolicy RandomMergePolicy wrapping class org.apache.lucene.index.LogByteSizeMergePolicy: [LogByteSizeMergePolicy: minMergeSize=1677721, mergeFactor=41, maxMergeSize=2147483648, maxMergeSizeForForcedMerge=9223372036854775807, calibrateSizeByDeletes=false, maxMergeDocs=2147483647, maxCFSSegmentSizeMB=8.796093022207999E12, noCFSRatio=1.0]
   [junit4]   2> 718437 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.s.SolrIndexSearcher Opening [Searcher@24099868[collection1_shard1_replica_n1] main]
   [junit4]   2> 718438 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Configured ZooKeeperStorageIO with znodeBase: /configs/conf1
   [junit4]   2> 718438 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.r.ManagedResourceStorage Loaded null at path _rest_managed.json using ZooKeeperStorageIO:path=/configs/conf1
   [junit4]   2> 718439 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.h.ReplicationHandler Commits will be reserved for 10000 ms
   [junit4]   2> 718439 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.u.UpdateLog Could not find max version in index or recent updates, using new clock 1665329407216058368
   [junit4]   2> 718441 INFO  (searcherExecutor-8981-thread-1-processing-n:127.0.0.1:41243_zg_vs%2Fju x:collection1_shard1_replica_n1 c:collection1 s:shard1) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SolrCore [collection1_shard1_replica_n1] Registered new searcher Searcher@24099868[collection1_shard1_replica_n1] main{ExitableDirectoryReader(UninvertingDirectoryReader())}
   [junit4]   2> 718441 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ZkShardTerms Successful update of terms at /collections/collection1/terms/shard1 to Terms{values={core_node2=0}, version=0}
   [junit4]   2> 718441 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase make sure parent is created /collections/collection1/leaders/shard1
   [junit4]   2> 718443 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext Enough replicas found to continue.
   [junit4]   2> 718443 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I may be the new leader - try and sync
   [junit4]   2> 718443 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync replicas to http://127.0.0.1:41243/zg_vs/ju/collection1_shard1_replica_n1/
   [junit4]   2> 718443 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SyncStrategy Sync Success - now sync replicas to me
   [junit4]   2> 718443 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.SyncStrategy http://127.0.0.1:41243/zg_vs/ju/collection1_shard1_replica_n1/ has no replicas
   [junit4]   2> 718443 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContextBase Creating leader registration node /collections/collection1/leaders/shard1/leader after winning as /collections/collection1/leader_elect/shard1/election/72183207422918665-core_node2-n_0000000000
   [junit4]   2> 718443 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ShardLeaderElectionContext I am the new leader: http://127.0.0.1:41243/zg_vs/ju/collection1_shard1_replica_n1/ shard1
   [junit4]   2> 718546 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1 s:shard1  x:collection1_shard1_replica_n1 ] o.a.s.c.ZkController I am the leader, no recovery necessary
   [junit4]   2> 718548 INFO  (qtp1608228229-15788) [n:127.0.0.1:41243_zg_vs%2Fju     ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/cores params={qt=/admin/cores&collection.configName=conf1&name=collection1_shard1_replica_n1&action=CREATE&collection=collection1&shard=shard1&wt=javabin&version=2&replicaType=NRT} status=0 QTime=1741
   [junit4]   2> 718549 INFO  (qtp1608228229-15785) [n:127.0.0.1:41243_zg_vs%2Fju c:collection1    ] o.a.s.s.HttpSolrCall [admin] webapp=null path=/admin/collections params={node=127.0.0.1:41243_zg_vs%252Fju&action=ADDREPLICA&collection=collection1&shard=shard1&type=NRT&wt=javabin&version=2} status=0 QTime=1759
   [junit4]   2> 718549 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.c.AbstractFullDistribZkTestBase Waiting to see 1 active replicas in collection: collection1
   [junit4]   2> 718649 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.SolrTestCaseJ4 ###Starting testChecksumsOnlyVerbose
   [junit4]   2> 718792 INFO  (OverseerCollectionConfigSetProcessor-72183207422918660-127.0.0.1:44563_zg_vs%2Fju-n_0000000000) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.c.OverseerTaskQueue Response ZK path: /overseer/collection-queue-work/qnr-0000000004 doesn't exist. Requestor may have disconnected from ZooKeeper
   [junit4]   2> 729168 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.s.h.HdfsDirectory Closing hdfs directory hdfs://localhost.localdomain:40839/solr
   [junit4]   2> 729170 INFO  (TEST-CheckHdfsIndexTest.testChecksumsOnlyVerbose-seed#[4FBD655B33633D2]) [     ] o.a.s.SolrTestCaseJ4 ###Ending testChecksumsOnlyVerbose
   [junit4]   2> 729274 INFO  (closeThreadPool-8988-thread-2) [     ] o.a.s.c.CoreContainer Shutting down CoreContainer instance=437295567
   [junit4]   2> 729274 INFO  (closeThreadPool-8988-thread-2) [     ] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:44563_zg_vs%2Fju
   [junit4]   2> 729275 INFO  (closeThreadPool-8988-thread-2) [     ] o.a.s.c.ZkController Publish this node as DOWN...
   [junit4]   2> 729275 INFO  (closeThreadPool-8988-thread-2) [     ] o.a.s.c.ZkController Publish node=127.0.0.1:44563_zg_vs%2Fju as DOWN
   [junit4]   2> 729275 INFO  (closeThreadPool-8988-thread-1) [     ] o.a.s.c.CoreContainer Shutting down CoreContainer instance=1141230395
   [junit4]   2> 729275 INFO  (closeThreadPool-8988-thread-1) [     ] o.a.s.c.ZkController Remove node as live in ZooKeeper:/live_nodes/127.0.0.1:41243_zg_vs%2Fju
   [junit4]   2> 729275 INFO  (closeThreadPool-8988-thread-1) [     ] o.a.s.c.ZkController Publish this node as DOWN...
   [junit4]   2> 729275 INFO  (closeThreadPool-8988-thread-1) [     ] o.a.s.c.ZkController Publish node=127.0.0.1:41243_zg_vs%2Fju as DOWN
   [junit4]   2> 729276 INFO  (coreCloseExecutor-8994-thread-1) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.c.SolrCore [control_collection_shard1_replica_n1]  CLOSING SolrCore org.apache.solr.core.SolrCore@6b86a5ff
   [junit4]   2> 729276 INFO  (coreCloseExecutor-8994-thread-1) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.control_collection.shard1.replica_n1 tag=SolrCore@6b86a5ff
   [junit4]   2> 729276 INFO  (coreCloseExecutor-8994-thread-1) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.m.r.SolrJmxReporter Closing reporter [org.apache.solr.metrics.reporters.SolrJmxReporter@55deb4a9: rootName = null, domain = solr.core.control_collection.shard1.replica_n1, service url = null, agent id = null] for registry solr.core.control_collection.shard1.replica_n1/com.codahale.metrics.MetricRegistry@535a8cb0
   [junit4]   2> 729276 INFO  (zkCallback-8925-thread-1) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 729276 INFO  (zkCallback-8925-thread-4) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 729276 INFO  (zkCallback-8925-thread-3) [     ] o.a.s.c.c.ZkStateReader A cluster state change: [WatchedEvent state:SyncConnected type:NodeDataChanged path:/collections/control_collection/state.json] for collection [control_collection] has occurred - updating... (live nodes size: [2])
   [junit4]   2> 729277 INFO  (coreCloseExecutor-8996-thread-1) [n:127.0.0.1:41243_zg_vs%2Fju     ] o.a.s.c.SolrCore [collection1_shard1_replica_n1]  CLOSING SolrCore org.apache.solr.core.SolrCore@4a716a92
   [junit4]   2> 729277 INFO  (coreCloseExecutor-8996-thread-1) [n:127.0.0.1:41243_zg_vs%2Fju     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.core.collection1.shard1.replica_n1 tag=SolrCore@4a716a92
   [junit4]   2> 729277 INFO  (coreCloseExecutor-8996-thread-1) [n:127.0.0.1:41243_zg_vs%2Fju     ] o.a.s.m.r.SolrJmxReporter Closing reporter [org.apache.solr.metrics.reporters.SolrJmxReporter@6b4e719f: rootName = null, domain = solr.core.collection1.shard1.replica_n1, service url = null, agent id = null] for registry solr.core.collection1.shard1.replica_n1/com.codahale.metrics.MetricRegistry@69bf9853
   [junit4]   2> 729289 INFO  (coreCloseExecutor-8996-thread-1) [n:127.0.0.1:41243_zg_vs%2Fju     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.collection1.shard1.leader tag=SolrCore@4a716a92
   [junit4]   2> 729289 INFO  (coreCloseExecutor-8996-thread-1) [n:127.0.0.1:41243_zg_vs%2Fju     ] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close()  ... SKIPPED (unnecessary).
   [junit4]   2> 729293 INFO  (coreCloseExecutor-8994-thread-1) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.collection.control_collection.shard1.leader tag=SolrCore@6b86a5ff
   [junit4]   2> 729293 INFO  (coreCloseExecutor-8994-thread-1) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.u.DirectUpdateHandler2 Committing on IndexWriter.close()  ... SKIPPED (unnecessary).
   [junit4]   2> 729298 INFO  (coreCloseExecutor-8996-thread-1) [n:127.0.0.1:41243_zg_vs%2Fju     ] o.a.s.s.h.HdfsDirectory Closing hdfs directory hdfs://localhost.localdomain:40839/solr_hdfs_home/collection1/core_node2/data/snapshot_metadata
   [junit4]   2> 729298 INFO  (coreCloseExecutor-8996-thread-1) [n:127.0.0.1:41243_zg_vs%2Fju     ] o.a.s.s.h.HdfsDirectory Closing hdfs directory hdfs://localhost.localdomain:40839/solr_hdfs_home/collection1/core_node2/data
   [junit4]   2> 729299 INFO  (coreCloseExecutor-8994-thread-1) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.s.h.HdfsDirectory Closing hdfs directory hdfs://localhost.localdomain:40839/solr_hdfs_home/control_collection/core_node2/data
   [junit4]   2> 729301 INFO  (coreCloseExecutor-8994-thread-1) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.s.h.HdfsDirectory Closing hdfs directory hdfs://localhost.localdomain:40839/solr_hdfs_home/control_collection/core_node2/data/snapshot_metadata
   [junit4]   2> 729302 INFO  (coreCloseExecutor-8996-thread-1) [n:127.0.0.1:41243_zg_vs%2Fju     ] o.a.s.s.h.HdfsDirectory Closing hdfs directory hdfs://localhost.localdomain:40839/solr_hdfs_home/collection1/core_node2/data/index
   [junit4]   2> 729303 INFO  (coreCloseExecutor-8994-thread-1) [n:127.0.0.1:44563_zg_vs%2Fju     ] o.a.s.s.h.HdfsDirectory Closing hdfs directory hdfs://localhost.localdomain:40839/solr_hdfs_home/control_collection/core_node2/data/index
   [junit4]   2> 729308 ERROR (OldIndexDirectoryCleanupThreadForCore-control_collection_shard1_replica_n1) [     ] o.a.s.c.HdfsDirectoryFactory Error checking if path hdfs://localhost.localdomain:40839/solr_hdfs_home/control_collection/core_node2/data/snapshot_metadata is an old index directory, caused by: java.io.IOException: Filesystem closed
   [junit4]   2> 729308 ERROR (OldIndexDirectoryCleanupThreadForCore-control_collection_shard1_replica_n1) [     ] o.a.s.c.HdfsDirectoryFactory Error checking if path hdfs://localhost.localdomain:40839/solr_hdfs_home/control_collection/core_node2/data/tlog is an old index directory, caused by: java.io.IOException: Filesystem closed
   [junit4]   2> 729309 INFO  (closeThreadPool-8988-thread-1) [     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node tag=null
   [junit4]   2> 729309 INFO  (closeThreadPool-8988-thread-1) [     ] o.a.s.m.r.SolrJmxReporter Closing reporter [org.apache.solr.metrics.reporters.SolrJmxReporter@a778db5: rootName = null, domain = solr.node, service url = null, agent id = null] for registry solr.node/com.codahale.metrics.MetricRegistry@2e9b8b31
   [junit4]   2> 729309 INFO  (closeThreadPool-8988-thread-2) [     ] o.a.s.m.SolrMetricManager Closing metric reporters for registry=solr.node tag=null
   [junit4]   2> 729309 INFO  (closeThreadPool-8988-thread-2) [     ] o.a.s.m.r.SolrJmxReporter Closing reporter [org.apache.solr.metrics.reporters.SolrJmxReporter@27a9acde: rootName = null, domain = solr.node, service url = null, agent id = null] for registry solr.node/com.codahale.metrics.MetricRegistry@5822612c
   [junit4]   2> 729315 INFO  (closeThreadPool-8988-thread-2) [     ] o.a.s.m.SolrMetricM

[...truncated too long message...]

esting-runner-2.7.2.jar:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) ~[java/:?]
   [junit4]   2> 	at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47) ~[java/:?]
   [junit4]   2> 	at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64) ~[java/:?]
   [junit4]   2> 	at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54) ~[java/:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) ~[randomizedtesting-runner-2.7.2.jar:?]
   [junit4]   2> 	at java.lang.Thread.run(Thread.java:835) [?:?]
   [junit4]   2> 851993 WARN  (SUITE-CheckHdfsIndexTest-seed#[4FBD655B33633D2]-worker) [     ] o.a.h.h.s.d.DirectoryScanner DirectoryScanner: shutdown has been called
   [junit4]   2> 852032 INFO  (SUITE-CheckHdfsIndexTest-seed#[4FBD655B33633D2]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.w.WebAppContext@721063c1{datanode,/,null,UNAVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/datanode}
   [junit4]   2> 852032 INFO  (SUITE-CheckHdfsIndexTest-seed#[4FBD655B33633D2]-worker) [     ] o.e.j.s.AbstractConnector Stopped ServerConnector@ca30e1f{HTTP/1.1, (http/1.1)}{localhost:0}
   [junit4]   2> 852032 INFO  (SUITE-CheckHdfsIndexTest-seed#[4FBD655B33633D2]-worker) [     ] o.e.j.s.session node0 Stopped scavenging
   [junit4]   2> 852032 INFO  (SUITE-CheckHdfsIndexTest-seed#[4FBD655B33633D2]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@7567b15a{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,UNAVAILABLE}
   [junit4]   2> 852033 WARN  (BP-83972163-127.0.0.1-1588181878506 heartbeating to localhost.localdomain/127.0.0.1:40839) [     ] o.a.h.h.s.d.IncrementalBlockReportManager IncrementalBlockReportManager interrupted
   [junit4]   2> 852033 WARN  (BP-83972163-127.0.0.1-1588181878506 heartbeating to localhost.localdomain/127.0.0.1:40839) [     ] o.a.h.h.s.d.DataNode Ending block pool service for: Block pool BP-83972163-127.0.0.1-1588181878506 (Datanode Uuid 4827672a-9822-4e42-85fb-abaf30fb4bc0) service to localhost.localdomain/127.0.0.1:40839
   [junit4]   2> 852048 INFO  (SUITE-CheckHdfsIndexTest-seed#[4FBD655B33633D2]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.w.WebAppContext@7fd3e644{hdfs,/,null,UNAVAILABLE}{jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/hdfs}
   [junit4]   2> 852049 INFO  (SUITE-CheckHdfsIndexTest-seed#[4FBD655B33633D2]-worker) [     ] o.e.j.s.AbstractConnector Stopped ServerConnector@6ab35e3c{HTTP/1.1, (http/1.1)}{localhost.localdomain:0}
   [junit4]   2> 852049 INFO  (SUITE-CheckHdfsIndexTest-seed#[4FBD655B33633D2]-worker) [     ] o.e.j.s.session node0 Stopped scavenging
   [junit4]   2> 852049 INFO  (SUITE-CheckHdfsIndexTest-seed#[4FBD655B33633D2]-worker) [     ] o.e.j.s.h.ContextHandler Stopped o.e.j.s.ServletContextHandler@45f6dc5{static,/static,jar:file:/home/jenkins/.ivy2/cache/org.apache.hadoop/hadoop-hdfs/tests/hadoop-hdfs-3.2.0-tests.jar!/webapps/static,UNAVAILABLE}
   [junit4]   2> NOTE: leaving temporary files on disk at: /home/jenkins/workspace/Lucene-Solr-8.x-Linux/solr/build/solr-core/test/J3/temp/solr.index.hdfs.CheckHdfsIndexTest_4FBD655B33633D2-001
   [junit4]   2> Apr 29, 2020 5:40:17 PM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks
   [junit4]   2> WARNING: Will linger awaiting termination of 64 leaked thread(s).
   [junit4]   2> NOTE: test params are: codec=Lucene84, sim=Asserting(org.apache.lucene.search.similarities.AssertingSimilarity@1cd5367), locale=ur, timezone=Africa/Addis_Ababa
   [junit4]   2> NOTE: Linux 5.3.0-46-generic amd64/AdoptOpenJDK 12.0.2 (64-bit)/cpus=16,threads=5,free=210059296,total=518979584
   [junit4]   2> NOTE: All tests run in this JVM: [BitVectorTest, TriggerSetPropertiesIntegrationTest, PrimUtilsTest, TestStressLiveNodes, SegmentsInfoRequestHandlerTest, DimensionalRoutedAliasUpdateProcessorTest, FullSolrCloudDistribCmdsTest, ConfigSetsAPITest, TestConfigSets, TestInitQParser, TestLMDirichletSimilarityFactory, TestSweetSpotSimilarityFactory, ClusterStateTest, BlockJoinFacetDistribTest, TestDistribPackageStore, TestInPlaceUpdatesStandalone, TestJmxIntegration, TestRetrieveFieldsOptimizer, HdfsUnloadDistributedZkTest, TestPseudoReturnFields, MoveReplicaTest, SolrLogAuditLoggerPluginTest, TestMaxTokenLenTokenizer, TestDistributedStatsComponentCardinality, AssignBackwardCompatibilityTest, TestStandardQParsers, OverriddenZkACLAndCredentialsProvidersTest, ScriptEngineTest, ExecutePlanActionTest, ChaosMonkeySafeLeaderTest, DistributedFacetSimpleRefinementLongTailTest, TestSortableTextField, TestHashPartitioner, HDFSCollectionsAPITest, TestGraphMLResponseWriter, TestTrie, TestBadConfig, SolrCoreMetricManagerTest, TestCloudInspectUtil, CoreMergeIndexesAdminHandlerTest, LukeRequestHandlerTest, TestRebalanceLeaders, TestSearchPerf, AdminHandlersProxyTest, RootFieldTest, TestSystemIdResolver, TestReloadAndDeleteDocs, TestDocTermOrds, TestSimExtremeIndexing, IndexSchemaTest, HdfsRecoverLeaseTest, CheckHdfsIndexTest]
   [junit4] Completed [536/907 (1!)] on J3 in 142.12s, 5 tests, 1 error, 1 skipped <<< FAILURES!

[...truncated 39323 lines...]
-ecj-javadoc-lint-src:
    [mkdir] Created dir: /tmp/ecj565750759
 [ecj-lint] Compiling 931 source files to /tmp/ecj565750759
 [ecj-lint] ----------
 [ecj-lint] 1. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/codecs/CodecUtil.java (at line 523)
 [ecj-lint] 	throw new CorruptIndexException("misplaced codec footer (file truncated?): length=" + in.length() + " but footerLength==" + footerLength(), input);
 [ecj-lint] 	^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 [ecj-lint] Resource leak: 'in' is not closed at this location
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 2. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingStoredFieldsReader.java (at line 166)
 [ecj-lint] 	FieldsIndexReader fieldsIndexReader = new FieldsIndexReader(d, si.name, segmentSuffix, INDEX_EXTENSION_PREFIX, INDEX_CODEC_NAME, si.getId());
 [ecj-lint] 	                  ^^^^^^^^^^^^^^^^^
 [ecj-lint] Resource leak: 'fieldsIndexReader' is never closed
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 3. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/codecs/compressing/CompressingTermVectorsReader.java (at line 148)
 [ecj-lint] 	FieldsIndexReader fieldsIndexReader = new FieldsIndexReader(d, si.name, segmentSuffix, VECTORS_INDEX_EXTENSION_PREFIX, VECTORS_INDEX_CODEC_NAME, si.getId());
 [ecj-lint] 	                  ^^^^^^^^^^^^^^^^^
 [ecj-lint] Resource leak: 'fieldsIndexReader' is never closed
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 4. ERROR in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/search/IndexSearcher.java (at line 50)
 [ecj-lint] 	import org.apache.lucene.util.automaton.ByteRunAutomaton;
 [ecj-lint] 	       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 [ecj-lint] The import org.apache.lucene.util.automaton.ByteRunAutomaton is never used
 [ecj-lint] ----------
 [ecj-lint] ----------
 [ecj-lint] 5. WARNING in /home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/core/src/java/org/apache/lucene/util/automaton/Operations.java (at line 742)
 [ecj-lint] 	Integer q = newstate.get(statesSet);
 [ecj-lint] 	                         ^^^^^^^^^
 [ecj-lint] Unlikely argument type SortedIntSet for get(Object) on a Map<SortedIntSet.FrozenIntSet,Integer>
 [ecj-lint] ----------
 [ecj-lint] 5 problems (1 error, 4 warnings)
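(Side note, not part of the build output: the three "Resource leak" warnings above are the kind that ecj raises when a Closeable may escape without being closed on an exception path. A minimal, hypothetical sketch of the usual remedy, try-with-resources, using an illustrative TrackingReader class rather than the actual FieldsIndexReader:

```java
import java.io.Closeable;
import java.io.IOException;

public class LeakDemo {
    // Illustrative stand-in for a reader like FieldsIndexReader;
    // it only records whether close() was ever called.
    static class TrackingReader implements Closeable {
        boolean closed = false;
        @Override public void close() { closed = true; }
    }

    // Demonstrates that try-with-resources closes the resource even
    // when the body throws, which is exactly the path ecj flags.
    static boolean closedEvenOnFailure() {
        TrackingReader r = new TrackingReader();
        try (TrackingReader reader = r) {
            throw new IOException("simulated corrupt index");
        } catch (IOException e) {
            // reader.close() has already run before control reaches here
        }
        return r.closed;
    }

    public static void main(String[] args) {
        System.out.println(closedEvenOnFailure());
    }
}
```

Whether that pattern is applicable at the flagged call sites depends on the surrounding Lucene code; the actual fix may instead suppress the warning or close the reader elsewhere.)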

BUILD FAILED
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/build.xml:634: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/build.xml:101: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/build.xml:201: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/common-build.xml:2127: The following error occurred while executing this line:
/home/jenkins/workspace/Lucene-Solr-8.x-Linux/lucene/common-build.xml:2166: Compile failed; see the compiler error output for details.

Total time: 38 minutes 47 seconds
Build step 'Invoke Ant' marked build as failure
Archiving artifacts
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
[WARNINGS] Skipping publisher since build result is FAILURE
Recording test results
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2
Email was triggered for: Failure - Any
Sending email for trigger: Failure - Any
Setting ANT_1_8_2_HOME=/home/jenkins/tools/hudson.tasks.Ant_AntInstallation/ANT_1.8.2